  1. National Taiwan Ocean University Research Hub
  2. College of Electrical Engineering and Computer Science
  3. Department of Computer Science and Engineering
Please use this Handle URI to cite this item: http://scholars.ntou.edu.tw/handle/123456789/26433
Title: Real-Time Defect Detection for Fast-Moving Fabrics on Circular Knitting Machine Under Various Illumination Conditions
Authors: Ni, Yan-Qin
Huang, Pei-Kai
Yang, Ching-Han
Chang, Chin-Chun 
Wang, Wei-Jen
Liang, Deron
Keywords: Fabrics;Real-time systems;Lighting;Defect detection;Accuracy;Production;Semantic segmentation;Feature extraction;Training;Prototypes;Fabric defect detection;local binary convolution;circular knitting;real-time detection;few-s
Issue Date: 2025
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Volume: 13
Pages: 139890-139903
Source Publication: IEEE ACCESS
Abstract: 
In industrial production, automated inspection methods for circular knitting machines often encounter several challenges. First, the rapid movement of fabrics on these machines makes it difficult for existing fabric defect detection methods to effectively capture and process the motion. Next, because of practical constraints aimed at maintaining high yield rates, collecting sufficient abnormal fabric samples for model training is costly and limited. Furthermore, circular knitting machines typically operate under varying illumination conditions, further complicating accurate fabric defect detection. Additionally, these methods usually fail to identify the cutline patterns that are integral to the design of the fabric and mistake cutlines for V-line defects. Consequently, existing fabric defect detection methods often struggle to balance real-time processing, few-shot learning, and high accuracy under various illumination conditions. To address these challenges, we adopt a few-shot learning approach and propose a novel real-time fabric defect detection method for circular knitting machines that achieves high accuracy even under varying illumination conditions. The proposed mechanism consists of two components: LBUnet and a false-alarm filter for cutlines. First, to tackle the challenges of real-time detection, limited training data, and varying illumination conditions, we develop a lightweight semantic segmentation model, LBUnet, which leverages local binary (LB) convolution to effectively handle variable lighting conditions. Next, to address the specific challenge of detecting V-line defects, we propose a false-alarm filtering method that ensures accurate defect identification by utilizing time-series data composed of consecutive segmentation maps generated by LBUnet.
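The abstract's claim that LB convolution handles variable lighting rests on the classic local binary pattern idea: encoding each pixel by intensity *comparisons* with its neighbours, so the response depends on local intensity ordering rather than absolute brightness. The paper's LB convolution is a learned variant inside LBUnet; the sketch below only illustrates the underlying comparison-based encoding, and the function name and pure-Python layout are assumptions, not the authors' implementation.

```python
def local_binary_map(img):
    """Encode each interior pixel of a 2-D grayscale image (list of lists)
    as an 8-bit code: one bit per 8-neighbour, set when that neighbour is
    >= the centre pixel. Because only comparisons are used, the codes are
    unchanged by any brightness shift or positive rescaling of the image.
    Illustrative sketch only, not the learned LB convolution in LBUnet."""
    h, w = len(img), len(img[0])
    # Clockwise neighbour offsets starting from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            centre = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= centre:
                    code |= 1 << bit
            out[y][x] = code
    return out
```

For example, doubling every pixel value (a crude model of brighter illumination) leaves the output map identical, which is the invariance property the abstract appeals to.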
Extensive experiments demonstrate that the proposed method delivers both high defect detection accuracy and real-time processing performance for fast-moving fabrics on circular knitting machines under diverse lighting conditions. Specifically, using only LBUnet, our approach achieved an average Mean Intersection over Union (mIoU) of 86.24% with an average processing time of just 4 milliseconds per image. When the false-alarm filtering component was incorporated, the system achieved 100% accuracy in detecting cutlines.
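For readers unfamiliar with the segmentation metric quoted above, mean Intersection over Union (mIoU) averages, over the classes present, the ratio of correctly labelled pixels to the union of predicted and ground-truth pixels for each class. A minimal sketch (function name and flat-list interface are assumptions for illustration):

```python
def mean_iou(pred, target, num_classes):
    """Mean Intersection over Union over classes, given flat lists of
    per-pixel class labels. Classes absent from both the prediction and
    the ground truth are skipped so they do not distort the average."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union:
            ious.append(inter / union)
    return sum(ious) / len(ious)
```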
URI: http://scholars.ntou.edu.tw/handle/123456789/26433
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3593335
Appears in Collections: Department of Computer Science and Engineering

Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated.
