
D-CenterNet: An Anchor-Free Detector With Knowledge Distillation for Industrial Defect Detection


Lightweight anchor-free detectors are gaining popularity in industrial defect detection, but it is generally difficult for them to achieve performance competitive with deep anchor-based detectors. Knowledge distillation is an effective way to solve this problem. In this article, we propose a novel anchor-free method based on knowledge distillation, called D-CenterNet. In our method, we employ an adaptive label encoding strategy to better regress the centroid information of extreme-aspect-ratio (EAR) defects and improve detection accuracy. The backbone is improved for industrial defects with large aspect ratios and complex backgrounds, and a strip pooling (SP) module is introduced to extract image features more efficiently. By redefining the distilled knowledge as the interrelationship between the output information, we train the student model on pseudo-labels generated by the teacher model and then refine it on real labeled data, further enhancing the student model's performance without increasing model complexity. Experimental results on the Aliyun Tianchi fabric dataset demonstrate that D-CenterNet achieves 75.5% mean average precision (mAP) and raises the student model's mAP from 60.1% (before distillation) to 63.8% (after distillation) without increasing model complexity. Extensive experimental results validate the superiority of our method and demonstrate that D-CenterNet achieves state-of-the-art (SOTA) accuracy and real-time performance.
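The strip pooling idea mentioned in the abstract can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the paper's actual SP module: it averages the feature map along each spatial axis to capture long-range context for elongated (large-aspect-ratio) defects, then fuses the two strips back into the map by broadcasting. The function names and the sigmoid gating are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    """Elementwise logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def strip_pool(feat):
    """Simplified strip-pooling sketch for a (C, H, W) feature map.

    Averages along each spatial axis separately, so each row/column
    aggregates context across the full width/height of the image --
    useful for defects with extreme aspect ratios.
    """
    # Horizontal strip: average over width -> shape (C, H, 1)
    h_strip = feat.mean(axis=2, keepdims=True)
    # Vertical strip: average over height -> shape (C, 1, W)
    v_strip = feat.mean(axis=1, keepdims=True)
    # Broadcast-add the strips back to (C, H, W) and gate the input
    fused = h_strip + v_strip
    return feat * sigmoid(fused)

# Usage: a dummy 2-channel 3x4 feature map keeps its shape after pooling
out = strip_pool(np.ones((2, 3, 4)))
print(out.shape)  # (2, 3, 4)
```

In the full module described by the SP literature, the pooled strips are additionally passed through 1-D convolutions before fusion; the sketch above omits that for brevity.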

Keywords: distillation; centernet; anchor free; knowledge; model

Journal Title: IEEE Transactions on Instrumentation and Measurement
Year Published: 2022


