24 October 2017 A real-time standard parts inspection based on deep learning
Proceedings Volume 10458, AOPC 2017: 3D Measurement Technology for Intelligent Manufacturing; 104580S (2017) https://doi.org/10.1117/12.2284115
Event: Applied Optics and Photonics China (AOPC2017), 2017, Beijing, China
Standard parts are necessary components in mechanical structures such as bogies and connectors. These structures may shatter or loosen if standard parts are lost, so real-time standard parts inspection systems are essential to guarantee their safety. Researchers favor inspection systems based on deep learning because they work well on images with complex backgrounds, which are common in standard parts inspection scenarios. A typical inspection system contains two basic components: a feature extractor and an object classifier. For the object classifier, the Region Proposal Network (RPN) is one of the most essential architectures in most state-of-the-art object detection systems. However, in the basic RPN architecture, the Region of Interest (ROI) proposals have fixed sizes (9 anchors for each pixel); they are effective but waste considerable computing resources and time. In standard parts detection, the parts have known sizes, so we can choose anchor sizes based on the ground truths through machine learning. Our experiments show that 2 anchors achieve almost the same accuracy and recall rate. Our standard parts detection system reaches 15 fps on an NVIDIA GTX 1080 GPU while achieving a detection accuracy of 90.01% mAP.
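The abstract says only that anchor sizes are chosen from the ground-truth boxes "through machine learning". One plausible realization (not confirmed by the paper) is k-means clustering over the (width, height) pairs of the ground-truth boxes, as popularized for anchor selection in YOLOv2. The sketch below uses made-up box sizes purely for illustration:

```python
# Hedged sketch: pick k anchor sizes by clustering ground-truth box
# dimensions with plain k-means. The clustering method and the toy box
# sizes are assumptions for illustration, not taken from the paper.

def kmeans_anchors(boxes, k=2, iters=100):
    """Cluster (w, h) box sizes into k anchor sizes with k-means."""
    centers = boxes[:k]  # naive init: first k boxes as centers
    for _ in range(iters):
        # assign each box to the nearest center (Euclidean in (w, h))
        clusters = [[] for _ in range(k)]
        for w, h in boxes:
            i = min(range(k),
                    key=lambda c: (w - centers[c][0]) ** 2
                                + (h - centers[c][1]) ** 2)
            clusters[i].append((w, h))
        # recompute each center as the mean of its cluster
        new_centers = []
        for i, cl in enumerate(clusters):
            if cl:
                new_centers.append((sum(w for w, _ in cl) / len(cl),
                                    sum(h for _, h in cl) / len(cl)))
            else:
                new_centers.append(centers[i])  # keep empty cluster's center
        if new_centers == centers:  # converged
            break
        centers = new_centers
    return centers

# toy ground-truth sizes in pixels: a small-part group and a large-part group
gt = [(20, 22), (22, 20), (18, 19), (60, 58), (62, 61), (59, 60)]
anchors = kmeans_anchors(gt, k=2)  # two centers, one per size group
```

With only 2 anchors instead of the usual 9 per pixel, the RPN proposes far fewer boxes per location, which is consistent with the speed-up the paper reports for fixed-size parts.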
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Kuan Xu, XuDong Li, Hongzhi Jiang, Huijie Zhao, "A real-time standard parts inspection based on deep learning", Proc. SPIE 10458, AOPC 2017: 3D Measurement Technology for Intelligent Manufacturing, 104580S (24 October 2017); https://doi.org/10.1117/12.2284115
