25 October 2017 The seam visual tracking method for large structures
Proceedings Volume 10464, AOPC 2017: Fiber Optic Sensing and Optical Communications; 104641R (2017) https://doi.org/10.1117/12.2285500
Event: Applied Optics and Photonics China (AOPC2017), 2017, Beijing, China
In this paper, a compact and flexible weld seam visual tracking method is proposed. First, because a fixed tracking height can cause interference between the vision device and the work-piece to be welded, a weld vision system with a compact structure and an adjustable tracking height is developed. Second, by analyzing the relative spatial pose of the camera, the laser, and the work-piece, and applying the theory of relative geometric imaging, a mathematical model relating the image feature parameters to the three-dimensional trajectory of the assembly gap to be welded is established. Third, the visual imaging parameters of the line-structured light are optimized experimentally for the weld structure. Fourth, the imaging suffers from interference: the line-structured light scatters at bright metal areas, and surface scratches also appear bright. These disturbances seriously degrade computational efficiency, so an algorithm based on the human visual attention mechanism is used to extract the weld features efficiently and stably. Finally, experiments verify that the proposed compact and flexible weld tracking method achieves a tracking accuracy of 0.5 mm when tracking large structural parts, giving it broad prospects for industrial application.
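The core of such a line-structured-light model is laser triangulation: a stripe pixel is back-projected through the camera and intersected with the calibrated laser sheet to recover a 3-D point on the seam. The sketch below is a minimal illustration of that geometry, not the paper's actual model: the intrinsics (`f`, `cx`, `cy`) and laser-plane parameters (`n`, `d`) are hypothetical values, and the row-wise grey-level centroid stands in for the attention-based feature extractor, which is not reproduced here.

```python
import numpy as np

# Hypothetical pinhole intrinsics: focal length f (pixels), principal
# point (cx, cy). Illustrative values only, not from the paper.
f, cx, cy = 800.0, 320.0, 240.0

# Hypothetical laser sheet, calibrated beforehand, expressed in the
# camera frame as the plane n . X = d (n unit normal, d in metres).
n = np.array([0.0, np.sin(np.radians(30.0)), np.cos(np.radians(30.0))])
d = 0.2

def stripe_pixel_to_3d(u, v):
    """Intersect the back-projected ray of pixel (u, v) with the laser plane."""
    ray = np.array([(u - cx) / f, (v - cy) / f, 1.0])  # ray direction, camera frame
    t = d / np.dot(n, ray)                             # ray parameter at the plane
    return t * ray                                     # 3-D point in camera coords

def stripe_centers(img):
    """Row-wise grey-level centroid of the stripe image: a common sub-pixel
    centre estimator, used here as a simple stand-in for feature extraction."""
    cols = np.arange(img.shape[1])
    weight = img.sum(axis=1)
    return (img * cols).sum(axis=1) / np.maximum(weight, 1e-9)

# The principal-point ray (0, 0, 1) meets the plane at depth d / cos(30 deg).
p = stripe_pixel_to_3d(320.0, 240.0)
```

Chaining the two steps, each image row yields one stripe centre `(u, v)`, and `stripe_pixel_to_3d` maps it to a seam point, so scanning the part sweeps out the three-dimensional gap trajectory described in the abstract.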
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Qilin Bi, Xiaomin Jiang, Xiaoguang Liu, Taobo Cheng, Yulong Zhu, "The seam visual tracking method for large structures", Proc. SPIE 10464, AOPC 2017: Fiber Optic Sensing and Optical Communications, 104641R (25 October 2017); https://doi.org/10.1117/12.2285500
