23 May 2013 Lighting estimation in fringe images during motion compensation for 3D measurements
Fringe projection is an established method for measuring the 3D structure of macroscopic objects. To achieve both high accuracy and robustness, a certain number of images with pairwise different projection patterns is required. Over this sequence, each 3D object point must correspond to the same image point at every instant. This no longer holds for measurements under motion. One way to solve this problem is to restore the static situation: first, the acquired camera images have to be realigned, and second, the degree of fringe shift has to be estimated. In addition, a third variable remains: the change in lighting. Compensating for these variations is difficult and can only be realized under several assumptions, yet the lighting change must be at least approximately determined and integrated into the 3D reconstruction process. We propose a method to estimate these lighting changes for each camera pixel with respect to its neighbors at each point in time. The algorithms were validated on simulated data, in particular with rotating measurement objects; for translational motion, lighting changes have no severe effect in our applications. Taken together, without using high-speed hardware, our method yields a motion-compensated dense 3D point cloud suitable for three-dimensional measurement of moving objects or of setups with the sensor system in motion.
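The abstract describes estimating a per-pixel lighting change relative to neighboring pixels. One minimal way to sketch such a neighborhood-based estimate: over a window spanning a whole number of fringe periods, the sinusoidal fringe modulation averages out, so the ratio of local mean intensities between a reference frame and the current frame reflects the multiplicative lighting change alone. The function names, the box-filter approach, and all parameters below are illustrative assumptions, not the authors' published algorithm:

```python
import numpy as np

def box_mean(img, win):
    """Local mean via a separable box filter (two 1-D convolutions)."""
    k = np.ones(win) / win
    tmp = np.apply_along_axis(np.convolve, 1, img, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, tmp, k, mode="same")

def estimate_lighting_change(ref, cur, win=8, eps=1e-6):
    """Per-pixel multiplicative lighting factor between two fringe images.

    Assumes `win` covers an integer number of fringe periods, so the
    cosine modulation cancels in the local mean and only the background
    (lighting) term survives.  Illustrative sketch only.
    """
    return box_mean(cur, win) / (box_mean(ref, win) + eps)

# Synthetic demo: horizontal fringes with period 8, uniform 30 % brightening.
h, w = 64, 64
phase = 2.0 * np.pi * np.arange(w) / 8.0
ref = np.tile(120.0 + 60.0 * np.cos(phase), (h, 1))
cur = 1.3 * ref                      # simulated lighting change
gain = estimate_lighting_change(ref, cur, win=8)
# Away from the image borders, gain recovers the factor 1.3.
```

In practice the window size would have to be matched to the projected fringe period in camera pixels, and a per-frame (rather than per-pixel-pair) model would be needed once the object rotates and the lighting change varies across the surface.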
© 2013 Society of Photo-Optical Instrumentation Engineers (SPIE).
Andreas Breitbarth, Peter Kühmstedt, Gunther Notni, and Joachim Denzler, "Lighting estimation in fringe images during motion compensation for 3D measurements", Proc. SPIE 8791, Videometrics, Range Imaging, and Applications XII; and Automated Visual Inspection, 87910P (23 May 2013).

