Dynamic (de)focused projection for three-dimensional reconstruction
1 November 2011
Abstract
We present a novel 3-D recovery method based on structured light. The method unifies depth from focus (DFF) and depth from defocus (DFD) through a dynamic (de)focused projection. The image acquisition system is constructed so that the whole object remains sharp in every captured image; consequently, only the projected patterns undergo defocus deformations that depend on the object's depth. When the projected patterns are out of focus, their point-spread function (PSF) is assumed to be Gaussian. The final depth is computed by analyzing the relationship between the PSFs estimated at different blur levels and the variation of the object's depth. The proposed depth estimation can be used as a stand-alone strategy: it avoids occlusion and correspondence problems and handles textureless and partially reflective surfaces. Experimental results on real objects demonstrate the effectiveness of the approach, providing reliable depth estimates at competitive computation time. It requires fewer input images than DFF and, unlike DFD, guarantees that the PSF is locally unique.
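As a rough illustration of the idea described in the abstract, the sketch below simulates a 1-D projected stripe pattern blurred by a Gaussian PSF whose spread grows with distance from the projector's focal plane, then recovers depth by matching candidate blur levels against the observation. The linear sigma-to-depth model and all constants (`k`, `depth_focus`) are illustrative assumptions, not the authors' calibration; a real system would calibrate this mapping and resolve the near/far sign ambiguity.

```python
import numpy as np

def gaussian_kernel(sigma):
    # Discrete, normalized Gaussian kernel truncated at ~3 sigma.
    radius = int(3 * sigma) + 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def blur(signal, sigma):
    # Model the defocused projection as a Gaussian convolution of the pattern.
    return np.convolve(signal, gaussian_kernel(sigma), mode="same")

def estimate_sigma(sharp, blurred, sigmas):
    # Brute-force PSF estimation: pick the candidate sigma whose re-blur
    # of the known sharp pattern best matches the observed image.
    errs = [np.sum((blur(sharp, s) - blurred) ** 2) for s in sigmas]
    return sigmas[int(np.argmin(errs))]

# Hypothetical projected stripe pattern (binary square wave).
x = np.linspace(0, 8 * np.pi, 512)
pattern = (np.sign(np.sin(x)) + 1) / 2

# Assumed defocus model: sigma = k * |depth - depth_in_focus|.
k, depth_focus = 0.8, 100.0          # illustrative constants (depth in mm)
true_depth = 104.0                    # assumed to lie beyond the focal plane
observed = blur(pattern, k * abs(true_depth - depth_focus))

sigma_hat = estimate_sigma(pattern, observed, np.arange(0.5, 8.0, 0.1))
depth_hat = depth_focus + sigma_hat / k   # sign chosen by assumption above
print(round(depth_hat, 1))
```

Blur magnitude alone cannot distinguish points in front of the focal plane from points behind it, which is one reason the paper analyzes sets of PSFs from multiple projector focus settings rather than a single blurred image.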
© (2011) Society of Photo-Optical Instrumentation Engineers (SPIE)
Intuon Lertrusdachakul, Yohan D. Fougerolle, Olivier Laligant, "Dynamic (de)focused projection for three-dimensional reconstruction," Optical Engineering 50(11), 113201 (1 November 2011). https://doi.org/10.1117/1.3644541
Journal article, 12 pages.