Self-calibration of depth sensing systems based on structured-light 3D
12 March 2013
Abstract
A structured-light system for depth estimation is a type of 3D active sensor that consists of a structured-light projector, which projects a light pattern on the scene (e.g., a mask with vertical stripes), and a camera that captures the illuminated scene. Based on the received patterns, the depths of different regions in the scene can be inferred. For this setup to work optimally, the camera and projector must be aligned such that the projection image plane and the image capture plane are parallel, i.e., free of any relative rotations (yaw, pitch, and roll). In reality, due to mechanical placement inaccuracy, the projector-camera pair will not be perfectly aligned. In this paper we present a calibration process that measures the misalignment. We also estimate a scale factor to account for differences in the focal lengths of the projector and the camera. The three rotation angles can be found by introducing a plane into the field of view of the camera and illuminating it with the projected light patterns. An image of this plane is captured and processed to obtain the relative pitch, yaw, and roll angles, as well as the scale, through an iterative process. The algorithm leverages the effects of the misalignment/rotation angles on the depth map of the plane image.
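The key observation, that relative rotations between projector and camera leave a characteristic signature in the depth map of a flat target, can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm (which is iterative and also recovers roll and scale); it only shows the simplest piece of the idea: a pitch or yaw misalignment makes the measured depth of a flat plane vary linearly across the image, so fitting a plane to the depth map recovers the two tilt angles. All names, the unit focal-length assumption, and the synthetic depth map are illustrative.

```python
import numpy as np

def fit_plane_angles(depth, fx=1.0, fy=1.0):
    """Least-squares fit of z = a*x + b*y + c to a depth map of a flat
    target; the slopes a and b encode the relative yaw/pitch tilt.
    fx, fy convert pixel units to metric units (assumed 1.0 here)."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    (a, b, _c), *_ = np.linalg.lstsq(A, depth.ravel(), rcond=None)
    # Depth slope (units per pixel) -> tilt angle in degrees.
    return np.degrees(np.arctan(a * fx)), np.degrees(np.arctan(b * fy))

# Synthetic noise-free depth map of a plane tilted by known angles.
h, w = 120, 160
true_yaw, true_pitch = 2.0, -1.5  # degrees
ys, xs = np.mgrid[0:h, 0:w]
depth = (1000.0
         + np.tan(np.radians(true_yaw)) * (xs - w / 2)
         + np.tan(np.radians(true_pitch)) * (ys - h / 2))

yaw_est, pitch_est = fit_plane_angles(depth)
print(round(yaw_est, 2), round(pitch_est, 2))  # -> 2.0 -1.5
```

On real captures the depth map is noisy and the roll/scale couplings are non-linear, which is why the paper resorts to an iterative estimation rather than a single closed-form fit.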
© (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Vikas Ramachandra, James Nash, Kalin Atanassov, Sergio Goma, "Self-calibration of depth sensing systems based on structured-light 3D", Proc. SPIE 8650, Three-Dimensional Image Processing (3DIP) and Applications 2013, 86500V (12 March 2013); https://doi.org/10.1117/12.2008556
Proceedings paper, 11 pages.