To achieve real-time three-dimensional (3-D) reconstruction using structured light illumination (SLI), the number of projected patterns must be reduced to as few as possible, but doing so runs into practical limitations from the hardware. For example, Liu et al.1 demonstrated both theoretically and experimentally that gamma distortion is of little or no consequence when scanning with more than eight patterns; for fewer patterns, however, Liu et al. proposed postprocessing methods that are very time-consuming.
Wang et al.2 proposed a gamma compensation method that applies an inverse gamma function to the patterns before they are projected, but in practice, the nonlinearity of a projector cannot be described by a single parameter.3 Su et al.4 proposed defocusing the projector to filter out the high-order harmonics of the patterns, but this has the unwanted consequence of reducing the signal power. Although Zhang et al. and Xu et al. developed several lookup-table-based solutions5–7 to compensate for the distorted phase, the accuracy of compensation depends greatly on the length of the tables as well as the method of interpolation.
In this letter, we remove the gamma-model limitation of these previous studies and allow for arbitrary projector responses, making use of the coding relationship between phase and intensity to propose a robust and universal intensity precompensation algorithm that corrects nonlinearities in grayscale SLI pattern schemes. Our process involves measuring the phase response of the scanner, distorted by nonlinearities in the projector, to produce an intensity precompensation look-up table (IPLUT), and then precompensating the SLI patterns with the IPLUT before scanning such that postprocessing is no longer warranted. Theoretically, this method can be applied to any SLI pattern scheme, but it will be verified here with three-step phase measuring profilometry (PMP). Experimental results will show a reduction of roughly 96% in the root-mean-square (RMS) phase error, with further reduction depending on noise suppression through smoothing filters.
As an active stereo vision technique, SLI computes the correspondence between a camera pixel and a projector pixel by decoding the structured patterns. In the case of an epipolar-rectified camera/projector pair, deriving the correspondence for a camera pixel requires a search only along the corresponding projector coordinate. A common expression for the projected intensity, coded by visible light intensity, is given by
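For orientation, a common form of this PMP pattern model from the literature is shown below; the symbols (bias $A^p$, amplitude $B^p$, spatial frequency $f$, pattern height $H$, and nonlinear projector response $g(\cdot)$) are assumptions and may differ from the letter's exact Eqs. (1) and (2):

```latex
% A standard PMP pattern model (assumed notation):
\[
  I_n^p(x^p, y^p) \;=\; A^p + B^p \cos\!\left( 2\pi f \frac{y^p}{H} - \frac{2\pi n}{N} \right),
\]
% with the captured intensity passing through an unknown
% nonlinear projector response g(.):
\[
  I_n^c(x^c, y^c) \;=\; g\!\left( I_n^p(x^p, y^p) \right).
\]
```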
In order to calibrate the nonlinear relationship between the coded and captured intensities in Eq. (1) via Eq. (2), an intuitive approach is to project a series of constant-intensity patterns spanning the full grayscale range, and then use the measured output intensities recorded by the camera to build a tone correction curve that linearizes the projector. But this requires a calibration procedure with as many patterns as there are grayscale levels. If we instead take advantage of the coding principle of SLI, i.e., the relationship between the phase and the intensities, we can significantly reduce the number of calibration patterns and robustly calibrate the nonlinearities of the SLI system.
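The naive tone-curve calibration described above can be sketched as follows. This is a minimal illustration, not the letter's method: the projector response is simulated with a gamma function, and all names (`measure_tone_curve`, `build_inverse_lut`) are hypothetical.

```python
import numpy as np

def measure_tone_curve(project_and_capture, levels=256):
    """Project each constant gray level and record the captured
    intensity -- this naive calibration needs `levels` patterns."""
    return np.array([project_and_capture(g) for g in range(levels)])

def build_inverse_lut(tone_curve):
    """Invert the measured tone curve by interpolation so that
    lut[g] is the input level yielding a linear output g."""
    levels = len(tone_curve)
    ideal = np.linspace(tone_curve.min(), tone_curve.max(), levels)
    return np.interp(ideal, tone_curve, np.arange(levels))

# Simulated gamma-type projector standing in for the real hardware.
gamma = 2.2
capture = lambda g: 255.0 * (g / 255.0) ** gamma

curve = measure_tone_curve(capture)
lut = build_inverse_lut(curve)

# After precompensation the response should be nearly linear.
linearized = np.array([capture(lut[g]) for g in range(256)])
print(np.max(np.abs(linearized - np.arange(256))))  # small residual
```

The drawback the letter points out is visible here: one projected pattern per gray level, i.e., 256 calibration patterns for an 8-bit projector.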
For the purpose of eliminating the phase error introduced by projector nonlinearities in Eq. (1), we employ uncompensated PMP on a target surface (not necessarily flat or white) to obtain a reference phase map, where the n'th of the N projected patterns is defined, following Ref. 1, as
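The corresponding N-step PMP phase recovery, in commonly used notation (again an assumed form, not necessarily the letter's exact equation), is:

```latex
\[
  \phi(x^c, y^c) \;=\; \arctan\!\left[
    \frac{\sum_{n=0}^{N-1} I_n^c(x^c, y^c)\, \sin(2\pi n / N)}
         {\sum_{n=0}^{N-1} I_n^c(x^c, y^c)\, \cos(2\pi n / N)}
  \right].
\]
```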
In this situation, the phase is linearly coded along the projector coordinate as in Eq. (2), with the bias and amplitude terms canceling, and we can, therefore, derive a look-up table mapping the nonlinearly distorted phase image to the reference phase image. As an intensity precompensation step, Eq. (7), we apply this mapping to the intensity values of Eq. (3) for the projected patterns stored in memory, as a one-time process, such that the captured intensities respond linearly and no postprocessing is needed to correct nonlinearities.
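A minimal sketch of this precompensation step, assuming the nonlinear response has already been sampled as matched pairs of coded versus captured intensities (function names and the toy gamma response are hypothetical, not the letter's implementation):

```python
import numpy as np

def build_iplut(coded, captured, levels=256):
    """Build an intensity precompensation LUT: iplut[i] is the value to
    project so that the captured intensity responds linearly as i.
    `coded` and `captured` are matched samples of the response."""
    order = np.argsort(captured)
    # Invert the response: for each desired linear output level,
    # interpolate the input intensity that produces it.
    targets = np.linspace(captured.min(), captured.max(), levels)
    return np.interp(targets, captured[order], coded[order])

def precompensate(pattern, iplut):
    """One-time correction of a stored 8-bit SLI pattern before projection."""
    return iplut[pattern.astype(np.uint8)]

# Toy gamma-like nonlinearity standing in for the projector.
coded = np.arange(256, dtype=float)
captured = 255.0 * (coded / 255.0) ** 1.8

iplut = build_iplut(coded, captured)
pattern = np.tile(np.arange(256, dtype=np.uint8), (4, 1))  # a test ramp
corrected = precompensate(pattern, iplut)
```

Because the correction is applied to the stored patterns, the scan-time pipeline is untouched: the projector simply displays the precompensated images.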
In our experiments, we calibrate the nonlinearities of our system, construct an IPLUT, and then employ three-step PMP (the minimum number of patterns) to verify the proposed intensity precompensation algorithm. Our SLI system includes an AVT (Burnaby, Canada) Prosilica GC650M camera and a Casio XJ-A155V projector. Our verification targets include a nearly flat, white plaster-board wall, a color-textured pillow [Fig. 1(a)], and a color-textured plush toy bear [Fig. 1(b)]. The textured pillow serves as the nonlinearity calibration target, showing that our procedure does not require a classic flat, textureless target; the white wall is used to evaluate the phase-error performance of our algorithm; and the textured bear is used to visually demonstrate the effect of our method.
Using our textured pillow along with the PMP patterns generated by Eq. (3), the distorted phase is computed through Eq. (4), with the reference intensity extracted from the pattern coded by Eq. (6). The relationship between the captured and coded intensities is shown as the dashed curve in Fig. 2. Via Eq. (7), an IPLUT is constructed, shown as the solid curve in Fig. 2. We note that the exact method for deriving this mapping can be any traditional curve-fitting technique, where missing intensity values in either the distorted or the ideal intensity map are interpolated from the available measurements. Also, after the intensities of the directly coded pattern are precompensated through the IPLUT, the new intensity response varies linearly, as illustrated by the dotted line in Fig. 2.
Using a nearly flat wall with uncorrected three-step PMP, the phase computed by Eq. (5) is severely distorted, as illustrated by the solid line in Fig. 3, which shows the cross-section of the phase error along the 320th column. After the IPLUT derived from the textured pillow is applied, the phase error is significantly reduced, as illustrated by the dashed line in Fig. 3. Comparing the RMS errors of the two, we see a 96.0% reduction, from 0.2409 to 0.0096. Over the whole valid area of the phase image, the RMS phase error is likewise reduced by 96.0%, from 0.2467 to 0.0098. Moreover, if we suppress noise by averaging 20 groups of scans as Liu et al. did in Ref. 1, the phase error is further reduced by a factor of 60.
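The relative reductions follow directly from the reported RMS values; a trivial check (values from the text, helper name hypothetical):

```python
def rms_reduction(before, after):
    """Relative RMS error reduction, as a percentage."""
    return 100.0 * (before - after) / before

print(round(rms_reduction(0.2409, 0.0096), 1))  # -> 96.0 (320th column)
print(round(rms_reduction(0.2467, 0.0098), 1))  # -> 96.0 (whole valid area)
```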
Now, using the phase computed from the intensity-precompensated three-step PMP patterns, we see a significant improvement in 3-D scan quality for the color-textured bear, as shown in Fig. 4(c), compared with the depth extracted by a first-generation Microsoft (Redmond, Washington) Kinect camera [Fig. 4(a)] and by our SLI system before the nonlinearity is precompensated [Fig. 4(b)], respectively. In Fig. 4(b), obvious wave-like nonlinear distortion appears in the 3-D reconstruction of the bear, and the RMS error along the depth direction is 0.5930, while in Fig. 4(c) such distortion is effectively corrected and the RMS error is decreased by 95.6%, to 0.0258.
In this letter, we propose a universal intensity precompensation algorithm to correct nonlinear distortion in an SLI system. With this algorithm, the intensities of the projected patterns are precompensated before projection such that no postprocessing is needed. Thus, we ultimately demonstrate that a minimum number of SLI patterns can be used without extensive correction procedures. Verified by three-step PMP, the RMS phase error is reduced by roughly 96% without noise suppression, and further when noise is suppressed. We believe the proposed algorithm will also benefit one-shot color-coding algorithms, which we leave as future work.
This work was sponsored, in part, by the research startup funding of Sichuan University, China, under contract #2082204164059, and by the Science and Technology Support Program of Sichuan Province, China, under contract #2014GZ0005.