Optical time-of-flight (TOF) distance measurements can be performed using so-called smart lock-in pixels. By sampling the optical signal two, four, or n times in each pixel, synchronously with the modulation frequency, the phase between the emitted and reflected signal is extracted and the object's distance is determined. The high integration level of such lock-in pixels enables real-time acquisition of the three-dimensional environment without any moving mechanical components. A novel design of a 2-tap lock-in pixel in a 0.6 μm semiconductor technology is presented. The pixel was implemented on a sensor with QCIF resolution. The optimized pixel design allows for high-speed operation of the device, resulting in nearly optimum demodulation performance and precise distance measurements that are almost exclusively limited by photon shot noise. In-pixel background-light suppression allows the sensor to be operated in outdoor environments with sunlight incidence. The highly complex pixel functionality of the sensor was successfully demonstrated in the new SwissRanger SR3000 3D-TOF camera design. Distance resolutions in the millimeter range have been achieved while the camera operates at frame rates of more than 20 Hz.
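The sampling scheme described above can be sketched as follows. This is the standard 4-tap lock-in demodulation (phase via atan2 of sample differences); the 20 MHz modulation frequency is an assumed typical value, not one stated in this abstract:

```python
import math

C = 3.0e8      # speed of light (m/s)
F_MOD = 20e6   # modulation frequency (Hz); assumed typical value

def demodulate_4tap(a0, a1, a2, a3):
    """Recover phase, amplitude, and offset from four samples taken
    90 degrees apart within one modulation period."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    amplitude = 0.5 * math.hypot(a3 - a1, a0 - a2)
    offset = (a0 + a1 + a2 + a3) / 4.0
    return phase, amplitude, offset

def phase_to_distance(phase, f_mod=F_MOD):
    """A full 2*pi phase shift spans the non-ambiguity range c/(2*f_mod)."""
    return phase * C / (4.0 * math.pi * f_mod)
```

At 20 MHz the non-ambiguity range is c/(2f) = 7.5 m, which matches the few-meter operating ranges discussed below.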
The time-of-flight (TOF) principle is a well-established method for acquiring a scene in all three dimensions. The benefits of knowing the third dimension are obvious for many kinds of applications: the distance information within the scene renders automatic systems more robust and much less complex, or even enables completely new solutions. A solid-state image sensor containing 124 × 160 pixels and the corresponding 3D camera, the so-called SwissRanger camera, have already been presented in detail in earlier work. It has been shown that the SwissRanger camera achieves depth resolutions in the sub-centimeter range, corresponding to a measured time resolution of a few tens of picoseconds with respect to the speed of light (c ≈ 3·10⁸ m/s).
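The quoted time resolution follows from simple arithmetic: a depth error Δd corresponds to a round-trip timing error of 2Δd/c, since the light travels to the target and back. A minimal check:

```python
C = 3.0e8  # speed of light, m/s

def depth_to_round_trip_time(delta_d):
    """Round-trip timing error corresponding to a depth error delta_d (meters)."""
    return 2.0 * delta_d / C

# 1 cm of depth corresponds to roughly 67 ps of round-trip time,
# consistent with the "few tens of picoseconds" figure above.
dt = depth_to_round_trip_time(0.01)
```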
However, one main drawback of these so-called lock-in TOF pixels is their limited capacity to handle background illumination. Keeping in mind that in outdoor applications the optical power on the sensor originating from background illumination (e.g., sunlight) may be up to a few hundred times higher than the power of the modulated illumination, the sensor requires new pixel structures that eliminate, or at least reduce, the current restrictions on background illumination.
Based on a 0.6 µm CMOS/CCD technology, four new pixel architectures that suppress background illumination and/or improve the ratio of modulated signal to background signal at the pixel-output level were developed and are presented in this paper. The theoretical principle of operation and the expected performance are described in detail, together with a sketch of the silicon-level implementation of the different pixel designs. Furthermore, test results obtained in a laboratory environment are presented. The sensor structures are characterized under strong background light, up to sunlight conditions. The distance linearity over a range of a few meters is measured under these lighting conditions. In addition, the distance resolution is plotted as a function of the target distance, the integration time, and the background illumination power. This in-depth evaluation leads to a comparison of the various background-suppression approaches; it also includes a comparison with the traditional pixel structure in order to highlight the benefits of the new approaches.
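As a rough illustration of why strong background light degrades shot-noise-limited distance precision even when it is perfectly subtracted on average, the following Monte Carlo sketch estimates the spread of a 4-tap phase estimate under Poisson-like noise (approximated as Gaussian with variance equal to the mean count). All photoelectron counts are hypothetical illustrative values, not measured sensor data:

```python
import math
import random
import statistics

def simulate_phase_sigma(signal_amp, background, n_trials=2000, true_phase=0.7):
    """Std. deviation of the 4-tap phase estimate when each sample carries
    shot noise with variance equal to its mean electron count."""
    rng = random.Random(42)  # fixed seed for reproducibility
    estimates = []
    for _ in range(n_trials):
        samples = []
        for i in range(4):
            mean = background + signal_amp * (1 + math.cos(true_phase + i * math.pi / 2))
            samples.append(rng.gauss(mean, math.sqrt(mean)))
        a0, a1, a2, a3 = samples
        estimates.append(math.atan2(a3 - a1, a0 - a2))
    return statistics.stdev(estimates)

# Same modulated signal, two background levels: more background light
# means more shot noise, hence a larger phase (and distance) error.
low_bg  = simulate_phase_sigma(signal_amp=500, background=1_000)
high_bg = simulate_phase_sigma(signal_amp=500, background=100_000)
```

This is why the in-pixel background suppression matters: removing the background charge before it fills the integration nodes both preserves dynamic range and keeps the shot-noise contribution of the background from dominating.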
The paper concludes with parameter estimates that enable an outlook toward a sensor with high lateral resolution based on the most promising pixel structure.
A new pixel structure for the demodulation of intensity-modulated light waves is presented. The integration of such pixels in line and area array sensors finds application in time-of-flight three-dimensional imaging. In 3D range imaging, an illumination module sends a modulated optical signal to a target, where it is reflected back to the sensor. The phase shift of the reflected signal compared to the emitted signal is proportional to the distance to one point of the target. The detection and demodulation of the signal are performed by a new pixel structure named the drift field pixel. The sampling process is based on the fast separation of photogenerated charge by lateral electric fields below a highly resistive, transparent poly-Si photogate. Because the dominant charge-transfer mechanism is drift, rather than diffusion as in conventional CCD pixels, much higher modulation frequencies of up to 1 GHz are possible, with a correspondingly higher ultimate distance accuracy. First measurements performed with a prototype pixel array of 3×3 pixels in a 0.8 µm technology confirm the suitability of the pixels for applications in the field of 3D imaging. Depth accuracies in the sub-centimeter range have already been achieved.
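The drift-versus-diffusion argument can be made quantitative with an order-of-magnitude estimate: diffusion across a gate of length L takes roughly L²/(2D), while drift in a lateral field E takes L/(μE), with D and μ linked by the Einstein relation D = μ·kT/q. The gate length, field strength, and mobility below are assumed textbook-style values for illustration, not figures from the device:

```python
# Order-of-magnitude sketch with assumed illustrative values (not device data).
MU_N   = 0.135    # electron mobility in Si, m^2/(V*s) (~1350 cm^2/(V*s))
KT_Q   = 0.0259   # thermal voltage at 300 K, V
L_GATE = 10e-6    # charge transfer length, m (assumed)
E_LAT  = 1e5      # lateral drift field, V/m (1 V across 10 um, assumed)

# Diffusion-limited transfer time ~ L^2 / (2D), with D = mu * kT/q
t_diffusion = L_GATE**2 / (2 * MU_N * KT_Q)

# Drift-limited transfer time ~ L / (mu * E)
t_drift = L_GATE / (MU_N * E_LAT)

speedup = t_diffusion / t_drift
```

With these assumed numbers, diffusion takes on the order of 10 ns while drift takes well under a nanosecond, which is what makes modulation frequencies approaching 1 GHz plausible for a drift-field structure.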