In the following, optical 3D measurement systems based on fringe projection techniques according to the principle of phasogrammetry are introduced. These self-calibrating measuring systems allow the automated measurement of complex objects. In combination with adequate software tools for data evaluation, a variety of different tasks can be performed in a production environment.
Optical 3D measurement systems are used in many applications, for instance quality control and reverse engineering. Active optical 3D measurement systems are very often based on the fringe projection technique. The key element of such a shape measurement system is the projection unit; nowadays, digital projection units based on LCD, LCoS, or DMD technology are used. Despite their many advantages, these displays exhibit some disadvantages for use in fringe projection, which can reduce the phase measurement accuracy.
In contrast to slide-like structures, these displays are divided into small picture elements -- so-called active pixels -- with dimensions of about 8×8 µm² up to 20×20 µm². For this reason, the cosine function to be projected is spatially quantized and therefore not continuous. Another problem is the reproduction of the correct intensities given by the cosine function in the projection. Usually, every projection engine exhibits some kind of gamma correction function or nonlinearity. This is caused either by the micro display itself or (in the case of video projectors) by the projector's electronics, which are designed to produce visually pleasing images. All these effects distort the intensity to be projected.
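The combined effect of spatial quantization and projector nonlinearity can be illustrated with a short simulation (a hypothetical sketch, not the measurement system's actual code; the function name and the 8-bit quantization depth are our assumptions):

```python
import numpy as np

def fringe_pattern(width, period, phase_shift, gamma=2.2):
    """Ideal cosine fringe, distorted by display quantization and gamma."""
    x = np.arange(width)
    ideal = 0.5 + 0.5 * np.cos(2 * np.pi * x / period + phase_shift)
    # spatial/intensity quantization by the digital micro-display (8-bit assumed)
    quantized = np.round(ideal * 255) / 255
    # projector nonlinearity modeled as a simple gamma curve
    return quantized ** gamma
```

With gamma = 1 the pattern keeps the ideal mean intensity of 0.5; any gamma > 1 compresses the dark half of the cosine and shifts the intensity distribution, which is exactly the distortion that later appears as a phase error.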
All the mentioned effects cause phase measurement errors of different strength. Here we discuss different approaches to determine and compensate for these influences of the projection engine on the 3D measurement. If the compensation is well accomplished, the accuracy of the measurement results may improve by up to one order of magnitude.
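One way such a compensation might work can be sketched as follows (a minimal simulation assuming a pure gamma-type nonlinearity and the standard 4-step phase-shift algorithm; the function names are ours, not the authors'):

```python
import numpy as np

def phase_4step(frames):
    """Standard 4-step phase-shift evaluation for shifts 0, pi/2, pi, 3pi/2."""
    I0, I1, I2, I3 = frames
    return np.arctan2(I3 - I1, I0 - I2)

# simulate captures distorted by an assumed gamma nonlinearity ...
gamma = 2.2
x = np.linspace(0.0, 4.0 * np.pi, 500)          # true phase along one line
shifts = (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)
captured = [(0.5 + 0.5 * np.cos(x + s)) ** gamma for s in shifts]
# ... and compensate them by applying the inverse of the determined gamma curve
compensated = [frame ** (1.0 / gamma) for frame in captured]

def max_phase_error(frames):
    """Maximum absolute deviation of the evaluated phase from the true phase."""
    diff = phase_4step(frames) - x
    return np.abs(np.angle(np.exp(1j * diff))).max()  # wrap to [-pi, pi]
```

In this sketch the uncompensated frames show a periodic phase error caused by the harmonics that the gamma curve adds to the cosine, while the inverse-gamma lookup brings the error down to numerical noise.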
Procedures for measuring the three-dimensional shape of objects using fringe projection have been well known for many years. Such systems are suitable, for instance, for quality control and reverse engineering. For recognition and visualization of objects it is necessary to obtain not only the shape but also the color and/or the texture of the object's surface. Furthermore, in quality control tasks such as defect identification, the color information of the surface can be useful. However, using color cameras to capture all data suffers from lower resolution or lower frame rates and may cause problems when measuring the shape with single-chip cameras, because different colors are recognized by different pixels looking at different points of the surface. This causes measurement errors. Here we propose a method based on the fringe projection technique that determines the 3D coordinates as well as the color at all measurement points while using only black-and-white cameras. The main advantage of this method is that color and shape are obtained together with the same (high-resolution) camera and that the shape and color information are combined without any matching procedure, within the same measurement. For this, the object is illuminated within the same projection sequence with unstructured full-frame colored light, e.g. red, green, blue, and the well-known fringe patterns. As the final result, one obtains at each measurement point three coordinate values describing the position and three color values describing the true color.
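Because the color frames and the fringe frames are recorded by the same black-and-white camera, combining them reduces to a per-pixel stacking operation rather than a registration problem. A minimal sketch (array shapes and the function name are our assumptions, not the paper's implementation):

```python
import numpy as np

def assemble_colored_points(xyz, red, green, blue):
    """xyz: (H, W, 3) coordinates from the fringe-sequence evaluation;
    red/green/blue: (H, W) monochrome frames captured under full-frame
    red, green, and blue illumination within the same projection sequence.
    Returns an (H*W, 6) array of x, y, z, r, g, b per measurement point."""
    # same pixel grid for shape and color -> no matching procedure needed
    color = np.stack([red, green, blue], axis=-1)
    return np.concatenate([xyz, color], axis=-1).reshape(-1, 6)
```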
For measuring the 3D shape of complex objects by optical methods, the optical sensor or the object has to be moved into multiple, overlapping measuring positions in order to view the entire surface. The resulting point clouds taken from the different views then have to be merged into a common coordinate system to obtain the final complete 3D view. Here we propose concepts for 3D measurement arrangements using structured-light illumination with a digital-light projection unit that obtain a full-body view within a self-calibrating measurement strategy, where the necessary merging of the single views takes place fully automatically, without any markers on the object surface, object features, additional merging procedures, or a high-accuracy object/sensor handling system. On the basis of this strategy, different mobile and stationary arrangements are proposed and realized. A first integration into an industrial process will be presented, demonstrating the power of this concept by measuring the complete 3D shape of automotive parts and design objects within volumes from 1 dm³ up to 1 m³. The measurements with this system showed a coordinate measurement accuracy of up to 10⁻⁵ of the field size.
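Once the self-calibration has delivered the pose of each view, merging the point clouds reduces to applying one rigid transform per view. A minimal sketch (the data layout is an assumption for illustration, not taken from the actual system):

```python
import numpy as np

def merge_views(views):
    """views: iterable of (points, R, t), where points is an (N, 3) array in
    the view's local frame, and R (3x3 rotation) and t (3-vector translation)
    come from the self-calibration. Returns all points in the common frame."""
    # p_common = R @ p_local + t, applied row-wise via p_local @ R.T
    return np.vstack([pts @ R.T + t for pts, R, t in views])
```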
A self-calibrating 3D measurement system using structured-light illumination with a digital-light projection unit (DMD) will be reported, which ensures a high number of object points, quick data acquisition, and the simultaneous determination of coordinates and system parameters (self-calibration), making the system completely insensitive to environmental changes. Furthermore, no markers on the object surface are necessary, and no subsequent matching of the single views is required to obtain a full-body measurement. To this end, the object under test is successively illuminated with two grating sequences perpendicular to each other from different directions, resulting in surplus phase values for each measurement point. On the basis of these phase values, the orientation parameters as well as the 3D coordinates can be calculated online. Two different measurement set-ups will be reported, which are able to measure the entire surface (full-body measurement).
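The surplus phase values mean that each object point is constrained by more observation rays than the minimum needed for triangulation. The least-squares intersection of such an overdetermined set of rays can be sketched as follows (a generic textbook formulation, not the authors' implementation):

```python
import numpy as np

def intersect_rays(origins, directions):
    """Least-squares point closest to a set of rays given by origin and
    direction. With surplus phase values, each measurement point is fixed
    by more rays than necessary; this redundancy is what enables the
    simultaneous self-calibration."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to the ray
        A += P
        b += P @ np.asarray(o, dtype=float)
    # minimizes the sum of squared distances from the point to all rays
    return np.linalg.solve(A, b)
```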