To a large extent, the mapping between the camera and projector images determines the precision of surface
reconstruction in digital close-range photogrammetry. In this paper, a new method is presented to achieve sub-pixel
mapping between the camera and projector images. Instead of mapping each stripe from the camera image to the projector
image at pixel precision, the proposed method collects the set of camera pixels that share the same decoded number and
computes their barycenter, which is then mapped onto the corresponding pixel in the projector image. In most cases, the
barycenter can be calculated with sub-pixel precision. Compared with existing approaches that directly map stripes from
the camera image to the projector image, the proposed method achieves higher accuracy in mapping the points and thus
better surface reconstruction. Experimental results are presented to demonstrate the effectiveness of the proposed
method in improving the accuracy of shape reconstruction.
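The barycenter idea described above can be sketched briefly. The following is an illustrative example, not the authors' implementation: for each decoded stripe number, the mean row/column of all camera pixels sharing that code is taken as a sub-pixel estimate of the stripe position (the function name and the use of -1 for invalid pixels are assumptions).

```python
import numpy as np

def subpixel_barycenters(decoded):
    """For each decoded stripe number, compute the barycenter (mean
    row/column) of all camera pixels sharing that code. The barycenter
    serves as a sub-pixel estimate of the stripe position on the camera
    image. `decoded` is a 2D integer array of decoded stripe numbers,
    with -1 marking pixels that carry no valid code (an assumed
    convention for this sketch)."""
    centers = {}
    for code in np.unique(decoded):
        if code < 0:                      # skip invalid pixels
            continue
        rows, cols = np.nonzero(decoded == code)
        centers[int(code)] = (rows.mean(), cols.mean())  # sub-pixel barycenter
    return centers

# Toy example: a 4x4 decoded map containing two stripe codes.
decoded = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1],
                    [0, 0, 1, 1],
                    [0, 0, 1, 1]])
centers = subpixel_barycenters(decoded)
```

Even though every input coordinate is an integer, the averaged barycenters land between pixel centers, which is the source of the sub-pixel precision claimed above.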
To perform 3D measurement and camera attitude estimation simultaneously, an efficient and robust method based on the trifocal tensor is proposed in this paper, which employs only the intrinsic parameters and positions of three cameras. The initial trifocal tensor is obtained using the heteroscedastic errors-in-variables (HEIV) estimator, and the initial relative poses of the three cameras are acquired by decomposing the tensor. The initial attitudes of the cameras are then obtained from the known positions of the three cameras. Next, the camera attitudes and the image positions of the points of interest are optimized under the trifocal-tensor constraint with the HEIV method. Finally, the spatial positions of the points are obtained by intersection. Both simulation and real-image experiments suggest that the proposed method achieves the same precision as the bundle adjustment (BA) method while being more efficient.
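The final intersection step mentioned above can be sketched with a standard linear (DLT) triangulation, shown here under simplifying assumptions (known 3x4 projection matrices, no noise); this is a generic illustration of intersection measurement, not the authors' code:

```python
import numpy as np

def triangulate(proj_mats, img_pts):
    """Linear intersection (DLT triangulation): recover a 3D point from
    its image positions in several calibrated views. `proj_mats` is a
    list of 3x4 projection matrices; `img_pts` holds the matching (u, v)
    image coordinates. The least-squares solution is the right singular
    vector of the stacked constraint matrix."""
    A = []
    for P, (u, v) in zip(proj_mats, img_pts):
        A.append(u * P[2] - P[0])   # each view contributes two
        A.append(v * P[2] - P[1])   # linear equations in X
    _, _, Vt = np.linalg.svd(np.asarray(A))
    X = Vt[-1]
    return X[:3] / X[3]             # dehomogenize

# Toy check: three translated identity cameras observing one point.
X_true = np.array([1.0, 2.0, 5.0, 1.0])
Ps = [np.hstack([np.eye(3), t.reshape(3, 1)]) for t in
      (np.zeros(3), np.array([-1.0, 0.0, 0.0]), np.array([0.0, -1.0, 0.0]))]
pts = [(P @ X_true)[:2] / (P @ X_true)[2] for P in Ps]
X_est = triangulate(Ps, pts)
```

With noisy image points, the SVD solution minimizes an algebraic rather than geometric error, which is why the abstract's HEIV-based refinement of the image positions before intersection is beneficial.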
When videogrammetry (optical measurement) is carried out outdoors or under harsh indoor conditions, the results are inevitably affected by atmospheric turbulence, and the surveying precision is degraded. The impact of atmospheric turbulence on optical measurement was long neglected by researchers, with most work concentrated on laser optics and optical communications. When the pixel wandering cannot be rejected in engineering applications, the most commonly adopted remedy is noise filtering, which brings little improvement under usual conditions. This paper presents the principles by which atmospheric turbulence influences optical measurement, and experimental data and applications are reported to demonstrate its impact. Combined with related research, several essential issues in, and prospects for, atmospheric turbulence research are proposed.