Stereo vision is generally used to obtain 3D information in traditional three-dimensional measurement. At least two cameras are calibrated in advance, and resection is then performed to obtain three-dimensional coordinates. Obtaining 3D information therefore requires at least two cameras (or two views), because a single camera can only capture 2D information. In addition, only the 3D spatial position at the moment an image is captured can be recovered. When measuring the 3D miss distance of a weapon in high-velocity motion, such as a missile, it is hard to capture the exact frame in which the weapon touches the target because of the camera's limited frame rate (fps). Hence, only the position of the moment before impact can be obtained, which introduces error into the miss distance estimate. In this paper, a fast miss distance estimation method is proposed using a shadow and a single view (i.e., a single camera). The method uses only one camera and exploits the fact that the intersection of the axes of the weapon and its shadow is the image projection of the point at which the weapon touches the target. Since the proposed method does not need to capture the frame at the moment of impact, it does not require a high frame rate, which widens the range of usable cameras. Experimental results indicate that, compared with traditional stereo vision, the proposed method has better accuracy, numerical stability, and computational speed for miss distance estimation.
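The abstract's key geometric observation is that the weapon axis and its shadow axis, each a line in the image, intersect at the projection of the impact point. A minimal sketch of that intersection computation using homogeneous image coordinates is below; the function name, inputs, and tolerance are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def axis_intersection(p1, p2, q1, q2):
    """Intersect the image line through p1, p2 (weapon axis) with the
    image line through q1, q2 (shadow axis) using homogeneous
    coordinates. Returns the 2D intersection, or None if the lines
    are parallel in the image. Hypothetical helper for illustration."""
    def to_h(p):
        # Lift a 2D pixel coordinate to homogeneous coordinates.
        return np.array([p[0], p[1], 1.0])

    l1 = np.cross(to_h(p1), to_h(p2))   # line through weapon endpoints
    l2 = np.cross(to_h(q1), to_h(q2))   # line through shadow endpoints
    x = np.cross(l1, l2)                # homogeneous intersection point
    if abs(x[2]) < 1e-12:
        return None                     # parallel lines: no finite point
    return x[:2] / x[2]
```

With both axes estimated from any frame before impact, this intersection can be computed without ever observing the impact frame itself, which is the property the abstract relies on to relax the frame-rate requirement.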
KEYWORDS: 3D acquisition, Calibration, Cameras, 3D metrology, 3D image processing, Photogrammetry, Imaging systems, 3D vision, Covariance matrices, Stereoscopy
Photogrammetry with stereo vision is widely used in computer vision and SLAM (simultaneous localization and mapping); its key steps are calibration and intersection measurement. Calibration obtains the intrinsic and extrinsic parameters, including the principal point, focal length, and pose. Intersection measurement obtains 3D information after calibration, including position, velocity, and rotation. In some cases, such as visual monitoring cameras (VMCs), photogrammetry uses a large field of view and is characterized by a long camera-to-target distance and a wide measuring range, which increases the difficulty of calibration and prevents 3D control points from being placed arbitrarily. Moreover, the distance from the target area to the 3D control-point area strongly affects the accuracy of intersection measurement. In this paper, we propose a new method for placing 3D control points that covers both planar and non-planar scenes and can distinguish between the two, so that the appropriate planar or non-planar calibration method can be applied in each case. In addition, we analyze the layout of the 3D control points to relate the measuring accuracy to the distance from the target area to the control-point area. Experimental results on synthetic data and real images show that the longer this distance, the greater the measuring error, and that to improve measuring accuracy the 3D control points should be strictly planar or strictly non-planar, not quasi-planar.
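The abstract distinguishes planar from non-planar control-point layouts and warns against quasi-planar ones. One standard way to make that distinction, sketched below under the assumption of a simple singular-value criterion (the threshold and function names are illustrative, not the paper's actual test), is to compare the smallest and largest singular values of the centered point cloud.

```python
import numpy as np

def planarity_ratio(points):
    """Ratio of smallest to largest singular value of the centered
    3D control-point cloud. Near zero for planar layouts, larger for
    well-spread non-planar ones. Illustrative sketch only."""
    P = np.asarray(points, dtype=float)
    C = P - P.mean(axis=0)                      # center the points
    s = np.linalg.svd(C, compute_uv=False)      # singular values, descending
    return s[-1] / s[0]

def is_planar(points, tol=1e-3):
    """Classify a layout as planar when the out-of-plane extent is
    negligible. The tolerance is an assumed example value."""
    return planarity_ratio(points) < tol
```

A ratio that is small but not negligible would flag the quasi-planar case the abstract says to avoid, since neither the planar nor the non-planar calibration model then fits well.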
This paper proposes a robust feature matching method for slant SAR images based on differentially constrained RANSAC. First, we obtain a set of initial matches using the SAR-SIFT operator. We then build a differentially constrained model with a strong constraint in the azimuth direction and no constraint in the range direction, and use the nearest neighbor contrast (NNC) and distance offset definition (DOD) techniques to eliminate outliers with large distance distortion. Experiments are carried out on Chinese spaceborne and airborne SAR images. The results show that the proposed method performs well in accuracy, match distribution, and efficiency, and is suitable for matching SAR images of areas with large terrain fluctuations, such as mountainous regions.
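The core idea of the differential constraint, as the abstract describes it, is that the model is enforced strictly in azimuth but left free in range, where terrain relief distorts SAR geometry. A simplified RANSAC sketch of that asymmetry is below, assuming a translation-only hypothesis and an example azimuth tolerance; the NNC and DOD steps, and the paper's actual model, are not reproduced here.

```python
import numpy as np

def azimuth_constrained_inliers(src, dst, shift, az_tol):
    """Inlier test enforced only in the azimuth direction (column 0):
    range residuals (column 1) are deliberately ignored, mimicking a
    strong azimuth constraint with no range constraint. Sketch only."""
    residual = dst - (src + shift)
    return np.abs(residual[:, 0]) < az_tol

def ransac_azimuth(src, dst, iters=200, az_tol=1.0, rng=None):
    """Toy RANSAC over (azimuth, range) match coordinates with a
    one-point translation hypothesis. Parameters are assumed values."""
    rng = np.random.default_rng(rng)
    best_shift, best_mask = None, None
    for _ in range(iters):
        i = rng.integers(len(src))
        shift = dst[i] - src[i]                 # one-point hypothesis
        mask = azimuth_constrained_inliers(src, dst, shift, az_tol)
        if best_mask is None or mask.sum() > best_mask.sum():
            best_shift, best_mask = shift, mask
    return best_shift, best_mask
```

Because only azimuth residuals vote, matches displaced by terrain-induced range distortion can still be retained as inliers, which is the behavior the abstract targets for mountainous areas.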