This paper presents a pose estimation method based on a 3D camera, the SwissRanger SR4000. The proposed method
estimates the camera's ego-motion from the intensity and range data produced by the camera. It detects SIFT (Scale-
Invariant Feature Transform) features in one intensity image and matches them to those in the next intensity image. The
resulting pairs of 3D data points are used to compute the least-squares rotation and translation, from which the
attitude and position changes between the two image frames are determined. Because the method matches features by their
descriptors, it works well even with large image motion between frames and requires no spatial correlation search. Owing
to the SR4000's consistent depth-measurement accuracy, the proposed method may achieve better pose estimation accuracy
than a stereovision-based approach. Another advantage of the proposed method is that the SR4000's range data are
complete and can therefore also be used for obstacle avoidance/negotiation. This makes it possible to navigate a mobile
robot with a single perception sensor. In this paper, we validate the pose estimation method and characterize its
pose estimation performance.
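The least-squares rotation and translation between two sets of matched 3D points is typically solved in closed form via the singular value decomposition (the method of Arun et al.). The abstract does not state which solver the authors use, so the following Python sketch is one standard implementation, assuming `p` and `q` are (N, 3) arrays of matched 3D points from two consecutive frames:

```python
import numpy as np

def estimate_rigid_transform(p, q):
    """Least-squares R, t such that q_i ~ R @ p_i + t, via SVD
    (Arun et al.'s method). p, q: (N, 3) arrays of matched 3D points.
    Hypothetical helper; not from the paper itself."""
    p_mean, q_mean = p.mean(axis=0), q.mean(axis=0)
    p_c, q_c = p - p_mean, q - q_mean          # centered point sets
    H = p_c.T @ q_c                            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # optimal rotation
    t = q_mean - R @ p_mean                    # optimal translation
    return R, t
```

In a visual odometry loop of the kind described above, `R` and `t` give the attitude and position change between the two frames; in practice the point pairs would first be filtered for mismatches (e.g. with RANSAC) before this closed-form step.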
Cang Ye, Michael Bruch, "A visual odometry method based on the SwissRanger SR4000," Proc. SPIE 7692, Unmanned Systems Technology XII, 76921I (7 May 2010); https://doi.org/10.1117/12.850349