Self-calibration algorithms for cameras and laser range finders
25 October 2004
Autonomous systems that navigate through unknown and unstructured environments must solve the ego-motion estimation problem. Fusing the information from many different sensors makes this motion estimation more stable, but requires that the relative position and orientation of the sensors be known. Self-calibration algorithms are especially well suited to this calibration problem because they do not require any known features in the environment and can be used during system operation. Here we give geometric constraints, the coherent motion constraints, that provide a framework for the development of self-calibration algorithms for a heterogeneous sensor system (such as cameras, laser range finders, and odometry). If, for each sensor, a conditional probability density function can be defined that relates sensor measurements to the sensor motion, then the coherent motion constraints allow a maximum likelihood formulation of the sensor calibration problem. We present complete algorithms for the case of a camera and a laser range finder, for both discrete and differential motions.
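The abstract does not spell out the coherent motion constraints, but for discrete motions a rigid camera–laser mount implies a hand-eye-style relation: if the camera undergoes rotation R_A and the laser undergoes R_B over the same interval, the unknown inter-sensor rotation X satisfies R_A X = X R_B. A minimal NumPy sketch of recovering X from that constraint is below; the simulated motions, function names, and the axis-alignment (orthogonal Procrustes) solution method are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def rot(axis, angle):
    # Rodrigues' formula: rotation by `angle` about a (unit) axis.
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def rotation_axis(R):
    # Axis from the skew-symmetric part of R (sign-consistent for 0 < angle < pi).
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return w / np.linalg.norm(w)

# Ground-truth camera-to-laser rotation (to be recovered) -- illustrative only.
X_true = rot([1.0, 2.0, 3.0], 0.7)

# Simulate discrete motions: camera rotations R_A and the laser rotations
# they imply under a rigid mount, R_B = X^T R_A X.
axes = [[1, 0, 0], [0, 1, 0], [1, 1, 1]]
angles = [0.4, 0.9, 1.3]
R_A = [rot(ax, th) for ax, th in zip(axes, angles)]
R_B = [X_true.T @ R @ X_true for R in R_A]

# The constraint maps rotation axes: axis(R_A) = X axis(R_B), so X is the
# orthogonal Procrustes (Kabsch) alignment of the two sets of axes.
A = np.stack([rotation_axis(R) for R in R_A], axis=1)  # 3 x N
B = np.stack([rotation_axis(R) for R in R_B], axis=1)
U, _, Vt = np.linalg.svd(A @ B.T)
D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against reflections
X_est = U @ D @ Vt

print(np.allclose(X_est, X_true, atol=1e-6))
```

With noiseless data and at least two non-parallel rotation axes, the alignment recovers X exactly; with noisy real measurements, the same least-squares formulation averages over many motion pairs, which is where a probabilistic (maximum likelihood) treatment of each sensor's motion estimate becomes useful.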
© (2004) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Qilong Zhang, Robert B. Pless, "Self-calibration algorithms for cameras and laser range finders", Proc. SPIE 5608, Intelligent Robots and Computer Vision XXII: Algorithms, Techniques, and Active Vision, (25 October 2004); https://doi.org/10.1117/12.571546


