Auto-preview camera orientation for environment perception on a mobile robot
18 January 2010
Using wide-angle or omnidirectional camera lenses to increase a mobile robot's field of view introduces nonlinearity in the image due to the 'fish-eye' effect, which complicates distance perception and increases image processing overhead. Using multiple cameras avoids the fish-eye complications, but requires more electrical and processing power to interface them to a computer. By controlling the orientation of a single camera, both of these disadvantages are minimized while still allowing the robot to preview a wider area. In addition, controlling the orientation allows the robot to optimize its environment perception by looking only where the most useful information can be found. In this paper, a technique is presented that creates a two-dimensional map of objects of interest surrounding a mobile robot equipped with a panning camera on a telescoping shaft. Before attempting to negotiate a difficult path planning situation, the robot takes snapshots at different camera heights and pan angles and then produces a single map of the surrounding area. Distance perception is performed by making calibration measurements of the camera and applying coordinate transformations to project the camera's findings into the vehicle's coordinate frame. To test the system, obstacles and lines were placed to form a chicane. Several snapshots were taken with different camera orientations, and the information from each was stitched together to yield a map of the surrounding area that the robot could then use to plan a path through the chicane.
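As a rough sketch of the map-building idea described above (the function names, the 2-D ground-plane simplification, and the zero mount offset default are assumptions for illustration, not the authors' implementation), projecting a detection from the panning camera's frame into the vehicle's coordinate frame amounts to a rotation by the pan angle plus a translation by the camera's mount offset, after calibration has already mapped pixels to ground-plane distances:

```python
import numpy as np

def camera_to_vehicle(point_cam, pan_angle_rad, cam_offset=(0.0, 0.0)):
    """Project a ground-plane point (x, y) in metres from the camera
    frame into the vehicle frame.  Calibration is assumed to have
    already converted pixel coordinates to metric ground distances,
    absorbing the effect of the telescoping shaft's height."""
    c, s = np.cos(pan_angle_rad), np.sin(pan_angle_rad)
    R = np.array([[c, -s],
                  [s,  c]])              # rotation by the pan angle
    t = np.array(cam_offset)             # camera mount offset on the vehicle
    return R @ np.array(point_cam) + t

def build_map(snapshots):
    """Stitch detections from several snapshots into one vehicle-frame map.
    snapshots: list of (pan_angle_rad, [points_cam]) tuples."""
    vehicle_points = []
    for pan, pts in snapshots:
        vehicle_points.extend(camera_to_vehicle(p, pan) for p in pts)
    return np.array(vehicle_points)
```

For example, an obstacle 1 m straight ahead of the camera appears at (1, 0) in the vehicle frame when the camera faces forward, but at (0, 1) when the camera is panned 90 degrees to the left.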
© (2010) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Micho Radovnikovich, Pavan K. Vempaty, Ka C. Cheok, "Auto-preview camera orientation for environment perception on a mobile robot", Proc. SPIE 7539, Intelligent Robots and Computer Vision XXVII: Algorithms and Techniques, 75390Q (18 January 2010); doi: 10.1117/12.839139


