18 February 2002 Dynamic composition of tracking primitives for interactive vision-guided navigation
Proceedings Volume 4573, Mobile Robots XVI; (2002)
Event: Intelligent Systems and Advanced Manufacturing, 2001, Boston, MA, United States
We present a system architecture for robust target following with a mobile robot. The system is based on tracking multiple cues in binocular stereo images using the XVision toolkit. Fusing complementary information from the images, including texture, color, and depth, combined with fast, optimized processing, reduces the risk of losing the tracked object in a dynamic scene with several moving targets on intersecting paths. The presented system is capable of detecting both objects obstructing its way and gaps in the terrain. This supports operation in cluttered environments, where a wheeled mobile robot cannot take the same path as a walking person. We describe the basic principles of fast feature extraction and tracking in the luminance, chrominance, and disparity domains. The optimized tracking algorithms compensate for illumination variations and perspective distortions, as presented in our previous publications on the XVision system.
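The fusion of complementary cues described above can be illustrated with a minimal sketch. This is not the authors' XVision code; the function name, the uniform default weighting, and the loss threshold are all illustrative assumptions. The idea is that each cue (texture, color, disparity) yields a normalized match score, and a weighted combination decides whether the target is still being tracked:

```python
def fuse_cues(scores, weights=None, loss_threshold=0.4):
    """Combine per-cue match scores in [0, 1] into one tracking confidence.

    scores  : dict mapping cue name (e.g. "texture", "color",
              "disparity") to a normalized match score in [0, 1]
    weights : optional dict of per-cue weights; defaults to uniform
    Returns (confidence, track_lost). The threshold 0.4 is an
    illustrative assumption, not a value from the paper.
    """
    if weights is None:
        weights = {cue: 1.0 for cue in scores}
    total = sum(weights[cue] for cue in scores)
    # Weighted average of the individual cue scores.
    confidence = sum(weights[cue] * s for cue, s in scores.items()) / total
    return confidence, confidence < loss_threshold
```

For example, `fuse_cues({"texture": 0.9, "color": 0.7, "disparity": 0.8})` yields a confidence of 0.8 and keeps the track alive; because the cues are complementary, a single degraded cue (say, a color match spoiled by an illumination change) does not by itself cause the target to be declared lost.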
© (2002) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Darius Burschka and Gregory D. Hager "Dynamic composition of tracking primitives for interactive vision-guided navigation", Proc. SPIE 4573, Mobile Robots XVI, (18 February 2002);
