16 October 2000 Navigation in an autonomous flying robot by using a biologically inspired visual odometer
Proceedings Volume 4196, Sensor Fusion and Decentralized Control in Robotic Systems III; (2000)
Event: Intelligent Systems and Smart Manufacturing, 2000, Boston, MA, United States
While mobile robots and walking insects can use proprioceptive information (specialized receptors in the insects' legs, or wheel encoders in robots) to estimate distance traveled, flying agents have to rely mainly on visual cues. Experiments with bees provide evidence that flying insects might use the optical flow induced by egomotion to estimate distance traveled, and some details of this odometer have recently been unraveled. In this study, we propose a biologically inspired model of the bee's visual odometer based on Elementary Motion Detectors (EMDs), and present results from goal-directed navigation experiments with an autonomous flying robot platform that we developed specifically for this purpose. The robot is equipped with a panoramic vision system, which provides input to the EMDs of the left and right visual fields. The EMD outputs are subsequently integrated spatially by wide-field motion detectors, and their accumulated response is used directly as the odometer reading. In an initial set of experiments, the robot moves through a corridor along a fixed route while the EMD outputs and the odometer readings are recorded. The results show that the proposed model can provide an estimate of the distance traveled, but its performance depends on the route the robot follows; this is biologically plausible, since natural insects tend to adopt a fixed route during foraging. Given these results, we assumed that the optomotor response plays an important role in the context of goal-directed navigation, and we conducted experiments with an autonomous, freely flying robot. These experiments demonstrate that this computationally cheap mechanism can be successfully employed in natural indoor environments.
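The pipeline described in the abstract (EMDs over left and right panoramic visual fields, wide-field spatial integration, temporal accumulation) can be sketched in a few lines. The code below is a minimal illustration, not the authors' implementation: it assumes a delay-and-correlate Reichardt detector on a 1-D strip of panoramic intensities, with a pure one-frame delay standing in for the low-pass filter of a full EMD, and splits the strip in half to stand in for the two eyes.

```python
import numpy as np

def emd_responses(frames, delay=1):
    """Reichardt-style Elementary Motion Detectors on a 1-D image strip.

    frames: (T, N) array of panoramic intensity samples over time.
    Each EMD correlates a pixel's delayed signal with its neighbour's
    current signal in both directions; the difference is a signed,
    direction-selective motion estimate.
    """
    delayed = frames[:-delay]   # pure delay stands in for a low-pass filter
    current = frames[delay:]
    rightward = delayed[:, :-1] * current[:, 1:]   # motion toward +x
    leftward = delayed[:, 1:] * current[:, :-1]    # motion toward -x
    return rightward - leftward                    # shape (T - delay, N - 1)

def visual_odometer(frames):
    """Accumulate wide-field-integrated EMD output as a distance proxy."""
    emd = emd_responses(frames)
    n = emd.shape[1] // 2
    left_field = emd[:, :n].sum(axis=1)    # wide-field integration, left eye
    right_field = emd[:, n:].sum(axis=1)   # wide-field integration, right eye
    # accumulated magnitude of the integrated responses = odometer reading
    return float(np.sum(np.abs(left_field) + np.abs(right_field)))
```

With a drifting sinusoidal grating as input, a faster image drift (closer walls or faster flight) yields a larger accumulated reading, which mirrors the route dependence the abstract reports: the same physical distance produces different odometer values along different routes.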
© (2000) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Fumiya Iida and Dimitrios Lambrinos "Navigation in an autonomous flying robot by using a biologically inspired visual odometer", Proc. SPIE 4196, Sensor Fusion and Decentralized Control in Robotic Systems III, (16 October 2000);

