While mobile robots and walking insects can use proprioceptive information (specialized receptors in the insect's legs, or wheel encoders in robots) to estimate distance traveled, flying agents have to rely mainly on visual cues. Experiments with bees provide evidence that flying insects use optical flow induced by egomotion to estimate distance traveled, and some details of this odometer have recently been unraveled. In this study, we propose a biologically inspired model of the bee's visual odometer based on Elementary Motion Detectors (EMDs), and present results from goal-directed navigation experiments with an autonomous flying robot platform that we developed specifically for this purpose. The robot is equipped with a panoramic vision system, which provides input to the EMDs of the left and right visual fields. The EMD outputs are then spatially integrated by wide-field motion detectors, and their accumulated response serves directly as the odometer signal. In an initial set of experiments, the robot moves through a corridor along a fixed route, and the odometer output is recorded. The results show that the proposed model can provide an estimate of the distance traveled, but its performance depends on the route the robot follows; this is biologically plausible, since insects tend to adopt a fixed route during foraging. Given these results, we hypothesized that the optomotor response plays an important role in goal-directed navigation, and we conducted experiments with an autonomous, freely flying robot. These experiments demonstrate that this computationally cheap mechanism can be successfully employed in natural indoor environments.
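The pipeline described above (correlation-type EMDs, wide-field spatial integration, temporal accumulation) can be illustrated with a minimal numerical sketch. This is not the paper's implementation; it assumes a classic delay-and-correlate Reichardt detector over a 1-D ring of photoreceptor intensities, and all function names and parameters here are illustrative.

```python
import numpy as np

def emd_response(intensities, delay=1):
    """Reichardt-correlator EMD responses (illustrative sketch).

    intensities: array of shape (T, N) -- T time steps, N photoreceptors.
    Returns array of shape (T - delay, N - 1) of direction-selective outputs.
    """
    delayed = intensities[:-delay]   # delayed (low-pass proxy) channel
    direct = intensities[delay:]     # undelayed channel
    # Correlate each receptor's delayed signal with its neighbour's
    # direct signal in both directions; the difference of the two
    # mirror-symmetric half-detectors is direction-selective.
    rightward = delayed[:, :-1] * direct[:, 1:]
    leftward = delayed[:, 1:] * direct[:, :-1]
    return rightward - leftward

def visual_odometer(intensities):
    """Accumulate the spatially integrated (wide-field) EMD output
    over time as a proxy for distance traveled."""
    emd = emd_response(intensities)
    wide_field = emd.sum(axis=1)     # wide-field motion detector per step
    return np.abs(wide_field).sum()  # accumulated response = odometer value
```

A sanity check with a sinusoidal grating drifting across the receptor array shows the accumulated response growing with the length of the traversal, which is the property the odometer exploits; because the response also depends on image contrast and spatial structure, the estimate is scene-dependent, consistent with the route dependence reported above.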