Compared with the traditional camera-type eye of mammals, including humans, the compound vision system of the common house fly, Musca domestica, is capable of enhanced motion detection and tracking. Most computer vision systems today offer higher spatial resolution but weaker motion detection and tracking capabilities than compound vision. In applications requiring obstacle avoidance and rapid visual data processing, compound vision is therefore better suited than conventional mammalian-inspired vision systems. Proof-of-concept work has shown that, even without a computer processing system, compound vision sensors can mimic the motion hyperacuity characteristic of the fly's visual system while simultaneously providing near-instantaneous edge detection and motion tracking. While these early prototypes were successfully implemented with device-level analog components, we were interested in determining whether a similar system could be implemented as an embedded system. The first step in this process is determining the best way to digitize the signals without losing the key feature of compound vision: motion hyperacuity.