To carry out complex tasks, such as reconnaissance missions in relatively unknown terrain, autonomous systems will need to constantly sense and perceive various aspects of their local environment. An autonomous agent will require range and motion detection capabilities to build an internal representation of the environment for use in mission planning. Conventionally, shape-from-binocular stereo has been a popular, yet compute-intensive and hence slow, technique for detecting range using passive sensors. In this paper, we present a pipeline architecture that performs correlation-based stereo matching and motion detection in near real-time. The system has been implemented using DATACUBE image-processing boards and can match 256 × 256 pixel stereo image pairs in one second using a search range of 64 pixels. It has been tested on various indoor and outdoor images with generally successful results and is being used for obstacle detection in our work on autonomous navigation. The system demonstrates the feasibility, in the near future, of real-time stereo matching using miniaturized hardware that fits inside a vehicle. After discussing our approach, we present results on real images and describe the application of this work to autonomous navigation research.
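The abstract's correlation-based stereo matching can be illustrated with a minimal sketch. The function below is a hypothetical software analogue, not the paper's DATACUBE pipeline: for each pixel in the left image it searches up to a fixed disparity range along the same scanline in the right image and picks the block with the lowest sum-of-absolute-differences cost. The function name, block size, and SAD cost are illustrative assumptions; the 64-pixel search range mirrors the figure quoted in the abstract.

```python
import numpy as np

def block_match_disparity(left, right, block=7, max_disp=64):
    """Correlation-based stereo matching via block matching (illustrative sketch).

    For each pixel in the left image, search up to `max_disp` pixels along
    the same row in the right image and keep the disparity whose block has
    the lowest sum-of-absolute-differences (SAD) cost.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            # Reference block around (y, x) in the left image.
            ref = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
            best_cost, best_d = None, 0
            # Candidate blocks shifted left by d in the right image.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                cost = np.abs(ref - cand).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

The brute-force loops here make the abstract's point concrete: the search is embarrassingly parallel per pixel and per disparity, which is why a hardware pipeline can reach near real-time rates that a sequential implementation cannot.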
Ali E. Kayaalp,
"Towards real-time binocular stereo range and motion detection", Proc. SPIE 1295, Real-Time Image Processing II, (1 September 1990); doi: 10.1117/12.21243; https://doi.org/10.1117/12.21243