Spatial Reasoning In The Treatment Of Systematic Sensor Errors
5 January 1989
Proceedings Volume 1003, Sensor Fusion: Spatial Reasoning and Scene Interpretation; (1989) https://doi.org/10.1117/12.948948
Event: 1988 Cambridge Symposium on Advances in Intelligent Robotics Systems, 1988, Boston, MA, United States
Abstract
Systematic errors can occur in processing the ultrasonic and visual sensor data acquired by mobile robots. The sonar errors include false echoes and distortions in size and surface orientation due to the beam resolution. The vision errors include, among others, ambiguities in discriminating depth discontinuities from intensity gradients generated by variations in surface brightness. In this paper we present a methodology for the removal of systematic errors using data from the sonar sensor domain to guide the processing of information in the vision domain, and vice versa. During the sonar data processing, some errors are removed from 2D navigation maps through pattern analyses and consistent-labelling conditions, using spatial reasoning about the sonar beam and object characteristics; others are removed using visual information. In the vision data processing, vertical edge segments are extracted using a Canny-like algorithm and labelled. Object edge features are then constructed from the segments using statistical and spatial analyses: a least-squares method is used during the statistical analysis, and sonar range data are used in the spatial analysis.
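The least-squares step of the statistical analysis can be sketched as fitting each extracted edge segment with a straight line. A minimal illustration follows; the function name, the parameterization x = a·y + b (chosen because the segments of interest are near-vertical), and the data are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): least-squares fit of a
# near-vertical edge segment as x = a*y + b from its pixel coordinates.

def fit_vertical_segment(pixels):
    """Fit x = a*y + b to (x, y) edge pixels by least squares.

    Solving for x as a function of y keeps the fit well-conditioned
    for near-vertical segments, where fitting y = m*x + c would fail.
    """
    n = len(pixels)
    sx = sum(x for x, _ in pixels)
    sy = sum(y for _, y in pixels)
    syy = sum(y * y for _, y in pixels)
    sxy = sum(x * y for x, y in pixels)
    denom = n * syy - sy * sy  # zero only if all pixels share one row
    a = (n * sxy - sy * sx) / denom
    b = (sx - a * sy) / n
    return a, b

# Example: pixels lying on the exactly vertical line x = 10
pixels = [(10, y) for y in range(5)]
a, b = fit_vertical_segment(pixels)
# a is ~0 (no tilt) and b is ~10 (the column of the edge)
```

The fitted slope and intercept give the segment a compact geometric description that can then be checked against sonar range data in the spatial analysis.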
© (1989) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Martin Beckerman, Judson P. Jones, Reinhold C. Mann, Leslie A. Farkas, Stephen E. Johnston, "Spatial Reasoning In The Treatment Of Systematic Sensor Errors", Proc. SPIE 1003, Sensor Fusion: Spatial Reasoning and Scene Interpretation, (5 January 1989); doi: 10.1117/12.948948; https://doi.org/10.1117/12.948948
Proceedings paper, 12 pages