Estimating position and orientation of planar objects via neural networks: results from simulations
1 November 1992
Proceedings Volume 1828, Sensor Fusion V; (1992) https://doi.org/10.1117/12.131660
Event: Applications in Optical Science and Engineering, 1992, Boston, MA, United States
Abstract
For sensor-guided robotic tasks, a calibration procedure must be performed to determine the relationship between information in sensor coordinates and the position and orientation, in robot coordinates, of the parts to be manipulated. This paper reports on the first stage of research and development of a straightforward approach to the calibration problem, in which the robot 'calibrates' the sensors by performing a series of known, carefully chosen manipulations under observation of the sensors. The data from calibration represent a mapping that relates changes in feature location in sensor coordinates to changes in part position and orientation in robot coordinates. Calibration is completed by solving for the best-fit transformation representing this relationship. In each cycle of the production process, sensor data for the presented part are operated on by the calibration transformation to determine the position and orientation of the grasped part. The key to this procedure of direct calibration is obtaining from the calibration data the best-fit mapping relating changes in feature location in sensor coordinates to changes in part position and orientation in robot coordinates. Simulations were conducted using a simple three-layer artificial neural network to process data from multiple distance sensors and predict changes in position and orientation of a windshield-sized rectangular body. In these simulations, two approaches to supervised learning were used for network training during calibration. In production, the network must be iteratively inverted to predict the location of the body from sensor data. Results from these preliminary simulations were encouraging: using data from only four sensor units, changes in position and orientation of the rectangular body were estimated to within reasonable accuracy for planar part-presentation perturbations spanning an envelope of ±50 mm and ±10 degrees.
Sources of error and the effects of the different training methods on performance of the network are discussed.
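The direct-calibration scheme summarized in the abstract can be sketched in a few dozen lines: generate calibration poses inside the stated ±50 mm, ±10 degree envelope, train a three-layer network to map pose perturbations to sensor readings, then "iteratively invert" the trained network in production by gradient descent in pose space. The sensor model, network size, and training settings below are illustrative assumptions for a minimal sketch; the paper's actual sensor geometry, training methods, and inversion procedure are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model standing in for the paper's four distance
# sensors: maps a planar pose perturbation (dx [mm], dy [mm], dtheta [rad])
# of a rectangular body to four scalar readings. The geometry is an
# assumption for illustration only.
def sensor_model(pose):
    dx, dy, th = pose[..., 0], pose[..., 1], pose[..., 2]
    return np.stack([dx * np.cos(th) + dy,
                     dx - dy * np.sin(th),
                     dx + dy + 100.0 * th,
                     0.01 * dx * dy + 100.0 * th], axis=-1)

# Calibration data: perturbations within the +/-50 mm, +/-10 degree envelope.
N = 2000
lo = np.array([-50.0, -50.0, -np.deg2rad(10)])
hi = -lo
pose = rng.uniform(lo, hi, (N, 3))
readings = sensor_model(pose)

# Scale both sides to roughly [-1, 1] so the tanh hidden layer stays active.
x_scale = hi
y_scale = np.abs(readings).max(axis=0)
xs, ys = pose / x_scale, readings / y_scale

# Three-layer network (3 inputs -> 16 tanh hidden units -> 4 outputs),
# trained by plain full-batch gradient descent on squared error.
W1 = rng.normal(0.0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 4)); b2 = np.zeros(4)

def net(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

losses = []
lr = 0.05
for _ in range(3000):
    h, pred = net(xs)
    err = pred - ys
    losses.append(float((err ** 2).mean()))
    dh = (err @ W2.T) * (1.0 - h ** 2)          # backprop through tanh
    W2 -= lr * (h.T @ err) / N; b2 -= lr * err.mean(axis=0)
    W1 -= lr * (xs.T @ dh) / N; b1 -= lr * dh.mean(axis=0)

# "Production" step: iteratively invert the trained network, i.e. gradient
# descent in (scaled) pose space until predicted readings match measured ones.
def invert(target, steps=2000, step=0.01):
    x = np.zeros(3)                              # start from the nominal pose
    for _ in range(steps):
        h = np.tanh(x @ W1 + b1)
        r = h @ W2 + b2 - target                 # sensor-space residual
        x -= step * (W1 @ ((r @ W2.T) * (1.0 - h ** 2)))
    return x
```

In a production cycle, the measured readings would be divided by `y_scale`, passed to `invert`, and the result multiplied by `x_scale` to recover the pose perturbation in millimeters and radians.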
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
William R. Murray and Christopher T. Heg "Estimating position and orientation of planar objects via neural networks: results from simulations", Proc. SPIE 1828, Sensor Fusion V, (1 November 1992); https://doi.org/10.1117/12.131660
PROCEEDINGS
13 PAGES

