Eye-hand relations for sensor placement and object location determination
20 August 1993
Proceedings Volume 2059, Sensor Fusion VI; (1993); doi: 10.1117/12.150269
Event: Optical Tools for Manufacturing and Advanced Automation, 1993, Boston, MA, United States
Abstract
The eye-on-hand configuration is an important way to build an active vision system. Most tasks that an eye-on-hand system can perform depend on an estimate of the eye-hand relation. Traditionally, the eye-hand relation is defined as a 3D-to-3D coordinate transformation. This definition treats the eye (camera) only as a coordinate frame, and it is useful for sensor placement. When this eye-hand relation is used for object location determination (after the sensor has been actively placed), it causes much larger errors than a more direct formulation, which in this situation defines the eye-hand relation as the 3D-to-2D perspective transformation between the last joint frame and the camera image plane. In this paper the meaning of the eye-hand relation is extended for different tasks: one relation for sensor placement and one for object location determination. We also present a new method for computing the eye-hand relations by making the last joint frame 'touchable.' We call it the direct method because specially designed motions of the robot arm are used to estimate the relation between the robot base frame and the world frame. When only the rotation matrix is obtained, moving the camera twice and calibrating the 3D camera pose at three stations yields the eye-hand relations. When the full transformation matrix is obtained, calibrating the camera at a single station suffices to solve for the eye-hand relations. Experimental results with real data are included. The advantages of the direct method are its efficiency, accuracy, and reproducibility.
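To make the distinction between the two definitions concrete, the following minimal numpy sketch contrasts them under assumed, hypothetical matrix values (T_hand_cam, T_base_hand, and K are all illustrative, not values from the paper): the 3D-to-3D relation composes with the hand pose to place the camera in the base frame, while the 3D-to-2D relation maps last-joint coordinates directly to image pixels.

import numpy as np

# --- Sensor placement: eye-hand relation as a 3D-to-3D transform ---
# T_hand_cam: hypothetical 4x4 homogeneous transform from the last-joint
# (hand) frame to the camera frame, i.e. the classical eye-hand relation.
T_hand_cam = np.array([
    [0.0, -1.0, 0.0, 0.05],
    [1.0,  0.0, 0.0, 0.00],
    [0.0,  0.0, 1.0, 0.10],
    [0.0,  0.0, 0.0, 1.00],
])

# T_base_hand would come from the robot's forward kinematics
# (illustrative values here).
T_base_hand = np.eye(4)
T_base_hand[:3, 3] = [0.4, 0.2, 0.6]

# Camera pose in the robot base frame: compose the two transforms.
T_base_cam = T_base_hand @ T_hand_cam

# --- Object location: eye-hand relation as a 3D-to-2D perspective map ---
# K: hypothetical pinhole intrinsic matrix.
K = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])

# The 3x4 perspective transform from the last-joint frame to the image
# plane folds the extrinsics (the inverse of T_hand_cam) into one matrix.
T_cam_hand = np.linalg.inv(T_hand_cam)
P = K @ T_cam_hand[:3, :]

# Project a point expressed in the last-joint frame straight to pixels.
X_hand = np.array([0.1, 0.0, 0.5, 1.0])   # homogeneous 3D point
u, v, w = P @ X_hand
print("pixel:", u / w, v / w)

Using P directly avoids first reconstructing the point in a 3D camera frame and then projecting it, which is the intermediate step where the abstract argues the 3D-to-3D definition accumulates larger errors.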
© 1993 Society of Photo-Optical Instrumentation Engineers (SPIE).
Xiang Wan and Guang-you Xu, "Eye-hand relations for sensor placement and object location determination," Proc. SPIE 2059, Sensor Fusion VI (20 August 1993); https://doi.org/10.1117/12.150269
Keywords: Cameras, Sensors, Calibration, Stereoscopic cameras, Sensor fusion, 3D image processing, Active vision
