Hand-eye coordination for grasping moving objects (1 April 1991)
Proceedings Volume 1383, Sensor Fusion III: 3D Perception and Recognition; (1991) https://doi.org/10.1117/12.25255
Event: Advances in Intelligent Robotics Systems, 1990, Boston, MA, United States
Most robotic grasping tasks assume a stationary object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination among at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, tracking must be performed dynamically so that the motion of the robotic arm can be coordinated with that of the object. The dynamic vision system feeds a real-time arm-control algorithm that plans a trajectory. The arm-control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the object's trajectory is tracked, the hand must intercept the object to actually grasp it. We present three different strategies for intercepting the object, along with results from the tracking algorithm.
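The filtering-and-prediction step can be illustrated with a simple alpha-beta tracker: noisy position measurements are smoothed into a position/velocity estimate, which is then extrapolated ahead in time to give the arm controller an interception point. This sketch is illustrative only; the function names, gain values, and 1-D simplification are assumptions, not the filter actually used in the paper.

```python
# Illustrative alpha-beta filter for the "filtering and prediction" step:
# smooth noisy position measurements and extrapolate the object's future
# position so the arm can move to an interception point.
# Gains and names are assumptions for this sketch, not the paper's.

def alpha_beta_track(measurements, dt, alpha=0.8, beta=0.4):
    """Return (position, velocity) estimates for a 1-D track."""
    x, v = measurements[0], 0.0          # initial state estimate
    states = []
    for z in measurements[1:]:
        x_pred = x + v * dt              # predict one step forward
        r = z - x_pred                   # innovation (measurement residual)
        x = x_pred + alpha * r           # correct position estimate
        v = v + (beta / dt) * r          # correct velocity estimate
        states.append((x, v))
    return states

def predict_ahead(x, v, horizon):
    """Extrapolate the track 'horizon' seconds ahead for interception."""
    return x + v * horizon

# Object moving at 0.5 units/s, sampled at 10 Hz:
dt = 0.1
zs = [0.5 * dt * k for k in range(20)]
states = alpha_beta_track(zs, dt)
x, v = states[-1]
intercept = predict_ahead(x, v, 1.0)     # where to send the hand in 1 s
```

In a real system the same recursion would run per axis on the 3-D positions produced by the vision system, with the predicted point fed to the kinematic transformation that converts it into arm joint angles.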
© (1991) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Peter K. Allen, Billibon Yoshimi, Alexander Timcenko, and Paul Michelman, "Hand-eye coordination for grasping moving objects", Proc. SPIE 1383, Sensor Fusion III: 3D Perception and Recognition, (1 April 1991); https://doi.org/10.1117/12.25255

