In this paper, the problem of combined vision/force servo control for robot manipulators is addressed. Three different robot vision control strategies, namely position-based, image-based, and hybrid control, are combined with an impedance-based force controller, and a comparison of the three resulting combined vision/force control methods is carried out for the first time in the context of a generic robot kinematic-based sensory-task-space control structure. Furthermore, the estimation of contact surface parameters is also investigated. Simulation results demonstrate that all of the above vision/force control strategies are comparable in terms of both dynamic response and the accuracy of positioning and force control.
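For reference, an impedance-based force controller of the kind mentioned above typically enforces a target dynamic relation between the end-effector motion error and the contact force. The following is the standard textbook form of that relation; the notation is generic and not taken from the paper itself:

```latex
% Standard target impedance relation (generic notation, assumed here):
%   \tilde{x} = x - x_d  is the end-effector position error,
%   M_d, B_d, K_d are the desired inertia, damping, and stiffness matrices,
%   F_{ext} is the measured contact force.
M_d\,\ddot{\tilde{x}} + B_d\,\dot{\tilde{x}} + K_d\,\tilde{x} = F_{ext},
\qquad \tilde{x} = x - x_d
```

Shaping M_d, B_d, and K_d lets the designer trade off positioning stiffness against compliance to contact forces, which is what makes such a controller a natural companion to each of the three vision control strategies.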
In this paper, a new open architecture for visual servo control tasks is presented. A Puma 560 robotic manipulator is used to prove the concept. The design enables hybrid force/visual servo control in an unstructured environment in different modes; it can also be controlled over the Internet in a teleoperation mode using a haptic device. The proposed structure consists of two major parts, hardware and software. The hardware comprises a master (host) computer, a slave (target) computer, a Puma 560 manipulator, a CCD camera, a force sensor, and a haptic device. Five DAQ cards interface the Puma 560 with the slave computer. An open architecture software package is developed using Matlab<sup>(R)</sup>, Simulink<sup>(R)</sup>, and the xPC Target toolbox. This package has the Hardware-In-the-Loop (HIL) property, i.e., it enables one to readily implement different configurations of force, visual, or hybrid control in real time. The implementation comprises the following stages. First, the retrofitting of the Puma 560 was carried out. Then a modular joint controller for the Puma 560 was realized in Simulink<sup>(R)</sup>. The force sensor driver and the force controller were implemented using <i>S-function</i> blocks of Simulink<sup>(R)</sup>. Visual images were captured through the Image Acquisition Toolbox of Matlab<sup>(R)</sup> and processed using the Image Processing Toolbox. A haptic device interface was also written in Simulink<sup>(R)</sup>. As a result, the setup can be readily reconfigured to accommodate other robotic manipulators and/or other sensors without the burden of external issues related to control, interfacing, and software, while providing flexibility in component modification.
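The hybrid force/visual servo control mode mentioned above is commonly formalized, following the classical hybrid position/force scheme, by partitioning the task space with a diagonal selection matrix. The sketch below uses generic notation and is an assumption about the standard formulation, not the paper's own equations:

```latex
% Hybrid vision/force task-space command (generic, assumed notation):
%   S   is a diagonal selection matrix (entries 0 or 1) choosing which
%       task directions are vision-controlled,
%   u_v is the visual servoing control action,
%   u_f is the force control action along the constrained directions,
%   J   is the manipulator Jacobian mapping task forces to joint torques.
\tau = J^{\top}\!\left[\, S\,u_v + (I - S)\,u_f \,\right]
```

In such a scheme the camera drives motion along the unconstrained surface directions while the force sensor regulates contact along the surface normal, which is why both sensors must be serviced in the same real-time loop, as the HIL package described above allows.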