Today, assistive robots are being introduced into human environments at an increasing rate. Human environments are highly cluttered and dynamic, making it difficult to foresee all necessary capabilities and to pre-program all desirable future skills of the robot. One approach to increasing robot performance is semi-autonomous operation, allowing users to intervene and guide the robot through difficult tasks. To this end, robots need intuitive Human-Machine Interfaces (HMIs) that support fine motion control without overwhelming the operator. In this study, we evaluate the performance of several interfaces that balance autonomy and teleoperation of a mobile manipulator accomplishing several household tasks. Our proposed HMI framework includes teleoperation devices such as a tablet, as well as physical interfaces in the form of piezoresistive pressure sensor arrays. Mobile manipulation experiments were performed with a sensorized KUKA youBot, an omnidirectional platform with a 5-degree-of-freedom (DOF) arm. The pick-and-place tasks involved navigation and manipulation of objects in household environments. Performance metrics included task completion time and position accuracy.
In recent years, advances in computer vision, motion planning, and task-oriented algorithms, together with the increasing availability and falling cost of sensors, have opened the door to affordable autonomous robots tailored to assist individual humans. One of the main tasks of a personal robot is to provide intuitive, non-intrusive assistance when requested by the user. However, some base robotic platforms cannot perform autonomous tasks or be operated by general users because of their complex controls. Most users expect a robot to offer an intuitive interface that lets them directly control the platform while also giving them access to some level of task autonomy. We aim to introduce this combination of intuitive control and autonomy into teleoperated robotics.
This paper proposes a simple sensor-based HMI framework in which a base teleoperated robotic platform is sensorized, enabling basic autonomous tasks and providing a foundation for new intuitive interfaces. Multiple forms of HMIs are presented, and a software architecture is proposed. As test cases for the framework, manipulation experiments were performed on a sensorized KUKA youBot® platform, mobility experiments were performed on a LABO-3 Neptune platform, and a Nexus 10 tablet was used with multiple users to examine the robot's ability to adapt to its environment and to its user.
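To illustrate how a pressure sensor array could serve as a physical interface for driving the mobile base, the following is a minimal hypothetical sketch (not the paper's actual implementation; the function name, array layout, and gains are assumptions for illustration). It maps a normalized pressure-array frame to a planar velocity command that points from the array center toward the pressure centroid, with a firmer press producing a faster command:

```python
import numpy as np

def pressure_to_velocity(frame, v_max=0.2, threshold=0.05):
    """Map a normalized pressure-array frame (values in [0, 1]) to a
    planar base velocity command (vx, vy) in m/s.

    Hypothetical mapping: the command points from the array center
    toward the pressure centroid; readings below `threshold` are
    treated as sensor noise and suppressed.
    """
    frame = np.asarray(frame, dtype=float)
    frame = np.where(frame < threshold, 0.0, frame)  # reject noise
    total = frame.sum()
    if total == 0.0:
        return 0.0, 0.0  # no contact: command a stop

    rows, cols = frame.shape
    # Pressure centroid in coordinates normalized to [-0.5, 0.5],
    # with (0, 0) at the center of the array.
    ys, xs = np.mgrid[0:rows, 0:cols]
    cx = (xs * frame).sum() / total / (cols - 1) - 0.5
    cy = (ys * frame).sum() / total / (rows - 1) - 0.5

    # Scale speed with mean pressure so a firmer press moves faster,
    # saturating at v_max.
    gain = v_max * min(1.0, frame.mean() * 4)
    # +x is rightward on the array; rows increase downward, so negate cy
    # to make pressing the top of the array drive forward.
    return gain * 2 * cx, gain * -2 * cy
```

For example, pressing firmly on the right edge of a 4x4 array yields a positive `vx` and near-zero `vy`, i.e. a sideways translation of the omnidirectional base; releasing the array commands a stop.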