In recent years, advances in computer vision, motion planning, and task-oriented algorithms, together with the growing availability and falling cost of sensors, have opened the door to affordable autonomous robots tailored to assist individual humans. One of the main tasks for a personal robot is to provide intuitive, non-intrusive assistance when requested by the user. However, some base robotic platforms cannot perform autonomous tasks, and their complex controls prevent general users from operating them. Most users expect a robot to offer an intuitive interface that lets them directly control the platform while also giving them access to some level of autonomous tasks. We aim to introduce this level of intuitive control and autonomy into teleoperated robotics. This paper proposes a simple sensor-based HMI (Human-Machine Interface) framework in which a base teleoperated robotic platform is sensorized, enabling basic autonomous tasks and providing a foundation for new intuitive interfaces. Multiple forms of HMIs are presented and a software architecture is proposed. As test cases for the framework, manipulation experiments were performed on a sensorized KUKA YouBot® platform, mobility experiments were performed on a LABO-3 Neptune platform, and a Nexus 10 tablet was used with multiple users to examine the robot's ability to adapt to its environment and to its user.
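The abstract does not give implementation details, so the following is only a minimal sketch of the layered idea it describes: direct teleoperation commands routed through a controller that layers a simple autonomous behavior on top of a sensorized base. All class names, method names, and the obstacle-stop behavior are illustrative assumptions, not the paper's actual framework or API.

```python
# Minimal sketch, assuming a hypothetical sensorized base and HMI layer.
# Nothing here reflects the paper's real software architecture.

from dataclasses import dataclass


@dataclass
class SensorReading:
    """A single range reading from the platform's added sensors (meters)."""
    front_distance_m: float


class BasePlatform:
    """Stands in for a teleoperated base such as the LABO-3 Neptune."""

    def drive(self, linear: float, angular: float) -> None:
        print(f"drive linear={linear:.2f} m/s angular={angular:.2f} rad/s")

    def read_sensors(self) -> SensorReading:
        # Hypothetical: a real platform would poll its retrofitted sensors here.
        return SensorReading(front_distance_m=0.4)


class HMIController:
    """Routes user input to the base, adding one basic autonomous task."""

    def __init__(self, platform: BasePlatform, stop_distance_m: float = 0.5):
        self.platform = platform
        self.stop_distance_m = stop_distance_m

    def command(self, linear: float, angular: float) -> None:
        reading = self.platform.read_sensors()
        # Autonomy layered over direct control: halt forward motion when
        # the sensors report an obstacle closer than the threshold.
        if linear > 0 and reading.front_distance_m < self.stop_distance_m:
            linear = 0.0
        self.platform.drive(linear, angular)


if __name__ == "__main__":
    hmi = HMIController(BasePlatform())
    hmi.command(linear=0.8, angular=0.0)  # blocked: obstacle at 0.4 m < 0.5 m
```

The point of the sketch is the separation of concerns the abstract implies: the interface (tablet, joystick, or other HMI) only issues commands, while the sensorized platform layer decides when autonomous behavior should override them.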
Keywords: Sensors, Robots, Robotics, Control systems, Human-machine interfaces, Neptune, Tablets