Human-like object tracking and gaze estimation with PKD android
13 May 2016
Abstract
As robots are increasingly used for tasks that require human-robot interaction, it is vital that they exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking on the Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which advance robot capabilities for human communication. PKD’s ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interaction and to enable PKD to serve as a human gaze emulator in future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations into gaze information, serving two objectives: to evaluate the performance of the object tracking system for PKD, and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues from humans.
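The abstract describes converting eye tracking data and head orientations into world-frame gaze information. A minimal sketch of that kind of composition, assuming Z-Y-X Euler angles for the head pose and a unit eye-in-head gaze direction (the frame conventions, function names, and angle order here are illustrative assumptions, not details from the paper):

```python
# Hypothetical sketch: combine a head orientation (e.g., from motion
# capture) with an eye-in-head gaze direction (e.g., from mobile eye
# tracking) to obtain a gaze vector in the world frame.
# Frame conventions and names are assumptions for illustration only.
import numpy as np

def head_rotation(yaw, pitch, roll):
    """Rotation matrix for the head pose (Z-Y-X Euler angles, radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def world_gaze(eye_dir_head, yaw, pitch, roll):
    """Rotate a gaze direction from the head frame into the world frame."""
    v = np.asarray(eye_dir_head, dtype=float)
    v = v / np.linalg.norm(v)  # normalize to a unit direction
    return head_rotation(yaw, pitch, roll) @ v

# Example: eyes looking straight ahead in the head frame while the head
# is turned 90 degrees left gives a world gaze along the world's y-axis.
g = world_gaze([1.0, 0.0, 0.0], yaw=np.pi / 2, pitch=0.0, roll=0.0)
# g is approximately [0, 1, 0]
```

The resulting unit vector, together with a known position of a displayed target, is the kind of quantity one could compare against the robot's head posture when evaluating tracking performance.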
© (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Indika B. Wijayasinghe, Haylie L. Miller, Sumit K. Das, Nicoleta L. Bugnariu, and Dan O. Popa, "Human-like object tracking and gaze estimation with PKD android", Proc. SPIE 9859, Sensors for Next-Generation Robotics III, 985906 (13 May 2016); https://doi.org/10.1117/12.2224382
Proceedings paper, 14 pages

