Robust leader tracking from an unmanned ground vehicle
Abstract
While many leader-follower technologies for robotic mules have been developed in recent years, the problem of reliably tracking and re-acquiring a human leader through cluttered environments continues to pose a challenge to widespread acceptance of these systems. Recent approaches to leader tracking rely on leader-worn equipment such as radio transmitters or special clothing, which may be damaged, hidden from view, or lost, or on specialized sensing hardware such as high-resolution LIDAR. We present a vision-based approach for robustly tracking a leader using a simple monocular camera. The proposed method requires no modification to the leader’s equipment, nor any specialized sensors on board the host platform. The system learns a discriminative model of the leader’s appearance to robustly track him or her through long occlusions, changing lighting conditions, and cluttered environments. We demonstrate the system’s tracking capabilities on publicly available benchmark datasets, as well as in representative scenarios captured using a small unmanned ground vehicle (SUGV).
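The paper itself details the authors' tracking algorithm; as a rough, hypothetical illustration of the general tracking-by-detection pattern the abstract alludes to (an online discriminative appearance model updated frame by frame from a monocular camera, with re-acquisition after occlusion), the sketch below uses OpenCV's MIL tracker. The video path, initial bounding box, and the choice of the MIL tracker are assumptions for illustration only, not the authors' implementation.

```python
# Illustrative sketch only: generic online discriminative tracking-by-detection
# using OpenCV's MIL tracker. This is NOT the paper's algorithm; it shows the
# broad pattern of learning an appearance model online and updating a bounding
# box frame by frame from a monocular video stream.
import cv2

VIDEO_PATH = "leader_walk.mp4"       # hypothetical input clip
INIT_BBOX = (320, 180, 80, 200)      # hypothetical (x, y, w, h) around the leader

cap = cv2.VideoCapture(VIDEO_PATH)
ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read first frame")

# Initialize an online discriminative tracker on the leader's bounding box.
tracker = cv2.TrackerMIL_create()
tracker.init(frame, INIT_BBOX)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)  # appearance model is refined online
    if found:
        x, y, w, h = map(int, bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    else:
        # In a full leader-following system, a re-acquisition step (e.g. a
        # detector pass over the whole frame) would run here after occlusion.
        cv2.putText(frame, "leader lost", (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 255), 2)
    cv2.imshow("leader tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:      # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```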
Camille S. Monnier, Stan German, Andrey Ostapchenko, "Robust leader tracking from an unmanned ground vehicle", Proc. SPIE 8741, Unmanned Systems Technology XV, 87410D (17 May 2013); https://doi.org/10.1117/12.2016013