22 May 2013 Low frame rate video target localization and tracking testbed
Traditional tracking frameworks are challenged by low-frame-rate video, because a target's appearance and location may change considerably between consecutive frames. This paper presents a saliency-based temporal association dependency (STAD) framework for such low-frame-rate scenarios and demonstrates good results on our robot testbed. We first use a median filter to create a background model of the scene, then apply background subtraction to each new frame to find the rough position of the target. With the help of markers on the robots, we use a gradient voting algorithm to detect high responses along the robots' orientations. Finally, template matching with branch pruning yields a finer estimate of each robot's pose. To stabilize the tracking-by-detection framework, we further introduce temporal constraints that use previously detected results together with an association technique. Our experiments show that the method achieves very stable tracking and outperforms state-of-the-art trackers such as Meanshift, Online-AdaBoosting, Multiple-Instance-Learning, and Tracking-Learning-Detection. We also demonstrate that the algorithm provides a near real-time solution under the low-frame-rate requirement.
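The first stage of the pipeline (median-filter background estimation followed by background subtraction) can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation; the function names, the frame stack, and the threshold value are all assumptions for the example.

```python
import numpy as np

def median_background(frames):
    """Estimate a static background as the per-pixel median over a stack of frames.

    The median is robust to a target that occupies any given pixel in only a
    minority of frames, so the moving robots are suppressed in the result.
    """
    return np.median(np.stack(frames, axis=0), axis=0)

def rough_target_mask(frame, background, thresh=30):
    """Background subtraction: flag pixels that differ from the background
    by more than `thresh` as likely foreground (the rough target region).
    The threshold of 30 gray levels is an illustrative choice."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return diff > thresh
```

In the paper's setting, the connected foreground region from this mask gives only a coarse target location; the marker-based gradient voting and template matching stages then refine the robot's pose.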
© (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yu Pang, Dan Shen, Genshe Chen, Pengpeng Liang, Khanh Pham, Erik Blasch, Zhonghai Wang, and Haibin Ling "Low frame rate video target localization and tracking testbed", Proc. SPIE 8742, Ground/Air Multisensor Interoperability, Integration, and Networking for Persistent ISR IV, 87420T (22 May 2013);
