9 December 2015 Robust visual tracking with dual spatio-temporal context trackers
Proceedings Volume 9817, Seventh International Conference on Graphic and Image Processing (ICGIP 2015); 98170X (2015) https://doi.org/10.1117/12.2228015
Event: Seventh International Conference on Graphic and Image Processing, 2015, Singapore, Singapore
Abstract
Visual tracking is a challenging problem in computer vision. In recent years, a significant number of trackers have been proposed. Among these, tracking with dense spatio-temporal context has been shown to be an efficient and accurate method. Unlike trackers with online-trained classifiers, which struggle to meet the requirements of real-time tracking, a tracker with spatio-temporal context can run at hundreds of frames per second using the Fast Fourier Transform (FFT). Nevertheless, the performance of the spatio-temporal context tracker relies heavily on the learning rate of the context model, which restricts the robustness of the tracker.
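The learning-rate dependence described above can be illustrated with a minimal sketch of the spatio-temporal context update and the FFT-based confidence map. This is a hypothetical illustration, not the authors' implementation; the names `H_stc`, `h_sc`, and `rho` are assumptions chosen for readability.

```python
import numpy as np

def update_context_model(H_stc, h_sc, rho):
    """Exponential moving-average update of the spatio-temporal context
    model H_stc with the spatial context model h_sc learned from the
    current frame. rho is the learning rate: a high rho adapts quickly
    to appearance change, while a low rho preserves a long memory of
    past appearance (the trade-off the paper addresses)."""
    return (1.0 - rho) * H_stc + rho * h_sc

def confidence_map(H_stc, context_prior):
    """Tracking response computed in the frequency domain: element-wise
    multiplication of FFTs corresponds to circular convolution in the
    spatial domain, which is what makes the tracker fast."""
    return np.real(np.fft.ifft2(np.fft.fft2(H_stc) * np.fft.fft2(context_prior)))
```

With `rho = 0` the model never adapts; with `rho = 1` it is replaced every frame, so the choice of `rho` directly controls the memory/adaptivity trade-off.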

In this paper, we propose a tracking method with dual spatio-temporal context trackers that use different learning rates during tracking. The tracker with a high learning rate can track the target smoothly as its appearance changes, while the tracker with a low learning rate can detect occlusions and continue tracking when the target re-emerges. To select the target among the candidates produced by the two trackers, we adopt the Normalized Correlation Coefficient (NCC) to evaluate the confidence of each sample. Experimental results show that the proposed algorithm performs robustly compared with several state-of-the-art tracking methods.
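The candidate-selection step can be sketched as follows. This is a minimal illustration of NCC-based confidence scoring, assuming each tracker proposes one candidate patch and the target template is a same-sized grayscale array; the function names are assumptions, not the paper's code.

```python
import numpy as np

def ncc(patch, template):
    """Normalized Correlation Coefficient between a candidate patch and
    the target template. Returns a value in [-1, 1]; 1 means a perfect
    (up to affine intensity change) match."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def select_candidate(candidates, template):
    """Score each candidate (one per tracker) with NCC and return the
    index of the most confident one together with all scores."""
    scores = [ncc(c, template) for c in candidates]
    return int(np.argmax(scores)), scores
```

Because NCC is invariant to affine intensity changes, it gives a comparable confidence score for the high- and low-learning-rate candidates even under illumination variation.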
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Shiyan Sun, Hong Zhang, Ding Yuan, "Robust visual tracking with dual spatio-temporal context trackers," Proc. SPIE 9817, Seventh International Conference on Graphic and Image Processing (ICGIP 2015), 98170X (9 December 2015); https://doi.org/10.1117/12.2228015