Autocorrelated correlation filter for visual tracking
Weichun Liu, Dongdong Li, Xiaoan Tang
Abstract
Recently, discriminative correlation filters (DCFs) have achieved enormous popularity in the tracking community due to their high efficiency and fair robustness. By exploiting a circular structure, DCFs transform computationally expensive spatial correlation into efficient element-wise operations in the Fourier domain. We argue that this element-wise solution can be derived only in the case of single-channel features. When tracking with multichannel features, the element-wise solution trains each feature dimension independently and fails to learn a joint correlation filter. To tackle this problem, we propose a rigorous closed-form solution for correlation filter tracking, which can be computed pixel by pixel from a small system of linear equations. Experimental results demonstrate that our rigorous pixel-wise solution achieves better tracking performance than the baseline element-wise solution.
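As a concrete illustration (not the authors' released code), the NumPy sketch below contrasts the two formulations described in the abstract: the baseline element-wise closed form, which solves each of the D feature channels independently, and a joint multichannel filter obtained by solving a small D x D linear system at every Fourier bin. The regularization weight lam, the Gaussian label y, and the conjugation convention (per-bin score h^H z) are assumptions made for the sketch.

import numpy as np

def train_elementwise(features, y, lam=1e-2):
    # Baseline element-wise DCF: each of the D feature channels gets its own
    # single-channel closed-form filter in the Fourier domain.
    # features: (H, W, D) real array; y: (H, W) desired (e.g., Gaussian) response.
    X = np.fft.fft2(features, axes=(0, 1))           # per-channel spectra, (H, W, D)
    Y = np.fft.fft2(y)[..., None]                    # label spectrum, (H, W, 1)
    return X * np.conj(Y) / (np.abs(X) ** 2 + lam)   # independent per-channel filters

def train_pixelwise(features, y, lam=1e-2):
    # Joint multichannel filter: at every Fourier bin, solve the small D x D
    # system (x x^H + lam I) h = x conj(y), which couples all channels.
    X = np.fft.fft2(features, axes=(0, 1))
    Y = np.fft.fft2(y)
    H, W, D = X.shape
    H_hat = np.zeros_like(X)
    I = np.eye(D)
    for u in range(H):
        for v in range(W):
            x = X[u, v][:, None]                     # (D, 1) channel spectrum at this bin
            A = x @ x.conj().T + lam * I             # D x D regularized normal equations
            b = (x * np.conj(Y[u, v])).ravel()
            H_hat[u, v] = np.linalg.solve(A, b)
    return H_hat

def response(H_hat, features):
    # Spatial correlation response of a search patch under the per-bin
    # score convention h^H z, summed over channels.
    Z = np.fft.fft2(features, axes=(0, 1))
    return np.real(np.fft.ifft2(np.sum(np.conj(H_hat) * Z, axis=-1)))

In this sketch the element-wise variant ignores cross-channel correlations, whereas the pixel-wise variant learns all channels jointly at the cost of one small linear solve per Fourier bin.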
© 2019 SPIE and IS&T 1017-9909/2019/$25.00
Weichun Liu, Dongdong Li, and Xiaoan Tang "Autocorrelated correlation filter for visual tracking," Journal of Electronic Imaging 28(3), 033038 (28 June 2019). https://doi.org/10.1117/1.JEI.28.3.033038
Received: 8 March 2019; Accepted: 7 June 2019; Published: 28 June 2019
KEYWORDS
Image filtering
Electronic filtering
Optical tracking
Lithium
Computing systems
Linear filtering
Video