14 December 2015 Effective 3D infrared image target tracking via framework of particle filter and FAST
Proceedings Volume 9812, MIPPR 2015: Automatic Target Recognition and Navigation; 981212 (2015) https://doi.org/10.1117/12.2209628
Event: Ninth International Symposium on Multispectral Image Processing and Pattern Recognition (MIPPR2015), 2015, Enshi, China
3D target tracking in infrared imagery is a challenging problem in computer vision. With recent progress in target tracking, keypoint-based tracking is widely applied in many fields, such as augmented reality, object retrieval, human-computer interaction, and medical imaging. In general, the aim of target tracking is to locate the target in subsequent frames. Because tracking is a fundamental research topic, a tracker should adapt to difficult conditions such as illumination variation, occlusion, and background clutter, so it is hard to handle every scenario with a single tracking approach. Moreover, in a particle filter framework the computational cost grows linearly with the number of sampled particles; it is therefore important to reduce computational cost while preserving good performance. For 3D targets, traditional tracking algorithms, such as template matching with fast normalized cross-correlation or HOG (Histogram of Oriented Gradients), perform poorly, mainly because their matching is neither scale- nor rotation-invariant. The mean-shift tracking algorithm searches for the target position using a probability density, but fails under abrupt luminance changes. Furthermore, feature selection is critical to achieving high accuracy, so computational cost can become the bottleneck of the tracking problem. This paper proposes an effective tracking framework based on FAST, BRIEF, and a particle filter (FBPF). Our approach can be divided into four steps: (1) an improved FAST algorithm built on a decision tree is adopted to extract keypoints; (2) the BRIEF descriptor is applied to speed up feature matching; (3) the affine transformation parameters are calculated from the matched features; (4) a particle filter is adopted to predict the best position of the target.
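The keypoint criterion behind step (1) can be sketched as the brute-force FAST segment test over a 16-pixel Bresenham circle; the decision-tree variant the paper adopts learns an equivalent but faster test from data. The function name, threshold `t`, and arc length `n` (here FAST-9) are illustrative assumptions, not the paper's settings.

```python
# Offsets of the 16-pixel Bresenham circle of radius 3 used by FAST,
# listed clockwise starting from the top of the circle.
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast_corner(img, x, y, t=20, n=9):
    """Segment test: (x, y) is a FAST corner if at least n contiguous circle
    pixels are all brighter than img[y][x] + t or all darker than img[y][x] - t.
    img is a list of rows; (x, y) must lie at least 3 pixels from the border."""
    c = img[y][x]
    # Classify each circle pixel: +1 brighter, -1 darker, 0 similar.
    labels = [1 if img[y + dy][x + dx] >= c + t
              else (-1 if img[y + dy][x + dx] <= c - t else 0)
              for dx, dy in CIRCLE]
    # Scan the circle twice so runs that wrap around the start are counted.
    run, prev = 0, 0
    for lab in labels + labels:
        run = run + 1 if (lab != 0 and lab == prev) else (1 if lab != 0 else 0)
        if run >= n:
            return True
        prev = lab
    return False
```

For example, the corner of a bright square region passes the test (a long darker arc surrounds it), while a point in a flat region does not.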
In the traditional FAST algorithm, a keypoint is defined as a pixel whose gray value differs sufficiently from enough contiguous pixels on a surrounding circle: in a grayscale image, a sufficient number of contiguous circle pixels must all be brighter or all darker than the candidate pixel. A circle of pixels around each candidate is usually examined. To reduce the cost of this test, we employ an improved FAST based on a decision tree, which learns the corner test by maximizing entropy. Using the binary descriptor BRIEF, the initial transformation matrix can be calculated efficiently. Then, to estimate the best position of the target, a particle filter is applied, with the affine parameters taken as the particle states. The particle filter provides a convenient framework for estimating and propagating the posterior probability density function of the state variables, regardless of the underlying distribution, through a sequence of prediction and update steps. Experimental results demonstrate that FBPF achieves state-of-the-art performance on challenging sequences. The main contribution of this paper is a new tracking framework based on FAST, BRIEF, and a particle filter (FBPF). First, we employ an improved FAST using a decision tree, with BRIEF descriptors to reduce time consumption. Second, we calculate the initial affine matrix from the feature matches. Finally, the particle filter is utilized to predict the best position.
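The prediction-and-update cycle described above can be sketched as a minimal bootstrap particle filter. For illustration only, the state here is a 2D target position driven by random-walk motion with a Gaussian observation likelihood; the paper's particles instead carry affine transformation parameters, and all function names and noise settings below are assumptions.

```python
import math
import random

def systematic_resample(particles, weights, rng):
    """Draw a new particle set with probability proportional to weight."""
    n = len(particles)
    step = 1.0 / n
    u = rng.random() * step
    new, cum, i = [], weights[0], 0
    for _ in range(n):
        while u > cum and i < n - 1:
            i += 1
            cum += weights[i]
        new.append(particles[i])
        u += step
    return new

def particle_filter_track(observations, n_particles=500,
                          motion_std=2.0, obs_std=3.0, seed=0):
    """Bootstrap particle filter over a 2D position state.
    observations: noisy (x, y) target measurements, one per frame.
    Returns the posterior-mean position estimate for each frame."""
    rng = random.Random(seed)
    x0, y0 = observations[0]
    # Initialise particles around the first measurement.
    particles = [(x0 + rng.gauss(0, obs_std), y0 + rng.gauss(0, obs_std))
                 for _ in range(n_particles)]
    estimates = []
    for zx, zy in observations:
        # Predict: propagate each particle with random-walk motion noise.
        particles = [(px + rng.gauss(0, motion_std), py + rng.gauss(0, motion_std))
                     for px, py in particles]
        # Update: weight each particle by the Gaussian likelihood of the measurement.
        weights = [math.exp(-((px - zx) ** 2 + (py - zy) ** 2) / (2 * obs_std ** 2))
                   for px, py in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate the state as the weighted (posterior) mean.
        ex = sum(w * px for w, (px, _) in zip(weights, particles))
        ey = sum(w * py for w, (_, py) in zip(weights, particles))
        estimates.append((ex, ey))
        # Resample to concentrate particles in high-likelihood regions.
        particles = systematic_resample(particles, weights, rng)
    return estimates
```

Swapping the 2D position for a vector of affine parameters, with a likelihood derived from keypoint-matching quality, yields the structure the abstract describes; resampling keeps the particle budget fixed, which is what makes the cost linear in the number of particles.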
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Zhengzheng Wu, Zhiguo Cao, Yang Xiao, "Effective 3D infrared image target tracking via framework of particle filter and FAST", Proc. SPIE 9812, MIPPR 2015: Automatic Target Recognition and Navigation, 981212 (14 December 2015); https://doi.org/10.1117/12.2209628
