Improving an optical target tracking system's ability to reject clutter and noise remains a subject deserving intensive research. This paper presents a new target tracking method that exploits the resemblance between subareas of successive frames in an image sequence. At a proper sampling rate, the target's projection on the focal plane of the optical/IR sensor changes only slightly from frame to frame. By computing the correlation between subareas of consecutive images and searching for the location where the maximum correlation value occurs, the shift in target position between frames can be estimated; this is how the traditional matched filter tracker (MFT) works. In the new method, a mapping domain is generated, from which the target's new position is located by simply selecting the maximum value. Although it demands more computing power from the tracker's processor, the new method is promising in that it is more robust in strongly cluttered environments and tolerates panning and rolling of the sensor (camera). A series of experiments was carried out to compare the performance of the new method with that of the MFT. The method for generating the mapping domain is described in detail, and experimental results are presented that demonstrate that, compared with the traditional MFT, the new tracker is more robust when the scene is unstable.
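As context for the correlation-based baseline described above, the following is a minimal sketch of the MFT principle: cut a subarea (template) from the previous frame, slide it over the current frame, and take the location of the maximum normalized correlation as the target's new position. This is a generic NumPy illustration, not the paper's mapping-domain method; the function name, the synthetic frames, and all parameters are assumptions for demonstration only.

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame, top, left, h, w):
    """Sketch of matched-filter tracking: estimate the inter-frame shift
    of a target by normalized cross-correlation of a subarea.

    A template of size (h, w) is cut from prev_frame at (top, left) and
    slid over curr_frame; the position of the maximum correlation value
    gives the target's new location (illustrative only, not the paper's
    mapping-domain method).
    """
    template = prev_frame[top:top + h, left:left + w].astype(float)
    template -= template.mean()
    best_score, best_pos = -np.inf, (top, left)
    H, W = curr_frame.shape
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            window = curr_frame[r:r + h, c:c + w].astype(float)
            window -= window.mean()
            denom = np.sqrt((template ** 2).sum() * (window ** 2).sum())
            if denom == 0:  # skip flat windows with no signal
                continue
            score = (template * window).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    # Shift = new position of the subarea minus its old position
    return best_pos[0] - top, best_pos[1] - left

# Hypothetical synthetic frames: a bright blob moves 2 px down, 3 px right
prev = np.zeros((32, 32)); prev[10:14, 10:14] = 1.0
curr = np.zeros((32, 32)); curr[12:16, 13:17] = 1.0
print(estimate_shift(prev, curr, 8, 8, 8, 8))  # → (2, 3)
```

A brute-force search like this is what makes the MFT computationally heavy; practical trackers restrict the search to a window around the previous position or compute the correlation in the frequency domain.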