Feature points and object edges are two kinds of primitives frequently used in target tracking algorithms.
Feature points can be localized easily in an image, and their correspondences across images can be established accurately.
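As an illustration of point localization, a corner response map in the style of the Harris detector can be computed directly from image gradients. This is a generic sketch, not the specific detector used in our tracker; the window size and the constant `k` are conventional illustrative defaults:

```python
import numpy as np

def harris_response(img, k=0.04, window=3):
    """Harris-style corner response map for a 2-D grayscale image.

    Illustrative only: the choice of detector, `k`, and `window`
    are assumptions, not the paper's exact configuration.
    """
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box_sum(a):
        # Sum over a (window x window) neighbourhood via shifted adds.
        pad = window // 2
        ap = np.pad(a, pad)
        out = np.zeros_like(a)
        for dy in range(window):
            for dx in range(window):
                out += ap[dy:dy + a.shape[0], dx:dx + a.shape[1]]
        return out

    Sxx, Syy, Sxy = box_sum(Ixx), box_sum(Iyy), box_sum(Ixy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace

# Example: the response peaks near the corners of a bright square,
# while straight edges get a negative response.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
R = harris_response(img)
peak = np.unravel_index(np.argmax(R), R.shape)
```

Points found this way are easy to match across frames, which is what makes them attractive as tracking primitives.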
They also adapt well to wide-baseline transformations. However, feature points are not stable: they are fragile to
changes in illumination and viewpoint. In contrast, object edges remain stable under a very wide range of illumination
and viewpoint changes. Unfortunately, edge-based algorithms often fail on highly textured targets and in
clutter, which produce too many irrelevant edges. We found that edge-based and point-based tracking have
complementary failure modes. Based on this observation, we propose a novel tracking algorithm that fuses point and
edge features. Our algorithm first tracks the object by feature point matching, and then uses the transformation
parameters obtained in this step to initialize the edge tracking. In this way, our algorithm alleviates the
disturbance caused by irrelevant edges. We then apply a texture boundary detection algorithm to locate the precise object
boundary. Unlike conventional gradient-based edge detection, texture boundary detection
directly computes the most probable location of a texture boundary along each search line. It is therefore very fast and can be
incorporated into a real-time tracking algorithm. Experimental results show that our tracking algorithm achieves high
tracking accuracy and robustness.
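The texture boundary search can be sketched in one dimension: model the samples on either side of a candidate split as having different statistics, and pick the split that best explains the whole line. The sketch below uses per-side means as a simplified stand-in for the paper's texture statistics (an assumption for illustration, not the exact model); note that it computes the boundary location directly from the samples, with no gradient thresholding:

```python
import numpy as np

def boundary_along_line(samples, min_seg=2):
    """Most probable boundary index along a 1-D search line.

    Models each side of a candidate split as having its own mean
    intensity (a simplified stand-in for richer texture statistics)
    and returns the split index that minimises the total squared
    deviation over the line.
    """
    s = np.asarray(samples, dtype=float)
    n = len(s)
    best_i, best_cost = None, np.inf
    # Keep at least min_seg samples on each side of the split.
    for i in range(min_seg, n - min_seg + 1):
        left, right = s[:i], s[i:]
        cost = (((left - left.mean()) ** 2).sum()
                + ((right - right.mean()) ** 2).sum())
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i

# Example: a step from dark to bright texture at index 10.
line = np.concatenate([np.full(10, 0.2), np.full(10, 0.9)])
print(boundary_along_line(line))  # -> 10
```

Because the search is a single pass over each line, it fits naturally into a real-time loop once the point-matching stage has placed the search lines near the true boundary.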