In target tracking applications, sensor ego-motion or fast target movement often causes the target to move temporarily out of the field of view, giving rise to the reappearing-target detection problem. Because the target leaves the current frame and re-enters in a later frame, the re-entry location and the variations in rotation, scale, and other three-dimensional orientations of the target are unknown, which complicates the detection and tracking of reappearing targets. A new training-based target detection algorithm has been developed using tuned basis functions (TBFs). The detection algorithm uses target and background information, extracted from training samples, to detect candidate target images. The detected candidates are then passed to a second algorithm, called the clutter rejection module, which determines the frame in which the target re-enters and its location within that frame. The clutter rejection module has been designed using a spatial-domain, correlation-based template matching (TM) technique. If the target re-enters the current frame, its coordinates are detected and the tracking algorithm is initiated. The performance of the proposed TBF-TM-based reappearing-target detection algorithm has been tested on real-world forward-looking infrared video sequences.
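The abstract does not give the details of the clutter rejection module, but the spatial-domain, correlation-based template matching it names can be illustrated with a minimal sketch. The following code is an assumption of the general technique, not the authors' implementation: it slides a target template over a frame and scores each position with normalized cross-correlation, reporting the best match as the candidate re-entry location.

```python
import numpy as np

def ncc_match(frame, template):
    """Slide `template` over `frame` and return the (row, col) position
    with the highest normalized cross-correlation score, plus that score.
    A brute-force illustration of spatial-domain template matching; real
    systems would use an optimized routine (e.g. OpenCV's matchTemplate)."""
    th, tw = template.shape
    t = template - template.mean()          # zero-mean template
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -1.0, (0, 0)
    H, W = frame.shape
    for r in range(H - th + 1):
        for c in range(W - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = patch - patch.mean()        # zero-mean image patch
            denom = np.linalg.norm(p) * t_norm
            if denom == 0:                  # flat patch: correlation undefined
                continue
            score = float((p * t).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy example: a bright 3x3 blob embedded at row 4, column 5 of a 12x12 frame.
frame = np.zeros((12, 12))
blob = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
frame[4:7, 5:8] = blob
pos, score = ncc_match(frame, blob)
print(pos, round(score, 3))  # -> (4, 5) 1.0
```

In a reappearing-target pipeline, a score below a chosen threshold at every position would indicate the target has not yet re-entered the frame, so the search continues on the next frame.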