Position, rotation, scale, and orientation invariant object tracking from cluttered scenes (17 April 2006)
Proceedings Volume 6245, Optical Pattern Recognition XVII; 624508 (2006); doi: 10.1117/12.664048
Event: Defense and Security Symposium, 2006, Orlando (Kissimmee), Florida, United States
Abstract
A method of tracking objects in video sequences despite changes in position, scale, in-plane rotation, and out-of-plane orientation is demonstrated. Moving objects are initially segmented from the scene using a background subtraction method to minimize the search area of the filter. A variation on the Maximum Average Correlation Height (MACH) filter is used to create invariance to out-of-plane orientation while giving high tolerance to background clutter and noise. A log r-θ mapping is employed to give invariance to in-plane rotation and scale by transforming rotation and scale variations of the target object into vertical and horizontal shifts. The MACH filter is trained on the log r-θ map of the target for a range of orientations and applied sequentially over the regions of movement in successive video frames. Areas of movement producing a strong correlation response indicate an in-class target and can then be used to determine the position, in-plane rotation, and scale of the target objects in the scene and track them over successive frames.
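The key property exploited by the log r-θ mapping can be illustrated with a minimal numpy sketch (this is an illustrative nearest-neighbour resampler, not the authors' implementation; grid sizes and parameter names are assumptions). Rotating the input image appears as a cyclic shift of the rows of the map, and uniform scaling appears as a shift of the columns, which is what lets a shift-invariant correlation filter such as MACH absorb rotation and scale changes:

```python
import numpy as np

def log_polar(img, n_theta=64, n_r=64):
    """Resample a square image onto a log r-theta grid (nearest neighbour).

    In-plane rotation of the input becomes a cyclic shift along the
    theta (row) axis; uniform scaling becomes a shift along the
    log r (column) axis.
    """
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    radii = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_r))  # log-spaced radii
    angles = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = cy + radii[None, :] * np.sin(angles[:, None])
    xs = cx + radii[None, :] * np.cos(angles[:, None])
    yi = np.clip(np.round(ys).astype(int), 0, h - 1)
    xi = np.clip(np.round(xs).astype(int), 0, w - 1)
    return img[yi, xi]  # shape (n_theta, n_r): rows = angle, cols = log radius

# Demo: a 90-degree in-plane rotation of the image shows up as a cyclic
# shift of n_theta/4 rows in the log r-theta map.
rng = np.random.default_rng(0)
img = rng.random((65, 65))
lp = log_polar(img)
lp_rot = log_polar(np.rot90(img))
```

Because the rotation has become a pure row shift, `lp_rot` matches `np.roll(lp, -16, axis=0)` (up to nearest-neighbour rounding at a handful of sample points); a filter trained on `lp` therefore responds to the rotated target at a shifted correlation-peak location rather than missing it.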
© (2006) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Peter Bone, Rupert Young, Chris Chatwin, "Position, rotation, scale, and orientation invariant object tracking from cluttered scenes", Proc. SPIE 6245, Optical Pattern Recognition XVII, 624508 (17 April 2006); doi: 10.1117/12.664048; https://doi.org/10.1117/12.664048
Proceedings paper, 9 pages
KEYWORDS: Image filtering, Video, Distortion, Target detection, Filtering (signal processing), Optical filters, Electronic filtering