23 September 2003
Adaptive recognition under static and dynamic environment assumptions
Abstract
This paper presents two approaches to automatic target recognition (ATR) by trainable algorithms. The first approach assumes that the measurements coming from the objects remain unchanged during the time that passes between the learning and recognition stages. For outdoor scenes this assumption is viable only when both learning and recognition can be completed within minutes, which is difficult to achieve in practice. A more realistic option is to acquire the training image data shortly before surveying the scene of interest; computationally intensive or interactive learning algorithms can then be applied. We illustrate this approach qualitatively by detecting buildings and asphalt roads in a typical urban scene imaged by the AISA hyperspectral sensor. The second, new approach we derive takes into account the joint changes of all targets and backgrounds under dynamic external factors. It requires multitemporal surveying of an area specially selected for training the ATR system; at the future recognition stage the system can then exploit the learning results in real time. Experimental verification of the new approach used a fixed FLIR-type camera surveying a site containing more than 50 thermally distinct objects, with learning and recognition spaced one week apart. The joint thermal prediction model proved effective and was applied to detecting and identifying a scene anomaly: an intruder.
© (2003) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Alexander Koltunov, Joseph Koltunov, and Eyal Ben-Dor, "Adaptive recognition under static and dynamic environment assumptions", Proc. SPIE 5093, Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery IX (23 September 2003); https://doi.org/10.1117/12.497020
Proceedings, 12 pages

