1 July 1991 Atmospheric propagation effects on pattern recognition by neural networks
Smart electro-optical systems of the future will need to be adaptive and robust to function in different environments. In 1989 the authors reported how atmospheric losses in contrast, resolution, edge detail, and signal-to-noise ratio adversely affect image-based classification using linear matched filters, and how the atmosphere alters features such as gray-level moments. They also showed, however, that the performance changes with atmospheric path radiance and transmittance are predictable, and that some effects can be mitigated automatically by including the atmosphere as a separate training class. This paper extends that analysis to atmospheric effects on pattern recognition by neural network classifiers. The neural-net pattern recognition methods considered here are single- and multi-layer perceptron networks trained with back-propagation. Image-classifier performance under different atmospheric propagation conditions is shown to be easily predicted for simple single-layer neural nets. This leads to a specific training strategy that minimizes the impact of propagation losses by including the atmosphere as a separate training class. The same strategy also improves the performance of multi-layer neural networks. Examples are given of classification of a vehicle partly obscured by highly scattering white smoke and by highly absorptive black smoke. Other methods that affect the performance and training-convergence properties of neural-net pattern recognition in atmospheres are also under investigation.
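The training strategy described above can be illustrated with a minimal sketch. This is not the authors' code: it is a small multi-layer perceptron trained with back-propagation on synthetic feature vectors (stand-ins for image features such as gray-level moments), where "atmosphere" (smoke/haze) is included as a third training class alongside the target and background classes. The class centers, feature dimension, and network sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 4-dimensional feature vectors (e.g., gray-level moments)
# for three classes: 0 = vehicle, 1 = background, 2 = atmosphere.
# Centers and noise level are arbitrary choices for illustration.
centers = np.array([[2.0, 0.5, 1.0, 0.0],
                    [0.0, 1.5, 0.2, 1.0],
                    [1.0, 1.0, 2.0, 2.0]])
X = np.vstack([c + 0.1 * rng.standard_normal((50, 4)) for c in centers])
y = np.repeat(np.arange(3), 50)
T = np.eye(3)[y]                      # one-hot targets

# One hidden layer (sigmoid) and a softmax output layer.
W1 = 0.1 * rng.standard_normal((4, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.standard_normal((8, 3)); b2 = np.zeros(3)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

lr = 0.5
for epoch in range(300):
    # Forward pass.
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # hidden activations
    P = softmax(H @ W2 + b2)                   # class probabilities
    # Back-propagation of the cross-entropy error.
    dZ2 = (P - T) / len(X)
    dW2 = H.T @ dZ2; db2 = dZ2.sum(0)
    dH = (dZ2 @ W2.T) * H * (1 - H)
    dW1 = X.T @ dH; db1 = dH.sum(0)
    # Gradient-descent weight updates.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Training accuracy with the atmosphere class included.
H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
pred = softmax(H @ W2 + b2).argmax(axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The point of the extra output node is that heavily obscured pixels or chips can be assigned to the "atmosphere" class rather than forced into a target class, which is one way the propagation losses described in the abstract can be mitigated automatically.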
© 1991 Society of Photo-Optical Instrumentation Engineers (SPIE).
John C. Giever and Donald W. Hoock Jr., "Atmospheric propagation effects on pattern recognition by neural networks", Proc. SPIE 1486, Characterization, Propagation, and Simulation of Sources and Backgrounds (1 July 1991).
