Content-dependent on-the-fly visual information fusion for battlefield scenarios
Mathieu Aubailly, Mikhail A. Vorontsov, Gary Carhart, J. Jiang Liu, Richard Espinola
Abstract
We report on a cooperative research program between the Army Research Laboratory (ARL), the Night Vision and Electronic Sensors Directorate (NVESD), and the University of Maryland (UMD). The program aims to develop advanced on-the-fly atmospheric image processing techniques based on local information fusion from one or more monochrome and color live video streams captured by imaging sensors in combat or reconnaissance situations. Local information fusion can be driven by various local metrics, including local image quality, local image-area motion, and other spatio-temporal characteristics of image content. The tools developed in this program identify and fuse critical information to enhance target identification and situational understanding under conditions of severe atmospheric turbulence.
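To make the local-fusion idea concrete, the following is a minimal Python sketch of content-dependent fusion across co-registered video frames. It uses the variance of a discrete Laplacian as an assumed stand-in for the local image-quality metric; the block size and the fuse_frames() helper are illustrative assumptions, not the authors' published algorithm.

import numpy as np

def laplacian(img: np.ndarray) -> np.ndarray:
    """4-neighbor discrete Laplacian (zero-padded borders)."""
    lap = np.zeros_like(img, dtype=np.float64)
    lap[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                       img[1:-1, :-2] + img[1:-1, 2:] -
                       4.0 * img[1:-1, 1:-1])
    return lap

def fuse_frames(frames: np.ndarray, block: int = 16) -> np.ndarray:
    """Fuse a stack of co-registered grayscale frames (N, H, W):
    for each block, keep the pixels from the frame whose block
    scores highest on the local sharpness metric."""
    n, h, w = frames.shape
    fused = np.empty((h, w), dtype=frames.dtype)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tiles = frames[:, y:y+block, x:x+block].astype(np.float64)
            # Local image-quality score per frame for this block
            # (here: Laplacian variance, an assumed sharpness proxy).
            scores = [laplacian(t).var() for t in tiles]
            best = int(np.argmax(scores))
            fused[y:y+block, x:x+block] = frames[best, y:y+block, x:x+block]
    return fused

if __name__ == "__main__":
    # Synthetic example: 8 noisy observations of one scene.
    rng = np.random.default_rng(0)
    scene = rng.random((128, 128))
    frames = np.stack([scene + 0.1 * rng.standard_normal(scene.shape)
                       for _ in range(8)])
    print(fuse_frames(frames).shape)  # (128, 128)

In an on-the-fly setting, a scheme along these lines would run per incoming frame, replacing only those blocks where the new frame's local quality exceeds that of the current fused image; the per-block metric could equally be a local motion or spatio-temporal measure as described above.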
Mathieu Aubailly, Mikhail A. Vorontsov, Gary Carhart, J. Jiang Liu, and Richard Espinola, "Content-dependent on-the-fly visual information fusion for battlefield scenarios," Proc. SPIE 8368, Photonic Applications for Aerospace, Transportation, and Harsh Environment III, 83680J (10 May 2012); https://doi.org/10.1117/12.918681