24 November 2014 A color fusion method of infrared and low-light-level images based on visual perception
Proceedings Volume 9301, International Symposium on Optoelectronic Technology and Application 2014: Image Processing and Pattern Recognition; 93013E (2014) https://doi.org/10.1117/12.2073183
Event: International Symposium on Optoelectronic Technology and Application 2014, 2014, Beijing, China
Abstract
A color fusion image can be obtained by fusing infrared and low-light-level images, so that it contains the information of both sources. Fusion images help observers understand multichannel imagery comprehensively. However, simple fusion may lose target information, because targets are inconspicuous in long-distance infrared and low-light-level images; and if target extraction is applied blindly, the perception of scene information is seriously affected. To solve this problem, a new fusion method based on visual perception is proposed in this paper. The extraction of visual targets ("what" information) and a parallel processing mechanism are applied to traditional color fusion methods. Infrared and low-light-level color fusion images are achieved based on efficient learning of typical targets. Experimental results show the effectiveness of the proposed method. The fusion images produced by our algorithm not only improve the target detection rate, but also retain rich natural scene information.
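To illustrate the kind of "traditional color fusion" the abstract builds on, below is a minimal sketch of a common false-color mapping for infrared (IR) and low-light-level (LLL) imagery. This is not the authors' perception-based method; the function name, the channel assignment (IR to red, LLL to green, their difference to blue), and the list-of-rows image representation are all illustrative assumptions.

```python
# Minimal false-color fusion sketch (assumed scheme, not the paper's method):
# the LLL image carries natural scene detail, so it drives the green channel;
# the IR image highlights warm targets, so it drives the red channel;
# the clipped LLL-minus-IR difference fills the blue channel.

def fuse_false_color(lll, ir):
    """Fuse two grayscale images (lists of rows, pixel values 0-255)
    into an RGB image of (r, g, b) tuples."""
    fused = []
    for row_l, row_i in zip(lll, ir):
        row = []
        for l, i in zip(row_l, row_i):
            r = i                  # IR -> red: warm targets stand out
            g = l                  # LLL -> green: natural scene detail
            b = max(0, l - i)      # difference -> blue, clipped at zero
            row.append((r, g, b))
        fused.append(row)
    return fused

# Tiny 2x2 example: a warm target appears in the IR channel only.
lll = [[100, 120], [110, 130]]
ir = [[30, 200], [40, 210]]
print(fuse_false_color(lll, ir))
```

In such a mapping, a target that is bright in IR but dim in LLL shows up as a strongly red region, which is exactly the cue the paper's target-extraction stage aims to preserve without sacrificing scene context.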
© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jing Han, Minmin Yan, Yi Zhang, and Lianfa Bai "A color fusion method of infrared and low-light-level images based on visual perception", Proc. SPIE 9301, International Symposium on Optoelectronic Technology and Application 2014: Image Processing and Pattern Recognition, 93013E (24 November 2014); https://doi.org/10.1117/12.2073183
PROCEEDINGS
6 PAGES

