Multiscale image fusion through guided filtering (24 October 2016)
We introduce a multiscale image fusion scheme based on guided filtering. Guided filtering can effectively reduce noise while preserving detail boundaries. When applied iteratively, guided filtering selectively eliminates small-scale details while restoring larger-scale edges. The proposed multiscale image fusion scheme achieves optimal spatial consistency by using guided filtering both at the decomposition and at the recombination stage of the multiscale fusion process. First, size-selective iterative guided filtering is applied to decompose the source images into base and detail layers at multiple levels of resolution. Then, frequency-tuned filtering is used to compute saliency maps at successive levels of resolution. Next, at each resolution level a binary weighting map is obtained as the pixelwise maximum of the corresponding source saliency maps. Guided filtering of the binary weighting maps, with their corresponding source images as guidance images, serves to reduce noise and to restore spatial consistency. The final fused image is obtained as the weighted recombination of the individual detail layers and the mean of the lowest-resolution base layers. Application to multiband visual (intensified) and thermal infrared imagery demonstrates that the proposed method achieves state-of-the-art performance for the fusion of multispectral night-vision images. The method has a simple implementation and is computationally efficient.
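The pipeline sketched in the abstract can be illustrated at a single decomposition level. The following is a minimal sketch in pure NumPy, not the authors' implementation: the guided filter follows He et al.'s standard formulation, a simple detail-magnitude saliency stands in for the paper's frequency-tuned filtering, and the radius `r`, regularization `eps`, and function names are illustrative assumptions.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_mean(x, r):
    """Mean filter with a (2r+1) x (2r+1) window, edge-padded."""
    p = np.pad(x, r, mode="edge")
    return sliding_window_view(p, (2 * r + 1, 2 * r + 1)).mean(axis=(-2, -1))

def guided_filter(guide, src, r=4, eps=1e-3):
    """Edge-preserving smoothing of src, steered by guide (He et al.)."""
    m_i, m_p = box_mean(guide, r), box_mean(src, r)
    var_i = box_mean(guide * guide, r) - m_i * m_i
    cov_ip = box_mean(guide * src, r) - m_i * m_p
    a = cov_ip / (var_i + eps)   # ~1 near strong edges, ~0 in flat regions
    b = m_p - a * m_i
    return box_mean(a, r) * guide + box_mean(b, r)

def fuse(img_a, img_b, r=4, eps=1e-3):
    """Single-level base/detail fusion of two registered source images."""
    base_a = guided_filter(img_a, img_a, r, eps)   # self-guided smoothing
    base_b = guided_filter(img_b, img_b, r, eps)
    det_a, det_b = img_a - base_a, img_b - base_b
    # Binary weight map: pixelwise winner between the two saliency maps
    # (here simply the absolute detail magnitudes).
    w = (np.abs(det_a) >= np.abs(det_b)).astype(float)
    # Guided filtering of the binary weights with a source image as guide
    # reduces noise and restores spatial consistency.
    w = np.clip(guided_filter(img_a, w, r, eps), 0.0, 1.0)
    return 0.5 * (base_a + base_b) + w * det_a + (1.0 - w) * det_b
```

In the full multiscale scheme, `guided_filter` would be applied iteratively with increasing support to produce a stack of base/detail layers, and the weighted recombination above would be repeated per level before adding back the mean of the coarsest base layers.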
© (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Alexander Toet and Maarten A. Hogervorst "Multiscale image fusion through guided filtering", Proc. SPIE 9997, Target and Background Signatures II, 99970J (24 October 2016);
