Data fusion of hyperspectral and SAR images
1 August 2004
Abstract
A novel technique is proposed for data fusion in earth remote sensing. The method is developed for land cover classification based on the fusion of remote sensing images of the same scene collected from multiple sources. It presents a framework for the fusion of multisource remote sensing images that consists of two algorithms, referred to as the greedy modular eigenspace (GME) and the feature scale uniformity transformation (FSUT). The GME method extracts features with a simple and efficient GME feature module, while the FSUT fuses the most correlated features from different data sources. Finally, an optimal positive Boolean function based multiclass classifier is developed for classification. It utilizes the positive and negative sample learning ability of the minimum classification error criterion to improve classification accuracy. The performance of the proposed method is evaluated by fusing MODIS/ASTER airborne simulator (MASTER) images and airborne synthetic aperture radar (SAR) images for land cover classification during the PacRim II campaign. Experimental results demonstrate that the proposed fusion approach is effective for land cover classification in earth remote sensing and improves classification precision significantly compared with conventional single-source classification.
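The abstract outlines a feature-level fusion pipeline: eigenspace-based feature extraction per source (GME), a scale-uniformity transformation to make features from different sensors comparable (FSUT), and a classifier on the fused features. The following is only a minimal illustrative sketch of that general pipeline shape, not the authors' algorithms: plain PCA stands in for GME, and min-max scaling stands in for FSUT; all function names and parameters here are assumptions.

```python
import numpy as np

def eigen_features(X, k):
    # PCA-style eigen decomposition: a simplified stand-in for the
    # paper's greedy modular eigenspace (GME) feature extraction.
    # X has shape (n_pixels, n_bands); returns (n_pixels, k).
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # k leading eigenvectors
    return Xc @ top

def scale_uniform(F):
    # Min-max scaling to [0, 1]: a simplified stand-in for the feature
    # scale uniformity transformation (FSUT), bringing features from
    # different sensors onto a comparable scale before fusion.
    fmin, fmax = F.min(axis=0), F.max(axis=0)
    return (F - fmin) / np.where(fmax > fmin, fmax - fmin, 1.0)

def fuse(hyperspectral, sar, k=3):
    # Feature-level fusion: extract k features per source, normalize,
    # and concatenate into one feature vector per pixel, ready for a
    # downstream multiclass classifier.
    fh = scale_uniform(eigen_features(hyperspectral, k))
    fs = scale_uniform(eigen_features(sar, k))
    return np.hstack([fh, fs])
```

For example, fusing a 20-band hyperspectral cube and a 4-channel SAR image (both flattened to pixels x bands) with `k=3` yields a 6-dimensional fused feature vector per pixel, on which any multiclass classifier can then be trained.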
Yang-Lang Chang, Chin-Chuan Han, Hsuan Ren, Chia-Tang Chen, Kun-Shan Chen, Kuo-Chin Fan, "Data fusion of hyperspectral and SAR images," Optical Engineering 43(8), (1 August 2004). https://doi.org/10.1117/1.1768535
Journal article, 11 pages.