Transforming optical image data into a SAR system's range-based image space
Abstract
The fusion of image data from different sensor types is an important processing step in many remote sensing applications to maximize the information retrieved from a given area of interest. The basic procedure is to select a common coordinate system and resample all data into this new image space. Usually this is done by orthorectifying the different image spaces, i.e., transforming each image's projection plane to a geographic coordinate system. Unfortunately, resampling the slant-range-based image space of a spaceborne synthetic aperture radar (SAR) to such a coordinate system strongly distorts its content and therefore reduces the amount of extractable information: the complex SAR signatures, which are already hard to interpret in the original data, become even harder to understand. To preserve maximum information, this paper presents an approach that transforms optical images into the radar image space instead. This is accomplished by taking an optical image together with a digital elevation model and projecting it onto the same slant-range image plane as the one used during the radar image acquisition. The whole process is demonstrated in detail with practical examples.
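As a rough illustration of the projection step described above, the following sketch (not taken from the paper) maps an orthorectified optical image, together with a co-registered DEM, into an azimuth/slant-range grid. It computes the slant range from an assumed straight, side-looking sensor track to every terrain point and accumulates the optical intensities into range bins; all function names, parameters, and the simplified flat-track geometry are assumptions for illustration only.

```python
# Minimal sketch, assuming an orthorectified optical image co-registered with
# the DEM and a straight SAR track along the y-axis at constant height.
# Nearest-bin accumulation stands in for a full resampling scheme.
import numpy as np

def optical_to_slant_range(optical, dem, x_coords, y_coords,
                           sensor_x, sensor_z, range_res, n_range_bins):
    """Project an orthorectified optical image into slant-range geometry.

    optical      : 2-D array of optical intensities (same grid as dem)
    dem          : 2-D array of terrain heights [m]
    x_coords     : 1-D array of cross-track (ground-range) coordinates [m]
    y_coords     : 1-D array of along-track (azimuth) coordinates [m]
    sensor_x     : cross-track position of the sensor track [m]
    sensor_z     : sensor height above the reference plane [m]
    range_res    : slant-range bin size [m]
    n_range_bins : number of slant-range bins
    """
    X, _ = np.meshgrid(x_coords, y_coords)  # cross-track position per pixel
    # Slant range from the side-looking sensor track to each terrain point.
    slant = np.sqrt((X - sensor_x) ** 2 + (sensor_z - dem) ** 2)
    range_bin = np.round(slant / range_res).astype(int)

    out = np.zeros((len(y_coords), n_range_bins))
    counts = np.zeros_like(out)
    valid = (range_bin >= 0) & (range_bin < n_range_bins)
    rows = np.repeat(np.arange(len(y_coords)), len(x_coords)).reshape(X.shape)
    # Several ground pixels may map to the same azimuth/range cell (layover),
    # so intensities falling into one cell are averaged.
    np.add.at(out, (rows[valid], range_bin[valid]), optical[valid])
    np.add.at(counts, (rows[valid], range_bin[valid]), 1)
    return np.divide(out, counts, out=np.zeros_like(out), where=counts > 0)
```

Each azimuth line is treated independently here, which mirrors how layover compresses multiple ground points into a single range cell; a faithful reproduction of the paper's method would use the actual sensor trajectory and acquisition geometry instead of the flat, straight track assumed in this sketch.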
© 2013 Society of Photo-Optical Instrumentation Engineers (SPIE).
H. Anglberger, R. Speck, H. Suess, "Transforming optical image data into a SAR system's range-based image space," Proc. SPIE 8714, Radar Sensor Technology XVII, 871411 (31 May 2013); https://doi.org/10.1117/12.2018395