Paper
3 October 2024
Spatiotemporal satellite image fusion using nanosatellite data
Yeji Kim, Hyun Ok Kim
Abstract
As the volume of Earth observation satellite data grows, higher temporal and spatial resolutions become crucial for accurate monitoring and decision-making. Achieving both high temporal and high spatial resolution is challenging due to trade-offs in sensor design; for instance, Sentinel-2 and Landsat offer higher temporal but lower spatial resolution, while high-resolution sensors such as NEONSAT cover only small areas. This study introduces a deep learning-based spatiotemporal image fusion method that integrates multi-sensor data, combining low and high spatial resolution images acquired by different sensors over time. The method estimates adjustment features from temporal and spatial differences, using fusion and convolutional blocks to enhance resolution. Trained on Sentinel-2 and Planet images, the method effectively preserves spectral integrity and enhances spatial detail under varying conditions. By leveraging multi-sensor data, this approach addresses sensor quality and stability issues, expanding NEONSAT's potential applications. Future research will refine the method by incorporating more datasets, including NEONSAT imagery, to advance spatiotemporal fusion techniques.
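The core idea behind spatiotemporal fusion can be illustrated with a toy additive baseline: inject the temporal change observed by the frequent, coarse-resolution sensor into a detailed image from the high-resolution sensor. This is a minimal NumPy sketch of that baseline only, not the paper's learned method (which estimates adjustment features with fusion and convolutional blocks); the array names and the assumption that the coarse images are already upsampled and co-registered to the fine grid are mine, for illustration.

```python
import numpy as np

def fuse_spatiotemporal(high_t1, low_t1, low_t2):
    """Toy additive fusion baseline (not the paper's deep learning method):
    predict the high-resolution image at time t2 by adding the temporal
    change seen in the coarse-resolution series to the high-resolution
    image from t1. All inputs are assumed co-registered on the fine grid."""
    temporal_diff = low_t2 - low_t1   # change signal from the frequent sensor
    return high_t1 + temporal_diff    # inject change into the detailed image

# Synthetic single-band 4x4 example (hypothetical reflectance values)
high_t1 = np.full((4, 4), 0.30)  # fine-resolution image at t1
low_t1 = np.full((4, 4), 0.28)   # coarse image at t1, resampled to fine grid
low_t2 = np.full((4, 4), 0.38)   # coarse image at t2: a +0.10 change occurred
pred_t2 = fuse_spatiotemporal(high_t1, low_t1, low_t2)
print(pred_t2[0, 0])  # close to 0.40: the fine image carries the coarse change
```

A learned method improves on this baseline by replacing the fixed additive rule with adjustment features estimated from both the temporal and the spatial (coarse-vs-fine) differences, which lets the network correct sensor-specific radiometric offsets rather than assuming the change transfers one-to-one.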
© (2024) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yeji Kim and Hyun Ok Kim "Spatiotemporal satellite image fusion using nanosatellite data", Proc. SPIE 13143, Earth Observing Systems XXIX, 131431A (3 October 2024); https://doi.org/10.1117/12.3029007
KEYWORDS
Image fusion
Spatial resolution
Satellites
Image sensors
Earth observing sensors
Sensors
Temporal resolution