The use of remote sensing imagery has become indispensable for long-term crop mapping. Synthetic aperture radar (SAR) provides high-resolution polarimetric observations of canopy changes over time, making large-scale, long-time-series crop mapping possible. This study presents a method for precise crop mapping using Sentinel-1 time series from 2015 to 2020. The proposed method, a temporal-spatial fusion U-Net (TSFUNet), exploits the temporal characteristics of the time series to improve crop mapping accuracy. The model includes a temporal channel fusion module that combines the temporal characteristics of multiple channels. In addition, an improved U-Net with a cascade feature fusion module is introduced to fuse spatial contextual information at different scales and generate the final crop segmentation. Experimental results demonstrate that TSFUNet outperforms other state-of-the-art methods for crop mapping from Sentinel-1 time series. For crop mapping at different times, a multi-stage learning (MSL) training strategy is proposed to improve the model's generalization ability. In addition, the chaotic-Jaya algorithm is employed to optimize the learning-rate hyperparameters in MSL and further improve crop mapping accuracy.
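The abstract does not give implementation details. The following is a minimal, hypothetical PyTorch sketch of how a TSFUNet-style network could be organized: a temporal channel fusion stage ahead of a U-Net-style encoder-decoder that applies cascade feature fusion between scales. All module names, channel widths, the number of time steps and polarizations, and the class count are assumptions for illustration, not values from the paper.

```python
# Hypothetical sketch of a TSFUNet-style architecture; shapes and widths are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalChannelFusion(nn.Module):
    """Collapse the time dimension of a Sentinel-1 stack into fused channels."""
    def __init__(self, n_times, n_pols, out_ch):
        super().__init__()
        # 1x1 convolution over the flattened (time x polarization) channels
        self.fuse = nn.Conv2d(n_times * n_pols, out_ch, kernel_size=1)

    def forward(self, x):                      # x: (B, T, P, H, W)
        b, t, p, h, w = x.shape
        x = x.reshape(b, t * p, h, w)
        return F.relu(self.fuse(x))


class CascadeFeatureFusion(nn.Module):
    """Fuse a low-resolution deep feature with a higher-resolution shallow one."""
    def __init__(self, low_ch, high_ch, out_ch):
        super().__init__()
        self.low = nn.Conv2d(low_ch, out_ch, 3, padding=1)
        self.high = nn.Conv2d(high_ch, out_ch, 1)

    def forward(self, low, high):
        low = F.interpolate(low, size=high.shape[-2:], mode="bilinear",
                            align_corners=False)
        return F.relu(self.low(low) + self.high(high))


class TSFUNet(nn.Module):
    """U-Net-style encoder-decoder with temporal fusion at the input."""
    def __init__(self, n_times=12, n_pols=2, n_classes=5):
        super().__init__()
        self.tcf = TemporalChannelFusion(n_times, n_pols, 32)
        self.enc1 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU())
        self.cff = CascadeFeatureFusion(128, 64, 64)
        self.head = nn.Conv2d(64, n_classes, 1)

    def forward(self, x):
        f0 = self.tcf(x)                       # fuse temporal channels
        e1 = self.enc1(f0)                     # shallow, full-resolution features
        e2 = self.enc2(F.max_pool2d(e1, 2))    # deeper, half-resolution features
        d = self.cff(e2, e1)                   # cascade fusion across scales
        return self.head(d)                    # per-pixel crop class logits


if __name__ == "__main__":
    # e.g. one tile of a 12-step, dual-polarization Sentinel-1 stack
    logits = TSFUNet()(torch.randn(1, 12, 2, 128, 128))
    print(logits.shape)                        # torch.Size([1, 5, 128, 128])
```

This sketch only illustrates the general idea of fusing temporal channels before spatial multi-scale fusion; the paper's actual layer configuration, MSL training schedule, and chaotic-Jaya learning-rate search are not reproduced here.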
Keywords: Education and training; Image segmentation; Machine learning; Remote sensing; Synthetic aperture radar; Data modeling; Semantics