The coconut tree is an important cash crop in coastal tropical regions. These trees suffer substantial damage during annual cyclones, and early, accurate assessment of that damage is important for farmers to receive appropriate monetary compensation. However, the identification of coconut trees presents multiple challenges: (i) the trees grow interspersed with other vegetation and (ii) cloud cover during monsoons and cyclones rules out conventional optical imaging. Although automated approaches based on classical machine learning techniques have been applied to other crops in remotely sensed images, they are not adequate for distinguishing coconut trees from other land-cover types. We present a hybrid approach, termed CocoNet, that combines (i) unsupervised K-means for pixel-based coarse classification of land cover and (ii) a patch-based convolutional neural network for fine classification of vegetation to identify coconut farms. The challenge posed by cloud cover is addressed using synthetic aperture radar (SAR) images. Change detection is then performed on bitemporal Sentinel-1 SAR images acquired before and after a cyclone to obtain a change map depicting the damage caused. Experimental results show that the proposed two-phase CocoNet model achieves a classification accuracy of 92.5% and a change detection accuracy of 89.1%.
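To make the two-phase classification concrete, the sketch below pairs scikit-learn K-means for coarse, pixel-wise land-cover clustering with a small Keras CNN that labels fixed-size vegetation patches. It is a minimal illustration of the kind of pipeline described above, not the authors' implementation; the band count, patch size, number of clusters, and network layout are all assumptions.

# Minimal sketch of a coarse-to-fine classifier in the spirit of CocoNet.
# Band count, patch size, cluster count, and CNN layout are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from tensorflow import keras

def coarse_landcover(image, n_clusters=4):
    # Phase 1: unsupervised K-means over per-pixel band vectors.
    h, w, bands = image.shape
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(
        image.reshape(-1, bands))
    return labels.reshape(h, w)

def build_patch_cnn(patch_size=16, bands=4, n_classes=2):
    # Phase 2: small CNN that labels vegetation patches (coconut vs. other).
    return keras.Sequential([
        keras.layers.Input(shape=(patch_size, patch_size, bands)),
        keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        keras.layers.MaxPooling2D(),
        keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dense(n_classes, activation="softmax"),
    ])

# Synthetic data stands in for a multispectral tile and extracted vegetation patches.
tile = np.random.rand(128, 128, 4).astype("float32")
coarse_map = coarse_landcover(tile)
patches = np.random.rand(32, 16, 16, 4).astype("float32")
cnn = build_patch_cnn()
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
print(coarse_map.shape, cnn.predict(patches, verbose=0).shape)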
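The bitemporal change-detection step can likewise be illustrated with a standard log-ratio comparison of pre- and post-cyclone SAR backscatter. The global thresholding rule below (mean plus k standard deviations of the log-ratio) is a common baseline and an assumption here, not necessarily the rule used in the paper.

# Hypothetical log-ratio change detection on co-registered pre/post-cyclone
# SAR backscatter; the global threshold is a simple baseline assumption.
import numpy as np

def change_map(pre, post, eps=1e-6, k=2.0):
    log_ratio = np.log((post + eps) / (pre + eps))   # backscatter change per pixel
    mu, sigma = log_ratio.mean(), log_ratio.std()
    return np.abs(log_ratio - mu) > k * sigma        # True where change is anomalous

pre = np.random.rand(256, 256).astype("float32")     # stand-in pre-event image
post = np.random.rand(256, 256).astype("float32")    # stand-in post-event image
print(change_map(pre, post).mean())                  # fraction of pixels flagged as changed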
Keywords: synthetic aperture radar, machine learning, backscatter, agriculture, buildings, expectation maximization algorithms, scattering