22 September 2020 In-field cotton boll counting based on a deep neural network of density level classification

The development of computer vision technology has been widely used to increase the level of agricultural intelligence. Crop counting, an application of image counting, plays a fundamental role in agricultural information automation. However, the complex cotton field environment is likely to lead to incorrect detection of target positions or fragmented segmentation results, decreasing counting accuracy. Despite this, computer vision technologies have shown great potential to solve this task effectively. To address the problem of multimode cotton boll counting in a complicated environment, an in-field cotton boll counting algorithm based on density classification is proposed. First, the algorithm encodes global context information with a density level classification estimator. Then, the input images are converted into high-dimensional feature maps by a density map estimator with a multicolumn structure. Finally, through a feature fusion neural network, the classification information is combined with the high-dimensional feature maps to generate a high-quality density map, from which the cotton bolls are counted. In particular, we collected and labeled a cotton boll counting dataset of 758 high-resolution images for experiments and comparison, which can be divided by environmental conditions and observation sites. In addition, the relationship between cotton yield and cotton boll counts can be assessed with the proposed algorithm. Experimental results demonstrate that the proposed algorithm achieves lower counting error and better effectiveness and robustness than comparative algorithms.
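The counting step described above follows the standard density-map paradigm: a unit-mass Gaussian kernel is placed at each annotated boll center, and the count is obtained by integrating (summing) the resulting density map. The sketch below illustrates that idea only; the function names, kernel size, and sigma are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """A normalized 2-D Gaussian kernel whose entries sum to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def density_map(shape, points, sigma=4.0, ksize=15):
    """Place a unit-mass Gaussian at each annotated boll center.

    Because every kernel sums to 1, the sum of the map approximates
    the object count (exactly, when kernels fit inside the image).
    """
    dm = np.zeros(shape, dtype=np.float64)
    k = gaussian_kernel(ksize, sigma)
    r = ksize // 2
    h, w = shape
    for (y, x) in points:
        # Clip the kernel window at the image borders.
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        ky0, ky1 = y0 - (y - r), ksize - ((y + r + 1) - y1)
        kx0, kx1 = x0 - (x - r), ksize - ((x + r + 1) - x1)
        dm[y0:y1, x0:x1] += k[ky0:ky1, kx0:kx1]
    return dm

# Three annotated boll centers -> the density map sums to ~3.
pts = [(50, 50), (30, 70), (80, 20)]
dm = density_map((100, 100), pts)
count = dm.sum()
```

In a learned counter such as the one proposed here, the network regresses this density map from the image, so summing its output yields the predicted boll count.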

© 2020 SPIE and IS&T 1017-9909/2020/$28.00
Ziyun Huang, Yanan Li, and Haihui Wang "In-field cotton boll counting based on a deep neural network of density level classification," Journal of Electronic Imaging 29(5), 053009 (22 September 2020).
Received: 12 May 2020; Accepted: 4 September 2020; Published: 22 September 2020
