Automatic ground-based in-field cotton (IFC) segmentation is a challenging task in precision agriculture that has not been well addressed. Nearly all existing methods rely on hand-crafted features, whose limited discriminative power results in unsatisfactory performance. To address this, a coarse-to-fine cotton segmentation method termed “DeepCotton” is proposed. It contains two modules: a fully convolutional network (FCN) stream and an interference-region removal stream. First, the FCN is employed to predict an initial coarse map in an end-to-end manner. The convolutional layers in the FCN guarantee powerful feature description, while the regression capability of the network assures segmentation accuracy. To our knowledge, we are the first to introduce deep learning to IFC segmentation. Second, our proposed “UP” algorithm, composed of a unary brightness transformation and a pairwise region comparison, produces an interference map that is used to refine the coarse map. Experiments on the constructed IFC dataset demonstrate that our method outperforms other state-of-the-art approaches, both across common scenarios and on single/multiple-plant images. More remarkably, the “UP” algorithm greatly improves the quality of the coarse result, with average gains of 2.6% and 2.4% in accuracy and 8.1% and 5.5% in intersection over union for common scenarios and multiple plants, respectively.
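The abstract does not give implementation details, so the following is only a minimal sketch of the coarse-to-fine idea: suppressing responses in an FCN's coarse probability map wherever an interference map fires. The function name, inputs, and threshold are illustrative assumptions, not the authors' code.

```python
import numpy as np

def refine_coarse_map(coarse_prob, interference_mask, threshold=0.5):
    """Refine a coarse probability map by suppressing interference regions.

    coarse_prob:       HxW float array in [0, 1], e.g. an FCN softmax output.
    interference_mask: HxW bool array marking interference (non-crop) regions.
    Returns a binary HxW segmentation mask.
    """
    refined = coarse_prob.copy()
    refined[interference_mask] = 0.0           # zero out interference responses
    return (refined >= threshold).astype(np.uint8)

# Toy example: a 2x2 "image" where one confident pixel is flagged as interference.
coarse = np.array([[0.9, 0.8],
                   [0.2, 0.7]])
interf = np.array([[False, True],
                   [False, False]])
mask = refine_coarse_map(coarse, interf)
```

The key design point is that the interference map acts as a veto: it can only remove false positives from the coarse map, never add new foreground.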
Rice yield estimation is an important topic in agricultural research, and rice planting density is one useful factor for it. In this paper, we propose a new method to automatically detect rice planting density from the rice transplanting stage to the rice jointing stage. The method detects planting density from low-level image features of rice image sequences taken in the field. Moreover, an automatic detection method for the rice jointing stage is proposed so as to terminate the density detection algorithm. The validity of both the proposed rice density detection method and the jointing-stage detection method is demonstrated experimentally.
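The abstract does not name the specific low-level features used; one common choice for field imagery is a vegetation index such as excess green (ExG), from which plant coverage can be estimated. The sketch below illustrates that assumption and is not the paper's actual feature pipeline.

```python
import numpy as np

def vegetation_coverage(rgb):
    """Fraction of pixels classified as vegetation via the excess-green index.

    rgb: HxWx3 float array with channels in [0, 1].
    ExG = 2G - R - B; pixels with ExG > 0 are treated as plant material.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b
    return float((exg > 0).mean())

# Toy image: one strongly green pixel and one gray (soil-like) pixel.
img = np.array([[[0.1, 0.8, 0.1],
                 [0.5, 0.5, 0.5]]])
cov = vegetation_coverage(img)
```

A per-frame coverage value of this kind could serve as a low-level feature tracked across an image sequence between transplanting and jointing.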
In this paper, we propose a specularity-invariant crop extraction method using a probabilistic super-pixel Markov random field (MRF). Our method is based on the observation that intensity changes gradually between highlight areas and their neighboring non-highlight areas. This prior knowledge is embedded into the MRF-MAP framework by modeling the local and mutual evidence of nodes. The marginal probability of each node in the label field is then iteratively computed by the belief propagation algorithm, which leads to the final solution. Comparative experiments show that our method outperforms other commonly used extraction methods, yielding the highest performance with the lowest standard deviation.
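For intuition on the inference step, the chain special case of belief propagation admits exact marginals via forward-backward message passing. The sketch below is purely illustrative: the unary values and the Potts-style smoothness prior are made-up numbers standing in for the paper's local and mutual evidence, not its actual model.

```python
import numpy as np

def chain_marginals(unary, pairwise):
    """Exact marginals on a chain MRF via forward-backward message passing,
    the chain special case of belief propagation.

    unary:    Nx2 non-negative per-node evidence (crop vs. background).
    pairwise: 2x2 symmetric compatibility matrix (smoothness prior).
    Returns Nx2 per-node marginal probabilities.
    """
    n = unary.shape[0]
    fwd = np.ones((n, 2))                          # left-to-right messages
    bwd = np.ones((n, 2))                          # right-to-left messages
    for i in range(1, n):
        m = (unary[i - 1] * fwd[i - 1]) @ pairwise
        fwd[i] = m / m.sum()
    for i in range(n - 2, -1, -1):
        m = pairwise @ (unary[i + 1] * bwd[i + 1])
        bwd[i] = m / m.sum()
    belief = unary * fwd * bwd
    return belief / belief.sum(axis=1, keepdims=True)

# Three "super-pixels": strong crop, ambiguous (highlight), strong background.
unary = np.array([[0.9, 0.1],
                  [0.5, 0.5],
                  [0.1, 0.9]])
potts = np.array([[0.8, 0.2],
                  [0.2, 0.8]])                     # favour same-label neighbours
probs = chain_marginals(unary, potts)
labels = probs.argmax(axis=1)
```

On loopy super-pixel graphs, as in the paper, the same message updates are iterated until the marginals converge rather than computed in a single pass.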
Proc. SPIE. 8918, MIPPR 2013: Automatic Target Recognition and Navigation
Automatic observation of field crops has attracted increasing attention recently. Replacing the existing manual observation method with image processing technology enables timely observation and consistent management, and extracting the wheat from field wheat images is the basis for this. To improve the accuracy of wheat segmentation, a novel two-stage wheat image segmentation method is proposed. The training stage tunes several key thresholds to achieve the best segmentation results and records them for use in the segmentation stage; the segmentation stage then compares color-index values against these thresholds to determine the class of each pixel. To verify the superiority of the proposed algorithm, we compared our method with other crop segmentation methods. Experimental results show that the proposed method has the best performance.
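As an illustrative sketch of the two-stage idea (the helper names and the one-dimensional color index are assumptions, not the paper's exact procedure), a threshold can be fitted on labelled training pixels and then applied per pixel at segmentation time:

```python
import numpy as np

def train_threshold(index_vals, labels):
    """Training stage: pick the color-index threshold that best separates
    wheat (label 1) from background (label 0) on labelled training pixels."""
    best_t, best_acc = index_vals.min(), 0.0
    for t in np.unique(index_vals):
        acc = ((index_vals >= t).astype(int) == labels).mean()
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def segment(index_vals, threshold):
    """Segmentation stage: classify each pixel by comparing its index value
    against the threshold recorded during training."""
    return (index_vals >= threshold).astype(np.uint8)

# Toy training data: wheat pixels have higher color-index values than soil.
train_idx = np.array([0.1, 0.2, 0.6, 0.8])
train_lab = np.array([0, 0, 1, 1])
t = train_threshold(train_idx, train_lab)
mask = segment(np.array([0.05, 0.7]), t)
```

Splitting the method this way keeps the per-pixel segmentation stage cheap: all search over threshold values happens once, offline, during training.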
Cotton, as one of the four major economic crops, is of great significance to the development of the national economy. Monitoring cotton growth status by automatic image-based detection makes sense due to its low cost, low labor, and capability for continuous observation. However, little research has been done on close observation of the different growth stages of field crops using digital cameras. We therefore developed algorithms to detect growth information and predict the starting dates of cotton growth stages automatically. In this paper, we introduce an approach for automatically detecting the five-true-leaves stage, a critical growth stage of cotton. Because of interference from illumination and the complex background, global coverage cannot serve as the sole criterion. Consequently, we propose a new method that determines the five-true-leaves stage by detecting the number of nodes between the main stem and the side stems, based on the agricultural meteorological observation specification. The error between the starting date predicted by the proposed algorithm and artificial observations is restricted to no more than one day.
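The node-detection procedure is not described in detail in the abstract; one plausible building block, sketched here purely as an assumption, is counting branch points on a binary skeleton of the plant, where a candidate node is a skeleton pixel with three or more 4-connected skeleton neighbours (a point where a side stem leaves the main stem).

```python
import numpy as np

def count_branch_points(skeleton):
    """Count branch points (candidate stem nodes) in a binary skeleton:
    skeleton pixels with >= 3 skeleton neighbours in the 4-neighbourhood."""
    s = np.pad(skeleton.astype(int), 1)            # zero border simplifies shifts
    neigh = (np.roll(s, 1, axis=0) + np.roll(s, -1, axis=0)
             + np.roll(s, 1, axis=1) + np.roll(s, -1, axis=1))
    return int(((s == 1) & (neigh >= 3)).sum())

# Toy skeleton: a vertical main stem with one side branch -> one node.
skel = np.zeros((5, 5), dtype=np.uint8)
skel[:, 2] = 1          # main stem
skel[2, 3:] = 1         # side stem leaving at row 2
nodes = count_branch_points(skel)
```

In practice such a skeleton would first be extracted from a segmented plant mask (e.g. by morphological thinning), and the branch-point count tracked over days to decide when the fifth node has appeared.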