1. INTRODUCTION

Precision Agriculture (PA) has gained global significance for optimizing natural resource utilization and crop yield while reducing losses and waste.[1] PA involves data collection to quantify spatial and temporal variations in agricultural units, serving as a site-specific management strategy that employs information technologies to support crucial decisions in crop production.[2] Recent advances in the Internet of Things (IoT) and Machine Learning are enhancing PA's accuracy, increasing the quantity and quality of production, reducing farmers' costs, and contributing to more sustainable agriculture.[3] A major challenge in modern agriculture is weed control: undesired plants compete with crops for resources such as water, light, nutrients, and growth space.[4] Herbicide resistance, often caused by continuous chemical applications, leads to environmental harm and threatens ecosystem species. The rise of resistant weeds has drawn attention from farmers and specialists because of its causes related to herbicide application practices and genetic variation.[5] These factors motivate more precise methods, such as localized input applications and advanced PA technologies. Farmers' interest in selective herbicide spraying systems is growing, as this equipment is emerging as an efficient, affordable, sustainable, and profitable technology.[6] Most available systems use chlorophyll detectors for weed detection, but these are limited to pre-emergence herbicide applications, since they detect green material (weeds) against fallow ground and cannot distinguish crops from weeds. Computer Vision has emerged as a promising solution for image-based weed detection and recognition among crops, enabling selective herbicide spraying in both pre- and post-emergence scenarios.
The YOLO ("You Only Look Once") algorithm, known for its real-time object detection capabilities, has been widely used in weed recognition tasks, showing success across various applications and weed species.[7–12] Multispectral imaging consists of capturing images of the same scene at different wavelengths. It has been widely used in remote sensing for productivity mapping,[13] weed mapping,[14] plant density estimation,[15] and plant disease detection and diagnosis.[16] For detection tasks under both artificial and natural lighting conditions, the spectral signature of plants plays an important role, since different light wavelengths are reflected with distinct intensities by the leaf structure.[17] This paper describes the use of multispectral images for in-crop weed recognition with the YOLO algorithm in an artificially lit environment. Three important weed species were used (Amaranthus viridis L., Bidens pilosa L., and Digitaria horizontalis Willd.), and soybean (Glycine max (L.) Merrill) was chosen as the crop plant due to its economic importance in Brazilian agriculture. Plants were grown in an indoor greenhouse, and a dataset of 3,775 images was built using a multispectral camera system with five cameras: RGB, green (G), red (R), near-infrared (NIR), and infrared (IR).

2. METHODS

The experiments were conducted in an indoor greenhouse laboratory at the São Carlos Institute of Physics, University of São Paulo, under controlled conditions: a temperature of 25 °C and a regulated photoperiod of 12 hours of light followed by 12 hours of darkness. The indoor greenhouse was equipped with ten LED lamps designed for plant growth, two white ceiling LED lights providing visible-spectrum illumination, and ten halogen lamps providing near-infrared illumination. The cumulative spectrum produced for both plant growth and image capture was measured with an Ocean Optics spectrometer (Ocean Optics, USA) and is presented in Figure 1.
To automate image acquisition and emulate the motion of a spray tractor navigating through crop rows, a v-slot rail system was built on a wooden frame (see Figure 2). The architecture comprises two v-slot rails on which the cameras are mounted, a NEMA 17 stepper motor connected via GT2 belts and pulleys, and a CNC Shield housing an Arduino Nano that regulates the motor's steps, thereby controlling the camera system's motion. The camera arrangement comprises four monochromatic CMOS cameras (ELP - OSMO B/W, China) and a color RGB CMOS camera (ELP-USBFHD01M, China), each equipped with a 6 mm focal length lens. Both camera models feature 2-megapixel OV2710 CMOS sensors (1920 × 1080 pixels). Communication between the cameras and a desktop computer occurred over USB 2.0. Three monochromatic bandpass filters matching green (G: 501–525 nm), red (R: 654–674 nm), and near-infrared (NIR: 761–829 nm) wavelengths were placed over the lenses of three of the four monochromatic cameras, allowing only light of the predetermined wavelengths to reach each sensor. The fourth monochromatic camera used an infrared longpass filter (IR: > 780 nm) on its lens. The RGB camera used no additional filter, retaining only the factory KG1 filter that blocks infrared light from reaching the sensor. The camera support structure was designed in SolidWorks (SolidWorks Corporation, USA) and fabricated on an Ender 3D printer (Ender, China) to ensure that all five cameras were aligned along the system's direction of movement. Figure 3 shows the arrangement of the cameras on the support. To facilitate plant growth, two trays filled with commercial soil were placed directly beneath the illumination bench and the v-slot rail system. Soybean plants were sown in two parallel rows, while weed plants were randomly distributed across the cultivation trays.
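As a minimal sketch of the motion control described above, the conversion from desired carriage travel to stepper step counts for a GT2 belt drive can be computed as follows. The pulley tooth count and microstepping factor are illustrative assumptions; the paper does not state these values.

```python
# Hedged sketch: converting a desired camera-carriage travel distance into
# NEMA 17 step counts for a GT2 belt drive. The 20-tooth pulley and 1/16
# microstepping are assumed values, not taken from the paper.
STEPS_PER_REV = 200   # full steps per revolution, typical for a NEMA 17
MICROSTEPS = 16       # driver microstepping factor (assumed)
GT2_PITCH_MM = 2.0    # GT2 belt tooth pitch is 2 mm
PULLEY_TEETH = 20     # pulley tooth count (assumed)

def steps_for_travel(distance_mm: float) -> int:
    """Return the number of (micro)steps to move the carriage distance_mm."""
    mm_per_rev = GT2_PITCH_MM * PULLEY_TEETH                 # 40 mm of belt per revolution
    steps_per_mm = STEPS_PER_REV * MICROSTEPS / mm_per_rev   # 80 microsteps per mm here
    return round(distance_mm * steps_per_mm)

# e.g. advancing the camera rig 100 mm between capture positions:
print(steps_for_travel(100.0))  # 8000
```

In practice the Arduino Nano on the CNC Shield would receive a step count like this and pulse the driver accordingly; the arithmetic is the same regardless of firmware.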
Over the course of a month-long experiment, a total of 3,775 images were acquired, with each camera contributing 755 images. The images were labeled with the bounding box technique in the Computer Vision Annotation Tool (CVAT) software, using three classes: soybean (Glycine max (L.) Merrill plants), weed (broadleaf weeds of the species Amaranthus viridis L. and Bidens pilosa L.), and grass (grassy weeds of the species Digitaria horizontalis Willd.). Figure 4 shows examples of images captured in the different bands. The YOLO algorithm was employed, with its fundamental architecture unchanged, for each of the individual spectral camera images as well as the RGB images. The dataset was split into three subsets: 70% of the images for training, 20% for validation, and 10% for testing, keeping the same number of images for each band. Training was executed over 3,000 epochs, using early stopping with the patience value set to 100. To evaluate the performance of the different models, the following metrics were used: precision, recall, mAP(0.5), and mAP(0.5:0.95). Precision measures how accurate the model's classifications are; recall measures how many actual positives the model captures by labeling them as true positives; and mAP incorporates the trade-off between precision and recall.

3. RESULTS

The five models were evaluated using the same 375 images (75 for each band and for the RGB images). Table 1 summarizes the results, presenting the metrics for each model.

Table 1: Results from tests performed on images in the different bands and RGB.
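The detection metrics above can be sketched in a few lines: precision and recall follow from matching predicted boxes to ground-truth boxes by intersection-over-union (IoU). The boxes, the greedy matching strategy, and the 0.5 threshold below are illustrative; the paper's reported numbers come from the YOLO evaluation toolchain, not from this code.

```python
# Hedged sketch of the evaluation metrics: precision and recall from
# IoU-matched bounding boxes. Boxes are (x1, y1, x2, y2) tuples.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def precision_recall(preds, gts, iou_thr=0.5):
    """Greedy one-to-one matching of predictions to ground truths."""
    matched = set()
    tp = 0
    for p in preds:
        for i, g in enumerate(gts):
            if i not in matched and iou(p, g) >= iou_thr:
                matched.add(i)
                tp += 1
                break
    fp = len(preds) - tp   # unmatched predictions
    fn = len(gts) - tp     # missed ground truths
    precision = tp / (tp + fp) if preds else 0.0
    recall = tp / (tp + fn) if gts else 0.0
    return precision, recall
```

mAP(0.5) extends this idea by averaging precision over recall levels at an IoU threshold of 0.5, and mAP(0.5:0.95) averages that quantity over IoU thresholds from 0.5 to 0.95 in steps of 0.05.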
Experimental results indicate the best performance for the IR band, with precision of 90.5%, recall of 89.3%, mAP(0.5) of 92.8%, and mAP(0.5:0.95) of 72.5%. Figure 5 presents examples of detection and classification of weeds and soybean in the different bands.

4. CONCLUSION

This paper described the development of a multispectral camera and a v-slot rail system to capture images of plants in an indoor greenhouse with artificial lighting. A multispectral image dataset consisting of 3,775 images of weeds and soybean plants was assembled using four monochromatic bands (G, R, NIR, and IR) and an RGB camera. The YOLO algorithm was employed to detect weeds among soybean plants using the five types of acquired images. Experimental results reveal that the longpass infrared band achieved the highest precision and recall (0.905 and 0.893, respectively), followed by RGB (0.870 and 0.861) and the near-infrared band (0.836 and 0.879), demonstrating the good performance of infrared wavelengths for weed recognition within crop settings. Furthermore, the results demonstrate that Computer Vision offers a promising avenue for post-emergence herbicide applications, given its ability to differentiate between crop plants and weeds.

ACKNOWLEDGMENTS

Supported by grants #2022/15892-3, PIPE/FAPESP 2022/06153-2, São Paulo Research Foundation (FAPESP) and CAPES/PROEX 88887.608664/2021-00.

REFERENCES

[1] Eli-Chukwu, N. C.,
"Applications of artificial intelligence in agriculture: A review," Engineering, Technology & Applied Science Research 9(4) (2019).
[2] Cisternas, I., Velásquez, I., Caro, A., and Rodríguez, A., "Systematic literature review of implementations of precision agriculture," Computers and Electronics in Agriculture 176, 105626 (2020). https://doi.org/10.1016/j.compag.2020.105626
[3] Akhter, R. and Sofi, S. A., "Precision agriculture using IoT data analytics and machine learning," Journal of King Saud University - Computer and Information Sciences 34(8), 5602–5618 (2022). https://doi.org/10.1016/j.jksuci.2021.05.013
[4] Islam, N., Rashid, M. M., Wibowo, S., Xu, C.-Y., Morshed, A., Wasimi, S. A., Moore, S., and Rahman, S. M., "Early weed detection using image processing and machine learning techniques in an Australian chilli farm," Agriculture 11(5), 387 (2021). https://doi.org/10.3390/agriculture11050387
[5] Gaines, T. A., Busi, R., and Küpper, A., "Can new herbicide discovery allow weed management to outpace resistance evolution?," Pest Management Science 77(7), 3036–3041 (2021). https://doi.org/10.1002/ps.v77.7
[6] Liu, B. and Bruch, R., "Weed detection for selective spraying: a review," Current Robotics Reports (2020).
[7] Wang, Q., Cheng, M., Huang, S., Cai, Z., Zhang, J., and Yuan, H., "A deep learning approach incorporating YOLO v5 and attention mechanisms for field real-time detection of the invasive weed Solanum rostratum Dunal seedlings," Computers and Electronics in Agriculture 199, 107194 (2022). https://doi.org/10.1016/j.compag.2022.107194
[8] Osorio, K., Puerto, A., Pedraza, C., Jamaica, D., and Rodríguez, L., "A deep learning approach for weed detection in lettuce crops using multispectral images," AgriEngineering 2(3), 471–488 (2020). https://doi.org/10.3390/agriengineering2030032
[9] Ying, B., Xu, Y., Zhang, S., Shi, Y., and Liu, L., "Weed detection in images of carrot fields based on improved YOLO v4," Traitement du Signal 38(2) (2021). https://doi.org/10.18280/ts
[10] Chen, J., Wang, H., Zhang, H., Luo, T., Wei, D., Long, T., and Wang, Z., "Weed detection in sesame fields using a YOLO model with an enhanced attention mechanism and feature fusion," Computers and Electronics in Agriculture 202, 107412 (2022). https://doi.org/10.1016/j.compag.2022.107412
[11] Hu, D., Ma, C., Tian, Z., Shen, G., and Li, L., "Rice weed detection method on YOLOv4 convolutional neural network," in 2021 International Conference on Artificial Intelligence, Big Data and Algorithms (CAIBDA), 41–45 (2021).
[12] Barnhart, I. H., Lancaster, S., Goodin, D., Spotanski, J., and Dille, J. A., "Use of open-source object detection algorithms to detect Palmer amaranth (Amaranthus palmeri) in soybean," Weed Science 70(6), 648–662 (2022). https://doi.org/10.1017/wsc.2022.53
[13] Fang, P., Yan, N., Wei, P., Zhao, Y., and Zhang, X., "Aboveground biomass mapping of crops supported by improved CASA model and Sentinel-2 multispectral imagery," Remote Sensing 13(14), 2755 (2021). https://doi.org/10.3390/rs13142755
[14] Sa, I., Chen, Z., Popović, M., Khanna, R., Liebisch, F., Nieto, J., and Siegwart, R., "weedNet: Dense semantic weed classification using multispectral images and MAV for smart farming," IEEE Robotics and Automation Letters 3(1), 588–595 (2017). https://doi.org/10.1109/LRA.2017.2774979
[15] Wilke, N., Siegmann, B., Postma, J. A., Muller, O., Krieger, V., Pude, R., and Rascher, U., "Assessment of plant density for barley and wheat using UAV multispectral imagery for high-throughput field phenotyping," Computers and Electronics in Agriculture 189, 106380 (2021). https://doi.org/10.1016/j.compag.2021.106380
[16] Karpyshev, P., Ilin, V., Kalinov, I., Petrovsky, A., and Tsetserukou, D., "Autonomous mobile robot for apple plant disease detection based on CNN and multi-spectral vision system," in 2021 IEEE/SICE International Symposium on System Integration (SII), 157–162 (2021).
[17] Sridhar, B. M., Han, F., Diehl, S., Monts, D., and Su, Y., "Spectral reflectance and leaf internal structure changes of barley plants due to phytoextraction of zinc and cadmium," International Journal of Remote Sensing 28(5), 1041–1054 (2007). https://doi.org/10.1080/01431160500075832