Weeds significantly impact agriculture, leading to substantial global crop losses and increased production costs. Conventional weed management methods, such as manual labor and herbicide application, are often labor-intensive and raise environmental concerns. This paper explores the integration of unmanned aerial vehicles (UAVs) and advanced machine learning techniques, with a specific focus on deep learning and computer vision, for the precise detection and localization of weeds. The research involves the ongoing collection of a comprehensive dataset of weed images from a strawberry field over the entire growing season. Machine learning models, including the YOLO, Faster R-CNN, and SSD object detectors, are currently being trained on this dataset to accurately detect and localize weeds from a UAV altitude of 10 meters. The models will be validated on the same strawberry fields with new batches of transplantations, ensuring the robustness and generalizability of the detection techniques. The effectiveness of the models is compared to highlight the inherent strengths and weaknesses of each in an agricultural setting. The identified weeds are localized in real time using the UAV, and both monocular and stereo vision are explored as imaging techniques. This integrated approach offers several potential advantages, including reduced production costs, minimized human exposure to harmful chemicals, and decreased reliance on manual labor for weed management. The significance of this ongoing research lies in its potential to revolutionize weed management by providing a reliable and efficient method for weed detection and localization. The study aims to contribute empirical evidence and data, bridging the gap between theoretical frameworks and practical implementation in precision agriculture.
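
To make the detection step concrete, the sketch below illustrates how one of the named detector families (YOLO) could be fine-tuned on UAV imagery and used to report weed bounding boxes; it is a minimal sketch, not the study's actual pipeline, and the package choice (Ultralytics), pretrained checkpoint, dataset configuration file, image file name, and training parameters are all illustrative assumptions.

```python
# Minimal sketch of a weed-detection workflow, assuming a YOLO-family model
# from the Ultralytics package and a hypothetical dataset config "weeds.yaml"
# describing UAV images captured at roughly 10 m altitude.
from ultralytics import YOLO

# Fine-tune from a pretrained checkpoint (epochs and image size are illustrative).
model = YOLO("yolov8n.pt")
model.train(data="weeds.yaml", epochs=100, imgsz=640)

# Run inference on a new UAV frame and report pixel-space bounding boxes;
# a subsequent localization step (monocular or stereo) would map these to
# field coordinates.
results = model.predict("uav_frame.jpg", conf=0.25)
for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()
    print(f"weed at pixel bbox ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f}), "
          f"confidence {float(box.conf):.2f}")
```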