Commercial off-the-shelf UAV and sensor systems are touted as being able to collect remote-sensing data on crops, including spectral reflectance and plant height. Historically, a great deal of effort has gone into quantifying and reducing error in the geometry of UAV-based orthomosaics, but little effort has gone into quantifying and reducing error in reflectance and plant-height measurements. We have been developing systems and protocols involving multifunctional ground control points (GCPs) in order to produce crop phenotypic data that are as repeatable as possible. These multifunctional GCPs aid not only geometric correction but also image calibration of reflectance and plant height. The GCPs have known spectral-reflectance characteristics that enable reference-based digital number-to-reflectance calibration of multispectral images, and known platform heights that enable reference-based conversion of digital surface models to height maps. Results show that using these GCPs for reflectance and plant-height calibration significantly reduces error in reflectance (ca. 50% reduction) and plant-height (ca. 20% reduction) measurements.
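The reference-based digital number (DN)-to-reflectance calibration described above is commonly implemented as an empirical line fit against targets of known reflectance. A minimal sketch, assuming a linear sensor response; the function names and panel values below are illustrative, not taken from the study:

```python
import numpy as np

def empirical_line(panel_dns, panel_reflectances):
    """Fit a linear DN -> reflectance model from calibration targets
    (e.g., GCPs with known spectral-reflectance characteristics)."""
    dn = np.asarray(panel_dns, dtype=float)
    refl = np.asarray(panel_reflectances, dtype=float)
    # Least-squares fit: reflectance = gain * DN + offset
    gain, offset = np.polyfit(dn, refl, 1)
    return gain, offset

def calibrate(dn_image, gain, offset):
    """Convert raw digital numbers to surface reflectance."""
    return gain * np.asarray(dn_image, dtype=float) + offset
```

In practice one such fit is computed per spectral band, since gain and offset differ across bands.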
Unmanned aerial systems (UAS) are becoming a popular choice for acquiring fine-spatial-resolution images for precision agriculture applications. Compared to other remote sensing platforms, UAS can acquire image data at relatively lower cost, with finer spatial resolution, and on a more flexible schedule. In recent years, multispectral sensors that capture near-infrared (NIR) and red-edge reflectance have been successfully integrated with UAS, offering more versatility in soil and field analysis, crop monitoring, and plant health assessment. In this study, we investigate the capability of a UAS-based crop monitoring system to determine best management practices for three tomato varieties, comparing different planting dates, plant densities, use of plastic mulch, and fertilization rates. The field and UAS data were acquired during Spring 2016, 2017, and 2018 in Weslaco, TX. To compare the effects of the various treatments, physiological parameters and vegetation indices (canopy cover, canopy height, canopy volume, and Excess Greenness) were extracted from red-green-blue (RGB) data and correlated with final yield to identify the practices and treatments that maximize tomato yield. In Spring 2016, we observed the highest yield from the early March planting date with white plastic mulch; the results also indicated that the higher-yielding variety exhibited slower canopy decay toward the end of the season. In Spring 2017, yield differences among the three tomato varieties depended on fertilization rate: DRP-8551 performed better at the low nitrogen level, Mykonos performed better at the two higher nitrogen rates, and TAM-Hot-Ty showed no significant difference among treatments. Finally, in Spring 2018, the early March planting again produced the best yields, and varieties that slowed canopy decay toward the end of the season performed better.
No significant difference was observed among plant densities. We expect that the proposed system can be used to collect reliable data and to develop variety- and environment-specific management practices that increase marketable yield and reduce production cost.
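The Excess Greenness index used above is typically computed from chromatic (sum-normalized) RGB coordinates, with canopy cover then obtained by thresholding; the exact threshold and normalization in the study are not stated, so the values below are a minimal illustrative sketch:

```python
import numpy as np

def excess_greenness(rgb):
    """Excess Green index, ExG = 2g - r - b, on chromatic coordinates
    (each channel divided by the per-pixel R+G+B sum)."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2.0 * g - r - b

def canopy_cover(exg, threshold=0.1):
    """Fraction of pixels classified as vegetation (threshold is an
    illustrative assumption, not the study's calibrated value)."""
    return float((exg > threshold).mean())
```

Canopy cover per plot then comes from applying `canopy_cover` to the pixels inside each plot boundary on each flight date.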
Tomato production faces constant pressure from biotic and abiotic stresses that can cause significant losses in production and fruit quality. In tropical and subtropical climates, the main disease affecting tomato production is caused by Tomato yellow leaf curl virus (TYLCV), a virus that is vectored by the silverleaf whitefly (Bemisia tabaci). The main method of control relies on insecticide sprays against the vector to avoid the spread of the disease. Detecting and spatially locating infected plants is required to prevent and control epidemic outbreaks of TYLCV. In this study, we aim to develop an unmanned aircraft system (UAS) based TYLCV detection algorithm that can identify affected plants and provide physiological information about them. Multi-temporal phenotypic attributes (e.g., canopy height, canopy cover, and canopy volume) and vegetation indices, including the normalized difference vegetation index (NDVI), soil-adjusted vegetation index (SAVI), and excess green index (ExG), were extracted from the UAS image data. The field experiment was conducted at the Texas A&M AgriLife Research and Extension Center at Weslaco, TX. A total of 16 tomato hybrids with different levels of TYLCV resistance were inoculated with viruliferous insects and randomly transplanted in an open field in triplicate plots containing 4 plants each. One control plot of non-inoculated plants was also planted for each tomato hybrid for validation. Machine learning techniques based on artificial neural networks were used to detect TYLCV symptoms from the UAS-derived parameters, and all plants were tested by polymerase chain reaction (PCR) using specific primers to confirm TYLCV infection. To evaluate how early and how accurately the algorithm can detect TYLCV symptoms in tomato plants, various detection models were developed by varying the period of input UAS data.
We expect the proposed system to be a useful framework for monitoring TYLCV outbreaks at large scales, giving growers the ability to determine the best time and location to start vector control, and generating time-series physiological data for a better understanding of disease progression.
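The ANN-based symptom detection can be sketched as a small one-hidden-layer network trained by gradient descent on plot-level features. This is a minimal stand-in, not the study's architecture; the feature columns (e.g., canopy cover and canopy height) and all values are illustrative assumptions:

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.5, epochs=3000, seed=0):
    """Train a one-hidden-layer sigmoid network for binary detection.

    X: plot-level feature matrix (e.g., canopy cover, canopy height).
    y: labels, 0 = healthy, 1 = TYLCV-symptomatic.
    Returns a function mapping features to infection probability.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                 # hidden activations
        p = sig(h @ W2 + b2)[:, 0]           # predicted probability
        dp = ((p - y) / len(y))[:, None]     # binary cross-entropy gradient
        dW2, db2 = h.T @ dp, dp.sum(axis=0)
        dh = (dp @ W2.T) * h * (1.0 - h)     # backprop through hidden layer
        dW1, db1 = X.T @ dh, dh.sum(axis=0)
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    return lambda Xn: sig(sig(np.asarray(Xn) @ W1 + b1) @ W2 + b2)[:, 0]
```

In the study's setting, the input period of UAS flights would be varied and the PCR results used as the ground-truth labels.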
Recent years have witnessed enormous growth in unmanned aircraft system (UAS) and sensor technology, making it possible to collect data at high spatial and temporal resolution over crops throughout the growing season. The objective of this research is to develop a novel machine learning framework for marketable tomato yield estimation using multi-source, spatio-temporal remote sensing data collected from UAS. The proposed machine learning model is based on an artificial neural network (ANN); it takes UAS-based multi-temporal features such as canopy cover, canopy height, canopy volume, and the Excess Greenness Index, along with weather information such as humidity, precipitation, temperature, solar radiation, and crop evapotranspiration (ETc), as input and predicts the corresponding marketable yield. The predicted yield is validated against the actual harvested yield. Breeders may be able to use the predicted yield as a parameter for genotype selection, allowing them not only to increase their experiment size for faster genotype selection but also to make efficient, informed decisions on the best-performing genotypes. Moreover, yield prediction maps can be used to develop within-field management zones to optimize field management practices.
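Assembling the model input can be sketched as concatenating per-flight canopy features with season weather summaries into one vector per plot; an ordinary least-squares fit stands in here for the ANN regressor, and all names and values are illustrative assumptions:

```python
import numpy as np

def build_features(canopy_series, weather_series):
    """Concatenate multi-temporal canopy features (one row per flight)
    with weather summaries into a single input vector per plot."""
    return np.concatenate([np.ravel(canopy_series), np.ravel(weather_series)])

def fit_yield_model(X, y):
    """Least-squares stand-in for the ANN: yield ~ X @ w + b."""
    A = np.hstack([X, np.ones((len(X), 1))])  # append intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_yield(X, coef):
    """Predict marketable yield for new plots."""
    A = np.hstack([X, np.ones((len(X), 1))])
    return A @ coef
```

The predicted values would then be validated against harvested yield, as described above.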
Unmanned aerial systems (UAS) have become an important tool in recent years for precision agriculture and high-throughput phenotyping (HTP). Attributes of the sorghum panicle, in particular, are critical information for assessing overall crop condition, irrigation, and yield estimation. In this study, we propose a method to extract phenotypes of sorghum panicles using UAS data. UAS data were acquired with 85% overlap at an altitude of 10 m above ground to generate very high-resolution data. An orthomosaic, a digital surface model (DSM), and a 3D point cloud were generated by applying the structure from motion (SfM) algorithm to the UAS imagery. Sorghum panicles were identified from the orthomosaic and DSM using color ratios and circle fitting. A cylinder-fitting method and a disk-stacking method were proposed to estimate panicle volume. Yield prediction models were generated between field-measured yield data and UAS-measured attributes of sorghum panicles.
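The disk-stacking volume estimate can be sketched as slicing the panicle point cloud into thin horizontal disks and summing their cylinder volumes. This is a minimal illustration, not the study's implementation; the slice thickness and radius rule (maximum horizontal distance from the slice centroid) are assumptions:

```python
import numpy as np

def disk_stack_volume(points, slice_height=0.01):
    """Estimate panicle volume from an (N, 3) point cloud by stacking
    thin horizontal disks fitted to each elevation slice."""
    pts = np.asarray(points, dtype=float)
    z = pts[:, 2]
    volume = 0.0
    for z0 in np.arange(z.min(), z.max(), slice_height):
        sl = pts[(z >= z0) & (z < z0 + slice_height)]
        if len(sl) < 3:
            continue  # skip slices too sparse to define a disk
        cx, cy = sl[:, 0].mean(), sl[:, 1].mean()  # slice centroid
        radius = np.hypot(sl[:, 0] - cx, sl[:, 1] - cy).max()
        volume += np.pi * radius ** 2 * slice_height
    return volume
```

A cylinder-fitting variant would instead fit one radius to the full point cloud and multiply by total panicle height.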
Field-based high-throughput phenotyping is a bottleneck to future breeding advances. The use of remote sensing with unmanned aerial vehicles (UAVs) can change the way agricultural research operates by increasing the spatiotemporal resolution of data collection for monitoring plant growth. A fixed-wing UAV (Tuffwing) was operated to collect images of a sorghum breeding research field with 70% overlap at an altitude of 120 m. The study site was located at Texas A&M AgriLife Research's Brazos Bottom research farm near College Station, Texas, USA. Relatively high-resolution (>2.7 cm/pixel) images were collected from May to July 2017 over 880 sorghum plots (including six treatments with four replications). The collected images were mosaicked and processed with structure from motion (SfM), which involves construction of a digital surface model (DSM) by interpolation of 3D point clouds. Maximum plant height for each genotype (plot) was estimated from the DSM, and height calibration was implemented using aerially measured values of ground control points with known heights. Correlations and RMSE values between actual and estimated height were computed across all genotypes and flight dates. Results indicate that the proposed height calibration method has potential to improve the accuracy of plant height estimation from UAVs.
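The GCP-based height calibration can be sketched as a linear fit mapping DSM-derived heights at the GCPs to their known heights, with the resulting correction then applied to per-plot maximum heights. A minimal sketch; the function names and synthetic gain/offset values are illustrative assumptions:

```python
import numpy as np

def fit_height_calibration(dsm_gcp_heights, known_gcp_heights):
    """Least-squares line mapping DSM-derived GCP heights to their
    known reference heights: known ~ gain * dsm + offset."""
    gain, offset = np.polyfit(dsm_gcp_heights, known_gcp_heights, 1)
    return gain, offset

def calibrate_heights(dsm_plot_heights, gain, offset):
    """Apply the GCP-derived correction to per-plot maximum heights."""
    return gain * np.asarray(dsm_plot_heights, dtype=float) + offset
```

Accuracy is then assessed per flight date by correlating calibrated heights with ground-measured plant heights.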
The objective of this research is to develop a novel machine learning framework for automatic cotton genotype selection using multi-source, spatio-temporal remote sensing data collected from an unmanned aerial system (UAS). The proposed machine learning model is based on an artificial neural network (ANN); it takes UAS-based multi-temporal features such as canopy cover, canopy height, canopy volume, the Normalized Difference Vegetation Index (NDVI), and the Excess Greenness Index, along with non-temporal features such as cotton boll count, boll size, and boll volume, as input and predicts the corresponding yield. Testing the performance of our model against the actual yield resulted in an R² value of approximately 0.9. The proposed cotton genotype selection model is expected to revolutionize cotton breeding research by providing valuable tools to cotton breeders, so that they can not only increase their experiment size for faster genotype selection but also make efficient, informed decisions on the best-performing genotypes.
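The reported fit (R² of approximately 0.9) is the coefficient of determination between predicted and harvested yield, which can be computed as:

```python
import numpy as np

def r_squared(actual, predicted):
    """Coefficient of determination between harvested and predicted yield:
    R^2 = 1 - SS_res / SS_tot."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((actual - predicted) ** 2)        # residual sum of squares
    ss_tot = np.sum((actual - actual.mean()) ** 2)    # total sum of squares
    return 1.0 - ss_res / ss_tot
```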
Land leveling is the initial step for increasing irrigation efficiency in surface irrigation systems. The objective of this paper was to evaluate the potential of an unmanned aerial system (UAS) equipped with a digital camera to map ground elevations of a grower's field and to compare them with field measurements. A secondary objective was to use UAS data to obtain a digital terrain model before and after land leveling. UAS data were used to generate orthomosaic images and three-dimensional (3-D) point cloud data by applying the structure from motion algorithm to the images. Ground control points (GCPs) were established around the study area and surveyed with a survey-grade dual-frequency GPS unit for accurate georeferencing of the geospatial data products. A digital surface model (DSM) was then generated from the 3-D point cloud data before and after laser leveling to determine the topography in each state. The UAS-derived DSM was compared with terrain elevation measurements acquired with land surveying equipment for validation. Although an error of 0.3%, or a root mean square error of 0.11 m, was observed between the UAS-derived and ground-measured elevation data, the results indicated that UAS can be an efficient method for determining terrain elevation with acceptable accuracy when there are no plants on the ground, and that it can be used to assess the performance of a land leveling project.
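The validation step compares UAS-derived DSM elevations with surveyed elevations at checkpoints; the RMSE reported above can be computed as follows (checkpoint values in the usage example are synthetic illustrations):

```python
import numpy as np

def elevation_rmse(uas_elevations, surveyed_elevations):
    """Root mean square error between UAS-derived DSM elevations and
    ground-surveyed elevations at the same checkpoint locations."""
    diff = (np.asarray(uas_elevations, dtype=float)
            - np.asarray(surveyed_elevations, dtype=float))
    return float(np.sqrt(np.mean(diff ** 2)))
```

A per-checkpoint difference map built from the same residuals also shows where cut-and-fill work remains after leveling.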