Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100801 (2019) https://doi.org/10.1117/12.2537997
This PDF file contains the front matter associated with SPIE Proceedings Volume 11008 including the Title Page, Copyright information, Table of Contents, Introduction, and Conference Committee listing.
Advanced Sensors for Agriculture Optimization and Phenotyping
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100802 (2019) https://doi.org/10.1117/12.2518958
Microbolometer thermal cameras in UAVs and manned aircraft allow for the acquisition of high-resolution temperature data, which, along with optical reflectance, contributes to monitoring and modeling of agricultural and natural environments. Furthermore, these temperature measurements have facilitated the development of advanced models of crop water stress and evapotranspiration in precision agriculture and of heat flux exchanges in small river streams and corridors. Microbolometer cameras capture thermal information at blackbody or radiometric settings (narrowband emissivity set to unity). While it is customary for the modeler to use assumed emissivity values (e.g., 0.96–0.99 for agricultural and environmental settings), some applications (e.g., the Vegetation Health Index) and complex models such as energy-balance-based models (e.g., evapotranspiration) could benefit from spatial estimates of surface emissivity for true (kinetic) temperature mapping. In that regard, this work presents an analysis of the spectral characteristics of a microbolometer camera with regard to emissivity, along with a methodology to infer thermal emissivity spatially based on those spectral characteristics. For this work, the MODIS UCSB Emissivity Library, NASA HyTES hyperspectral emissivity data, and Landsat and Utah State University AggieAir UAV surface reflectance products are employed. The methodology is applied to a commercial vineyard located in Lodi, California, where HyTES, Landsat, and AggieAir UAV spatial data were collected in the 2014 growing season. An assessment of the microbolometer spectral response with regard to emissivity, and of emissivity modeling performance for the area of study, is presented and discussed.
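The kinetic-temperature correction alluded to in the abstract can be sketched as a simplified broadband Stefan-Boltzmann inversion. This is an illustrative reduction, not the paper's actual method: it ignores atmospheric transmittance and reflected sky radiance, which a full treatment would include.

```python
def kinetic_temperature(t_rad_kelvin, emissivity):
    """Convert an at-sensor radiometric temperature (recorded with
    emissivity assumed equal to 1) to kinetic (true surface) temperature.

    Simplified broadband Stefan-Boltzmann inversion:
        sigma * T_rad^4 = emissivity * sigma * T_kin^4
        =>  T_kin = T_rad / emissivity^(1/4)
    Atmospheric and reflected-sky terms are deliberately ignored here.
    """
    return t_rad_kelvin / emissivity ** 0.25
```

Because emissivity is below unity for real surfaces, the kinetic temperature is always slightly warmer than the radiometric reading, which is why a spatial emissivity map matters for stress and flux models.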
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100803 (2019) https://doi.org/10.1117/12.2519216
We introduce a method to calculate evapotranspiration (ET) for individual plots in agricultural fields by applying the TSEB (Two-Source Energy Balance) model to high-resolution thermal data from a UAS (unmanned aerial system). The model was developed for satellite remote sensing, which has coarser spatial and temporal resolution; with the emergence of UAS remote sensing, it needs to be adapted to significantly higher-resolution imagery. The average resolution of our thermal dataset is about 5 cm, which means we have multiple temperature measurements for a single plant, as opposed to satellite imagery, where a single pixel often covers an entire field. At this resolution, soil also contributes to the overall temperature of certain pixels. A new algorithm is developed to classify pixels into three categories: soil, plant, and a mixture of soil and plant. Temperature distributions of plants are established, and with other inputs such as solar radiation, wind speed, and plant height, we estimate ET distributions. Distributions of ET are acquired for the targeted plots in multiple images and are evaluated against stomatal conductance measurements from a steady-state porometer.
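The three-way pixel classification described above can be sketched with simple vegetation-index thresholds. The threshold values below are illustrative placeholders, not taken from the paper, which does not state its classification criteria in the abstract.

```python
import numpy as np

def classify_pixels(ndvi, soil_max=0.3, plant_min=0.7):
    """Label each pixel 0 = soil, 1 = mixed soil/plant, 2 = plant,
    using NDVI thresholds (values here are illustrative only)."""
    labels = np.full(ndvi.shape, 1, dtype=np.uint8)  # default: mixed
    labels[ndvi < soil_max] = 0                      # bare soil
    labels[ndvi >= plant_min] = 2                    # pure canopy
    return labels
```

With ~5 cm pixels, the "mixed" class captures canopy edges where soil and leaf radiance blend within one pixel, which is exactly the case that coarser satellite-era TSEB never had to resolve.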
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100804 (2019) https://doi.org/10.1117/12.2519685
Tests of the most recent version of the two-source energy balance model have demonstrated that canopy and soil temperatures can be retrieved from high-resolution thermal imagery captured by an unmanned aerial vehicle (UAV). This work has assumed a linear relationship between vegetation indices (VIs) and radiometric temperature within a square grid (i.e., 3.6 m x 3.6 m) that is coarser than the resolution of the imagery acquired by the UAV. In this method, with visible, near-infrared (VNIR), and thermal bands available at the same high resolution, a linear fit can be obtained over the pixels located in a grid cell, where the x-axis is a vegetation index (VI) and the y-axis is radiometric temperature. Next, with an accurate VI threshold that separates soil and vegetation pixels from one another, the corresponding soil and vegetation temperatures can be extracted from the linear equation. Although this method is simpler than other approaches, such as TSEB with Priestley-Taylor (TSEB-PT), it could be sensitive to VIs and to the factors that affect VIs, such as shadows. Recent studies have revealed that, on average, the values of VIs such as the normalized difference vegetation index (NDVI) and leaf area index (LAI) are greater in sunlit areas than in shaded areas. This means that including or compensating for shadows will affect the parameters (slope and bias) of the linear relationship between radiometric temperature and VI, as well as the thresholds that separate soil and vegetation pixels. This study evaluates the impact of shadows on the retrieval of canopy and soil temperatures from four UAV images before and after applying shadow compensation techniques. The retrieved temperatures, using the TSEB-2T approach, both before and after shadow correction, are compared to the average temperature values for soil and canopy in each grid cell.
The imagery was acquired by the Utah State University AggieAir UAV system over a commercial vineyard in California as part of the USDA Agricultural Research Service Grape Remote sensing Atmospheric Profile and Evapotranspiration Experiment (GRAPEX) Program from 2014 to 2016. The results of this study show when it is necessary to employ shadow compensation methods to retrieve vegetation and soil temperature directly.
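The per-grid-cell linear fit at the core of the TSEB-2T idea can be sketched as follows. The VI threshold values and the synthetic data are placeholders for illustration; the paper's actual thresholds and fitting details are not given in the abstract.

```python
import numpy as np

def soil_canopy_temps(vi, t_rad, vi_soil, vi_veg):
    """Fit T_rad = a*VI + b over the pixels of one grid cell, then read
    off soil and canopy temperatures by evaluating the line at the VI
    values taken to represent pure soil and pure vegetation."""
    a, b = np.polyfit(vi, t_rad, 1)   # slope and bias of the VI-T line
    t_soil = a * vi_soil + b
    t_canopy = a * vi_veg + b
    return t_soil, t_canopy
```

Because the slope and bias come straight from the VI values, any systematic VI shift in shaded pixels (the paper's central concern) propagates directly into both retrieved temperatures.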
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100805 (2019) https://doi.org/10.1117/12.2519378
Thermal image quality is critical to accurately quantifying spatial and temporal growth and stress patterns of field crops. Image quality can be impacted by several factors, including environment, flying altitude, and camera focal length. Oftentimes, thermal sensor selection is based on price or on a sensor already owned. Metrics are available to select the flight altitude for a given thermal sensor and desired ground resolution; however, no study has quantified the relative differences in image quality and in the efficiency of generating a thermal orthomosaic. Therefore, this study was conducted with the goal of comparing the accuracy of canopy temperature quantification and assessing the quality of the thermal orthomosaic when using thermal sensors of different focal lengths and acquiring imagery at varying sUAS flying altitudes. Three thermal infrared cameras were selected, with focal lengths of 9 mm, 13 mm, and 19 mm. All three cameras were flown at altitudes of 20 m, 50 m, and 80 m to collect aerial imagery of a 7,000 m2 soybean field. The cameras were mounted on a rotary quadcopter, and all flights were conducted at a 3 m/s flying speed with a 1-second shutter trigger interval. A ground reference system provided ground truth data for thermometric transformations. The imagery was compared to assess differences in the number of images collected, the percentage overlap required for the 1-second shutter trigger interval, the quality of the orthomosaic, and the accuracy of canopy temperatures. Results show that a 13 mm focal length and 50 m altitude result in a finer-resolution orthomosaic, and that canopy temperatures were quantified accurately regardless of altitude and focal length.
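The altitude/focal-length trade-off on ground resolution follows the standard ground-sample-distance relation, which can be written directly. The pixel-pitch value used below is a typical microbolometer figure assumed for illustration, not a specification from the paper.

```python
def ground_sample_distance(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground sample distance in meters per pixel:
        GSD = H * p / f
    where H is flight altitude, p is detector pixel pitch, and f is
    lens focal length (all converted to meters)."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)
```

For example, with a 17 um pitch (common for uncooled microbolometers), a 13 mm lens at 50 m altitude yields a GSD of roughly 6.5 cm per pixel; raising the altitude or shortening the lens coarsens it proportionally.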
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100806 (2019) https://doi.org/10.1117/12.2518822
This paper compares multispectral and hyperspectral data collected from UAVs for detecting citrus nitrogen and water stresses. UAVs equipped with multispectral and hyperspectral sensors were flown over citrus trees at Cal Poly Pomona’s Spadra Farm. The multispectral and/or hyperspectral data are used to determine the normalized difference vegetation index (NDVI), the water band index (WBI), and other vegetation indices. These indices are compared with proximal sensor data from a handheld spectroradiometer, a water potential meter, and a chlorophyll meter. Correlations of the multispectral and hyperspectral data with the proximal sensor data are shown.
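The two indices named above have standard published definitions that can be written directly. The band choices (NIR/red for NDVI, 900 nm and 970 nm reflectance for WBI) follow common usage in the literature, not necessarily this paper's specific sensor bands.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red).
    Ranges from -1 to 1; dense green vegetation scores high."""
    return (nir - red) / (nir + red)

def wbi(r900, r970):
    """Water band index: ratio of reflectance at ~900 nm to ~970 nm.
    The 970 nm band sits in a water absorption feature, so well-watered
    canopies push the ratio above 1."""
    return r900 / r970
```

Both indices are band ratios, which is why they can be computed from either the broadband multispectral sensor (NDVI) or narrow hyperspectral bands (WBI) and then correlated against the proximal measurements.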
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100807 (2019) https://doi.org/10.1117/12.2518660
The purpose of this study is to examine water stress, as determined by stem water potential, in two blocks of a commercial California vineyard growing two different varietals: Chenin Blanc and Cabernet. Hyperspectral reflectance data were collected from a system capturing spectra from 400 nm to 1,000 nm at a spectral resolution of 4 nm. The sensor was carried at an altitude of 75 meters above ground level by a helicopter UAS, producing a ground resolution of 3 cm. Sixty-six standard vegetative indices from the literature were examined to see how well they predicted stem water potential using simple linear regression. The five vegetative indices most related to stem water potential in terms of the coefficient of determination were the Photochemical Reflectance Index, the Green Red Ratio Index, the Ratio Vegetation Index, the Simple Ratio Index, and the Greenness Index; these indices had a coefficient of determination around 0.3. Nearly a third of the vegetative indices had a coefficient of determination less than 0.1, including the Water Band Index, the Water Index, and the Floating Position Water Band Index. Correlations of vegetative indices with stem water potential were not improved by examining the two varietals separately.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100808 (2019) https://doi.org/10.1117/12.2518868
Timely and accurate recognition of health conditions in crops helps in performing the necessary treatment for the plants. Automatically localizing these conditions in an image helps in estimating their spread and severity, thus saving precious resources. Automated disease detection involving both recognition and localization helps in identifying multiple diseases in one image and is a small step toward robotic farm surveying and spraying. Recent developments in deep neural networks have drastically improved object localization and identification accuracy. We leverage neural-network-based methods to perform accurate and fast detection of diseases and pests in tea leaves. With the goal of identifying an accurate yet efficient detector in terms of speed and memory, we evaluate various feature extraction networks and detection architectures. The images used to train and evaluate the models vary in resolution, quality, brightness, and focus, as they were captured with mobile phones with different cameras through a participatory sensing approach. The experimental results show that the detection system effectively identifies and locates health conditions on tea leaves against a complex background and under occlusion. We evaluated YOLO-based detection methods with different feature extraction architectures; detection using YOLOv3 achieves an mAP of about 86% at 50% IOU while keeping the system usable in real time.
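The 50% IOU criterion used in the mAP evaluation is the standard intersection-over-union overlap test between a predicted box and a ground-truth box, which can be sketched as:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as
    (x1, y1, x2, y2). A detection typically counts as a true positive
    when IoU with a ground-truth box is >= 0.5."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

mAP at 50% IOU then averages, over classes, the area under each precision-recall curve built from detections that pass this overlap test.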
Systems for Producing High-Quality Agricultural Remote-Sensing Data with sUAS
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100809 (2019) https://doi.org/10.1117/12.2519164
Small unmanned aerial systems (sUAS) for civil remote sensing applications are becoming ubiquitous for scientific insight into targets such as micro-agriculture, invasive species monitoring, and environmental survey applications. For the upcoming generation of beyond-visual-line-of-sight (BVLOS) capable sUAS, flexible, robust, and high-performance payload management and data collection software is required. In this paper, a high-rate, accurate, multispectral, modular payload control software (“AggieCap3”) for such BVLOS sUAS is described, including system requirements, architecture, implementation, and results from both fixed- and rotary-wing craft, demonstrating the system’s applicability for civilian scientific aerial remote sensing.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080A (2019) https://doi.org/10.1117/12.2518541
Commercial off-the-shelf systems of UAVs and sensors are touted as being able to collect remote-sensing data on crops, including spectral reflectance and plant height. Historically, a great deal of effort has gone into quantifying and reducing the error levels in the geometry of UAV-based orthomosaics, but little effort has gone into quantifying and reducing the errors in reflectance and plant height. We have been developing systems and protocols involving multifunctional ground-control points (GCPs) in order to produce crop phenotypic data that are as repeatable as possible. These multifunctional GCPs aid not only geometric correction but also image calibration for reflectance and plant height. The GCPs have known spectral-reflectance characteristics that enable reference-based digital-number-to-reflectance calibration of multispectral images. They also have known platform heights that enable reference-based calibration of digital surface models to height maps. Results show that using these GCPs for reflectance and plant-height calibrations significantly reduces the error levels in reflectance (ca. 50% reduction) and plant-height (ca. 20% reduction) measurements.
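Reference-based digital-number-to-reflectance calibration of the kind described is commonly implemented as an empirical-line fit against panels of known reflectance. The sketch below shows that generic technique, not the authors' exact procedure:

```python
import numpy as np

def empirical_line(dn, panel_dns, panel_reflectances):
    """Empirical-line calibration for one band: fit
        reflectance = gain * DN + offset
    from GCP panels whose reflectances are known, then apply the fit
    to the whole band's digital numbers."""
    gain, offset = np.polyfit(panel_dns, panel_reflectances, 1)
    return gain * np.asarray(dn) + offset
```

The same two-point idea carries over to the height calibration: known GCP platform heights anchor a linear mapping from the photogrammetric digital surface model to absolute plant heights.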
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080B (2019) https://doi.org/10.1117/12.2518260
Ground control points (GCPs) are critical for agricultural applications that require geographic registration and radiometric and height calibration of images when overlaying images collected at different times. However, with conventional GCPs it is time-consuming and labor-intensive to measure, distribute, and collect all the GCPs around a large field. An automatic mobile GCP that collaborates with the UAV during flight, allowing the mobile GCP to be captured in the images under a proposed cooperation strategy, is an attractive replacement for conventional GCPs. To support plant phenotyping across a field with multiple types of imaging sensors (RGB, multispectral, and thermal), two radiometric calibration references and two temperature calibration references were installed on top of the mobile GCP. The mobile GCP is a four-wheel, auto-guided platform with differential-speed steering control of the front wheels, equipped with two RTK-GPS units for position determination, a navigation computer for path planning and tracking, and an integrated driving controller for steering and traveling. The traveling angles and velocities of the wheels at each location are generated by a path-tracking algorithm that follows a driving map predefined from the UAV’s flight plan. Notably, all positions of the mobile GCP can be recorded on the UAV during the flight for future image mosaicking. The automatic mobile GCP enables reliable recognition and prediction of UAV behavior and activities in agricultural remote sensing and has the potential to improve the efficiency of data collection in the field.
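The differential-speed steering mentioned above reduces, in its simplest kinematic form, to splitting a commanded forward speed and turn rate across the two drive wheels. This is a generic differential-drive sketch, not the platform's actual controller:

```python
def wheel_speeds(v, omega, track_width):
    """Left and right wheel speeds (m/s) for a differential-speed
    platform commanded with forward speed v (m/s) and turn rate
    omega (rad/s, positive = counterclockwise); track_width is the
    lateral distance between the wheels (m)."""
    left = v - omega * track_width / 2.0
    right = v + omega * track_width / 2.0
    return left, right
```

A path-tracking algorithm (pure pursuit is one common choice) produces the (v, omega) pair at each waypoint of the predefined driving map, and the driving controller converts it into these wheel speeds.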
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080D (2019) https://doi.org/10.1117/12.2518532
In breeding and genetics programs, thousands of small plots, each a few square meters in size and containing a unique genetic entry, are used to evaluate huge numbers of candidates and large mapping populations. Low-altitude remote sensing with unmanned aerial systems (UAS) can generate high geospatial-resolution measurements of plants and enable high temporal-resolution measurements through multiple crop growth stages. However, robustly and automatically identifying individual plots in aerial images remains a key challenge in high-throughput phenotyping (HTP) using UAS. In this case study, we captured very high-resolution video clips of wheat canopies with a UAS at low altitude and propose an image processing pipeline for identifying individual plots in video frames. Preliminary results indicate that the method can greatly accelerate the process of linking genotypes to individual plot images and can be fully automated. This research provides a proof of concept and has broad implications for novel phenomics applications of UAS that scale to tens of thousands of plots in crop breeding and genetic studies.
Control and Integration of UAS Sensing Systems for Agricultural Optimization and Phenotyping
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080E (2019) https://doi.org/10.1117/12.2519190
In recent years, phenotyping of various crops has gained widespread popularity due to its ability to recognize variations in the effects of different genotypes of a particular crop in terms of growth, yield, biomass, and so on. Such an application requires extensive data collection and analysis at high spatial and temporal resolution, which can be attained using multiple sensors onboard unmanned aerial vehicles (UAVs). In this study, we focus on harnessing information from a variety of sensors: RGB cameras, LiDAR units, and push-broom hyperspectral sensors, both short-wave infrared (SWIR) and visible/near-infrared (VNIR). The major challenge to be overcome is ensuring accurate integration of information captured across several days from the different sensor modalities. Moreover, the payload constraint of UAVs prevents mounting all the sensors simultaneously during a single flight mission, entailing data capture from different sensors mounted on separate platforms that are flown individually over the agricultural field of interest. The first step toward integrating the different data modalities is the generation of georeferenced products from each flight mission, accomplished with the help of Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS) units mounted on the UAVs and time-synchronized with the onboard LiDAR units, cameras, and/or hyperspectral sensors. Furthermore, accurate georeferencing is achieved by developing robust calibration approaches dedicated to accurate estimation of the mounting parameters of the involved sensors. Finally, geometric and spectral characteristics, such as canopy cover and leaf count, derived from the different sensors are used to devise a model for analyzing the phenotypic traits of crops. Preliminary results indicate that the proposed calibration techniques can attain an accuracy of up to 3 cm.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080G (2019) https://doi.org/10.1117/12.2518973
In recent years, unmanned aerial vehicles (UAVs), also known as drones, have become increasingly attractive due to their capacity for rapid deployment and their wide range of applications in many real-world scenarios. Among these fields of application, the use of drones in precision agriculture has recently become highly relevant to the research community. Agricultural studies address different aspects such as livestock monitoring, crops, and water levels. Drones are able to perform these tasks thanks to a series of sensors and actuators equipped on board: onboard cameras, through appropriate algorithms, allow the gathering of detailed information about plant health, and if a health problem is detected, the drone can intervene precisely on the specific problem. The contribution of this work is an analysis of communication protocols applied to the problem of controlling a fleet of drones against parasite attacks on crops, with the aim of measuring the performance and costs of the different approaches. In particular, the various approaches address exploring the area in the shortest possible time while avoiding having the same area explored by multiple drones, discovering the parasites, and preventing their proliferation by spraying the right quantity of pesticide. The drones, being equipped with limited quantities of both fuel and pesticide, can ask other drones for help to complete the elimination of the parasites. To address these last issues, several recruitment protocols have been tested, with a focus on bio-inspired ones.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080H (2019) https://doi.org/10.1117/12.2518977
In the last few years, rising interest in the field of UAVs (unmanned aerial vehicles) has created new opportunities to develop applications and services in several domains, among the most important of which are smart farming and precision agriculture. Thanks to the flexibility and versatility of these systems, several jobs can be accomplished with the same device. In this application context, we propose a new system composed of a master smart station, several slave satellite stations, and a UAV fleet. It is developed on the basis of the machine-to-machine (M2M) communication paradigm, in which devices cooperate autonomously to perform assigned tasks. In particular, we designed a smart station responsible for managing the fleet by assigning tasks and scheduling activities, and for managing the satellite stations that assist the fleet during its operations. These stations store energy while idle: we propose an all-in-one hardware solution based on energy harvesting approaches implemented in the smart stations, and the stations supply energy on demand to UAVs that request a recharge during their missions. Finally, we designed a customized M2M protocol based on the MQTT framework to ensure communication between devices and to coordinate operations.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080J (2019) https://doi.org/10.1117/12.2520248
Current technologies employed for the use of small unmanned aerial systems (sUAS), or “drones”, for remote sensing (RS) activities that support the information needs of agricultural operations are expensive, provide relatively small geographic coverage, and typically produce data of limited scientific quality. Research currently underway will someday yield sense-and-avoid technologies that enable sUAS to detect and safely avoid potential collisions with conventional and UAS air traffic. Other ongoing research will result in a nationwide air traffic management system that fully integrates UAS flights into US airspace. Together, these two research products will allow safe, long-duration sUAS flights beyond visual line of sight (BVLOS). As BVLOS technologies come online, the per-acre cost of sUAS-based RS will likely decrease, which, in turn, will spur the development of sUAS that can fly much farther and cover much more area than platforms now in common use. A decrease in per-acre RS costs will likely result in increased demand for RS technology in support of agricultural operations. However, adoption of BVLOS technologies that grant greater access to US airspace for agricultural RS applications will challenge commonly used sUAS RS approaches that do not scale well as geographical coverage and flight times increase. Of particular concern are radiometric calibration of imagery; the cost and time required to generate high-quality orthomosaics; point-cloud algorithms; the effects of temporal changes in surface spectral and thermal response; and the delivery of time-sensitive image products to the grower. This paper examines the implications of these future challenges for sUAS RS in agriculture and offers insights into solutions that are topics of research today.
In addition to radiometric calibration and orthorectification, other aspects, such as field protocols for the design and execution of flights and the need for additional onboard sensors, are discussed.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080K (2019) https://doi.org/10.1117/12.2519801
The modern farm is a technological marvel, from smart tractors to genetically modified organisms (GMOs), along with chemical pesticides and fertilizers. Farms today have continuously increased production by utilizing these various techniques. Many farms on the east coast of North America grow dent (field) corn while rotating crops among soybeans of various types and winter wheat; these crops have become symbiotic in nature due to their specific soil-nutrient needs and the practice of no-till farming. More recently, schools with farm programs have started researching the use of drone technologies and multispectral analysis as a means to reduce chemical usage, thereby saving farmers annual chemical costs. This paper investigates the use of drones in capstone projects for undergraduate engineering and computer science programs. Undergraduate capstone projects usually require a design-and-build element to satisfy ABET accreditation requirements; therefore, the students needed to design and build an airframe capable of surveying farms with a multispectral camera. In the course of the aircraft design process, it was discovered that the students needed a broader understanding of federal regulations and experimentation, as well as a robust understanding of how the drones and data would be used to benefit a typical farm. In addition, we look at the results obtained and discuss the problems associated with making the data and analysis accessible to the farmers who participated in our study. In the process we also discovered other potential uses for the images we created.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080L (2019) https://doi.org/10.1117/12.2519694
Theoretically, the appearance of shadows in aerial imagery is undesirable for researchers because it leads to errors in object classification and bias in the calculation of indices. On the other hand, shadows contain useful geometric information about the objects blocking the light. Several studies have focused on estimating building heights in urban areas from shadow lengths; such information can be used to predict a region's population, water demand, and so on. With the emergence of unmanned aerial vehicles (UAVs) and the availability of high- to super-high-resolution imagery, questions relating to shadows have received more attention. Three-dimensional data generated using UAV-based photogrammetric techniques can be very useful, particularly in agricultural applications such as developing empirical equations between biomass or yield and the geometry of canopies or crops. However, evaluating the accuracy of canopy or crop heights requires labor-intensive fieldwork. Alternatively, the geometric relationship between shadow length and crop or canopy height can be inverted using the measured shadow length. In this study, object heights retrieved from UAV point clouds are validated against shadow geometry retrieved from three sets of high-resolution imagery captured by Utah State University's AggieAir UAV system. These flights were conducted in 2014 and 2015 over a commercial vineyard in California for the USDA Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration Experiment (GRAPEX) Program. The results showed that, although this approach can be computationally expensive, it is faster than fieldwork and does not require an expensive, high-accuracy instrument such as a real-time kinematic (RTK) GPS.
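The inversion the abstract describes rests on simple trigonometry: given a measured shadow length and the solar elevation angle at the time of image capture, object height follows directly. The sketch below is illustrative only, assuming flat, level terrain; it is not the authors' implementation.

```python
import math

def height_from_shadow(shadow_length_m: float, solar_elevation_deg: float) -> float:
    """Estimate object height from its shadow length and the sun's elevation angle.

    Assumes the shadow is cast on flat, level ground; sloped vineyard rows
    would require a terrain correction.
    """
    return shadow_length_m * math.tan(math.radians(solar_elevation_deg))

# A 2 m shadow under a 45-degree sun implies a ~2 m canopy.
print(round(height_from_shadow(2.0, 45.0), 2))  # 2.0
```

The solar elevation angle itself is recoverable from the flight's timestamp and location, which is why the shadow method needs no extra field instrumentation.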
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080M (2019) https://doi.org/10.1117/12.2518721
Boll weevil (Anthonomus grandis), a major cotton pest, has been eradicated from almost all states in the US. Volunteer cotton (Gossypium hirsutum), growing between seasons, provides the perfect habitat for boll weevil to survive the winter, allowing the pest to spread. The boll weevil eradication program in South Texas works extensively to extirpate this pest, and the early detection of volunteer cotton in grain fields is critical to this process. In this study, we investigated images collected from a five-band multispectral camera mounted on an unmanned aerial vehicle (UAV) to detect volunteer cotton in a cornfield. The core objective was to compare the accuracies of different image classification techniques for detecting volunteer cotton in a cornfield. We used a second-order co-occurrence filter with a 3x3 moving window in eight directions (0°, 45°, 90°, 135°, 180°, 225°, 270°, 315°) to extract eight textural features (mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, and correlation) for each direction. Parallelepiped, maximum likelihood, and Mahalanobis distance supervised classifications were applied in the direction of maximum obtained accuracy (270°) for the five-band stacked and NDVI images. Overall accuracies and Kappa coefficients were determined and compared for all three classified results. The five-band stacked image yielded the highest overall accuracy of 91.75% and a Kappa coefficient of 0.87 for Mahalanobis distance classification, whereas NDVI yielded the highest overall accuracy of 85.73% and a Kappa coefficient of 0.76 for the same classification. Maximum likelihood was found to be the best for classifying cotton, with a class accuracy of 70.07% using the five-band stacked image, while Mahalanobis distance was the best for classifying cotton in the NDVI image, with a class accuracy of 54.93%.
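To make the texture-extraction step concrete, the following minimal sketch builds a gray-level co-occurrence matrix (GLCM) for one pixel offset and derives three of the eight features named above (mean, contrast, homogeneity). It is a toy pure-Python illustration of the general technique, not the study's filter implementation; in practice a library routine over a 3x3 moving window would be used.

```python
def glcm_features(img, dy, dx, levels):
    """Gray-level co-occurrence matrix features for one offset direction.

    img: 2D list of integer gray levels in [0, levels).
    (dy, dx): pixel offset, e.g. (0, 1) for 0 degrees, (-1, 1) for 45 degrees.
    Returns (mean, contrast, homogeneity) of the normalized GLCM.
    """
    h, w = len(img), len(img[0])
    glcm = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                glcm[img[y][x]][img[y2][x2]] += 1
                pairs += 1
    mean = contrast = homogeneity = 0.0
    for i in range(levels):
        for j in range(levels):
            p = glcm[i][j] / pairs  # normalized co-occurrence probability
            mean += i * p
            contrast += (i - j) ** 2 * p
            homogeneity += p / (1.0 + (i - j) ** 2)
    return mean, contrast, homogeneity

# A uniform patch has zero contrast and perfect homogeneity.
flat = [[1, 1], [1, 1]]
print(glcm_features(flat, 0, 1, 2))  # (1.0, 0.0, 1.0)
```

Repeating this over the eight offsets listed in the abstract yields the per-direction feature stack that feeds the supervised classifiers.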
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080N (2019) https://doi.org/10.1117/12.2519394
The fungus Phymatotrichopsis omnivora causes cotton root rot (CRR), one of the deadliest cotton diseases in the southwestern U.S. Once cotton is infected by CRR, it is very unlikely to be cured. Previous research indicates that CRR recurs in roughly the same areas as in previous years, and the fungicide Topguard Terra has proven effective for CRR prevention; therefore, knowing the historical CRR-infested area helps prevent CRR from appearing again in the future. CRR-infested plants can be detected using aerial remote sensing. With the introduction of unmanned aerial vehicles (UAVs) to remote sensing research, the spatial and temporal resolution of imagery data increased significantly, making higher-precision CRR classification possible. A plant-by-plant (PBP) classification based on the superpixel concept was developed to identify CRR-infested and healthy cotton plants in the field at the single-plant level, and the PBP classification algorithm was improved to reduce misclassifications.
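Plant-by-plant classification first requires delineating individual plants as distinct regions in the imagery. The study groups pixels with superpixels; the sketch below substitutes the simpler technique of connected-component labeling on a binary vegetation mask, purely to show the "one region per plant" idea (it is a stand-in, not the paper's PBP algorithm).

```python
from collections import deque

def label_plants(mask):
    """Label 4-connected regions of a binary vegetation mask.

    mask: 2D list of 0/1 values (1 = vegetation pixel). Returns a label
    grid and the number of regions ("plants") found, via breadth-first
    flood fill.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                n += 1                      # start a new plant region
                q = deque([(y, x)])
                labels[y][x] = n
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = n
                            q.append((ny, nx))
    return labels, n

mask = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 1, 0, 1]]
_, count = label_plants(mask)
print(count)  # 3
```

Each labeled region can then be classified as infested or healthy as a unit, which is what makes the approach single-plant rather than per-pixel.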
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080O (2019) https://doi.org/10.1117/12.2519129
Recent years have witnessed enormous growth in unmanned aircraft system (UAS) and sensor technology, making it possible to collect data at high spatial and temporal resolution over crops throughout the growing season. The objective of this research is to develop a novel machine learning framework for marketable tomato yield estimation using multi-source, spatio-temporal remote sensing data collected from a UAS. The proposed model is based on an artificial neural network (ANN): it takes UAS-based multi-temporal features such as canopy cover, canopy height, canopy volume, and the excess greenness index, along with weather information such as humidity, precipitation, temperature, solar radiation, and crop evapotranspiration (ETc), as input and predicts the corresponding marketable yield. The predicted yield is validated against the actual harvested yield. Breeders may be able to use the predicted yield as a parameter for genotype selection, allowing them not only to increase their experiment size for faster genotype selection but also to make efficient, informed decisions on the best-performing genotypes. Moreover, yield prediction maps can be used to develop within-field management zones to optimize field management practices.
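The mapping the abstract describes, from a per-plot feature vector of canopy and weather variables to a yield estimate, can be sketched as a single feed-forward pass. All weights, feature values, and the tiny two-neuron architecture below are illustrative placeholders; the study's ANN would be trained on real UAS features against harvested yield.

```python
import math

def ann_predict(features, w_hidden, b_hidden, w_out, b_out):
    """One-hidden-layer feed-forward pass: features -> tanh hidden -> yield.

    features: per-plot inputs (e.g. canopy cover, height, volume, greenness, ETc).
    Weights here are hypothetical; a trained model would supply them.
    """
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# Scaled dummy features: [canopy cover, canopy height, canopy volume, ExG, ETc].
features = [0.8, 0.5, 0.6, 0.4, 0.7]
w_hidden = [[0.2, 0.1, 0.3, -0.1, 0.2], [0.1, -0.2, 0.1, 0.3, 0.1]]
y = ann_predict(features, w_hidden, [0.0, 0.0], [1.0, 0.5], 0.1)
print(round(y, 3))  # a scaled yield estimate
```

The multi-temporal aspect would enter by concatenating the features from several flight dates into one longer input vector.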
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080P (2019) https://doi.org/10.1117/12.2518703
Tomato production faces constant pressure from biotic and abiotic stresses that can cause significant losses of production and fruit quality. In tropical and subtropical climates, the main disease affecting tomato production is caused by Tomato yellow leaf curl virus (TYLCV), a virus vectored by the silverleaf whitefly (Bemisia tabaci). The main method of control relies on insecticide sprays against the vector to avoid the spread of the disease. Detecting and spatially locating infected plants is required to prevent and control epidemic outbreaks of TYLCV. In this study, we aim to develop an unmanned aircraft system (UAS) based TYLCV detection algorithm that can identify affected plants and provide their physiological information. Multi-temporal phenotypic attributes, e.g., canopy height, canopy cover, and canopy volume, and vegetation indices including the normalized difference vegetation index (NDVI), soil adjusted vegetation index (SAVI), and excess green index (ExG), were extracted from the UAS image data. The field experiment was conducted at the Texas A&M AgriLife Research and Extension Center at Weslaco, TX. A total of 16 tomato hybrids with different levels of TYLCV resistance were inoculated with viruliferous insects and randomly transplanted in an open field in triplicate plots containing 4 plants each. One control plot of non-inoculated plants was also planted for each tomato hybrid for validation. Machine learning techniques based on artificial neural networks were used to detect TYLCV symptoms from the UAS-derived parameters, and all plants were tested by polymerase chain reaction (PCR) using specific primers to confirm TYLCV infection. To evaluate how early and how accurately the algorithm can detect TYLCV symptoms in tomato plants, various detection models were developed by changing the period of input UAS data. We expect the suggested system to be a useful framework for monitoring outbreaks of TYLCV at large scales, enabling growers to determine the best time and location to start vector control and generating time-series physiological data for a better understanding of disease progression.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080Q (2019) https://doi.org/10.1117/12.2519157
This paper presents the development and validation of machine learning models for predicting water and nitrogen stress in lettuce. Linear regression and deep learning neural networks, mainly convolutional neural networks (CNNs), are used to train the models. The training data include both airborne and proximal sensor data. The airborne data are digital images collected from unmanned aerial vehicles (UAVs) and the normalized difference vegetation index (NDVI) obtained from airborne multispectral images; the proximal sensors are a chlorophyll meter, a water potential meter, and a spectroradiometer. Agronomic measurements such as leaf count and plant height are also used for training. For validation of the developed models, two sets of tests were performed. The first test used data similar in nature to, but distinct from, the training data. The second test used aerial images of various random lettuce plots at farms obtained from Google Maps, evaluating the models' portability and performance in an unknown environment with data not collected from the experimental plot. The goal of the machine learning algorithms is to provide precise detection of nitrogen and water stress at the plant level using just the digital images collected from UAVs, which will help reduce the cost associated with precision agriculture.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080R (2019) https://doi.org/10.1117/12.2520536
Applying machine learning methods to remotely sensed color, multispectral, and thermal imagery has been recognized as a potentially cost-effective approach for detecting the in-field locations of various weed species. This detection approach has the potential to be an important first step in broader site-specific weed management (SSWM) procedures. The objective of this research was to create a method for automating the detection of weeds in corn and soybean fields at different stages of the growing season, using imagery captured by sensors mounted on an unmanned aerial vehicle (UAV). We focused on identifying four common weed types present in Midwestern fields. This research involved: 1) collecting color, multispectral, and thermal imagery from UAV-based sensors in corn and soybean fields throughout the 2018 growing season; 2) creating individual normalized difference vegetation index (NDVI) images from the near-infrared (NIR) and red multispectral bands; 3) applying image thresholding and smoothing techniques to the NDVI imagery; 4) manually drawing bounding boxes and hand-labeling vegetation blobs from the processed imagery, using color images as the ground truth; and 5) developing a training set of these processed, labeled images that represent weeds at different crop growth stages. Preliminary results of these methods show promise as an affordable first step toward targeted herbicide application.
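Steps 2 and 3 of the pipeline, NDVI computation from the NIR and red bands followed by thresholding, can be sketched in miniature as follows. The 0.3 cutoff is an illustrative value, not the study's calibrated threshold, and the smoothing step is omitted.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel pair."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

def vegetation_mask(nir_band, red_band, threshold=0.3):
    """Per-pixel NDVI, then a fixed threshold to separate vegetation
    (crop or weed) from soil background."""
    return [[1 if ndvi(n, r) > threshold else 0
             for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]

nir = [[0.8, 0.4], [0.7, 0.2]]   # toy reflectance values in [0, 1]
red = [[0.1, 0.3], [0.2, 0.2]]
print(vegetation_mask(nir, red))  # [[1, 0], [1, 0]]
```

The resulting binary mask is what gets blob-extracted and hand-labeled in steps 4 and 5.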
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080S (2019) https://doi.org/10.1117/12.2519743
Soil moisture is a key component of water balance models. Physically, it is a nonlinear function of parameters that are not easily measured spatially, such as soil texture and soil type. Thus, several studies have estimated soil moisture from remotely sensed data using data mining techniques such as artificial neural networks (ANNs) and support vector machines (SVMs). However, all models developed with these techniques are limited to the site-specific applications where they are trained and tuned. Moreover, since the system of nonlinear equations produced during the machine learning process is not accessible to researchers, each new study area requires repeating the training steps. The fact that the results of this black-box approach cannot easily be transferred to different locations for soil moisture estimation is frustrating, and it can lead to inaccurate comparisons between methods or of model performance. To overcome the black-box issue, this study employed genetic programming (GP), a combination of an evolutionary algorithm and artificial intelligence, to simulate soil moisture at different levels using high-resolution multispectral imagery acquired with an unmanned aerial vehicle (UAV). The output of this approach is a linear or nonlinear empirical equation that can be used by others. The performance of GP was compared with ANN and SVM modeling results. Several sets of high-resolution aerial imagery captured by the Utah State University AggieAir UAV system over two experimental pasture sites in northern and southern Utah were used for this soil moisture estimation approach. The inputs used to train the models include the reflectance in the visible, near-infrared (NIR), and thermal bands. The results show (1) the performance of GP versus ANN and SVM and (2) the master equation provided by GP, which can be used in other locations and applications.
UAV and Ground-based Sensing Applications for Agricultural Optimization and Phenotyping
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080T (2019) https://doi.org/10.1117/12.2518827
Small unmanned aerial vehicles (UAVs) have created an opportunity to remotely gather very high-resolution geospatial data for a variety of applications. Recently, UAVs have been widely employed in collecting quantitative measures of crop health indicators such as vegetation and ground cover characteristics. An important crop health indicator is the vertical vegetation structure, which can provide information on plant health and yield. High-resolution data such as optical images or lidar point clouds collected by UAVs provide vegetation heights that have proven reliable and cost-effective compared to field methods. With optical images, heights are generated from stereo pairs via photogrammetric techniques, while with lidar they are estimated directly from the laser point cloud. In this article, we show the potential of data collected by UAV platforms for estimating the vegetation heights of citrus plants.
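Whether the elevations come from photogrammetric stereo or a lidar point cloud, per-plant heights are typically obtained by differencing two gridded products: a digital surface model (DSM, top of canopy) and a digital terrain model (DTM, bare earth). A minimal sketch of that canopy height model step, on toy grids:

```python
def canopy_height_model(dsm, dtm):
    """Canopy height = digital surface model minus digital terrain model.

    dsm: per-cell elevations including vegetation (from stereo imagery
    or a rasterized lidar point cloud); dtm: bare-earth elevations.
    Returns per-cell vegetation heights.
    """
    return [[round(s - t, 3) for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

dsm = [[102.4, 101.1], [103.0, 100.9]]  # meters above datum
dtm = [[100.0, 100.1], [100.2, 100.9]]
print(canopy_height_model(dsm, dtm))  # [[2.4, 1.0], [2.8, 0.0]]
```

Per-tree heights then follow by taking, say, the maximum of the height grid within each delineated citrus canopy.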
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080U (2019) https://doi.org/10.1117/12.2519072
Improving the throughput and accuracy of plant phenotyping is core to continued advances in breeding, ensuring the genetic gain needed to meet global food demand. Current manual phenotyping requires enormous investments of time, cost, and labor, as quantitative values are required for thousands of genetic varieties across different environments. In soybean, a genotype's maturity governs the geography for which it is adapted and affects yield, which must be controlled for in breeding to realize genetic gain. In this work, we developed and validated a method for high-throughput phenotyping of soybean maturity using high-resolution visual (RGB) imagery collected with an unmanned aerial vehicle (UAV). We illustrate a method to automatically derive maturity date by modeling the change through time of a quantitative assessment of canopy greenness on a per-plot basis. The efficacy of the analytical framework is compared to the manual scoring system by evaluating phenotypic and genetic correlations and genetic repeatability measures. Analysis of replicated experiments from multiple locations yielded high phenotypic correlations (R2 = 0.85-0.96) between manual and UAV-derived maturity scores, and the heritability of the maturity estimates from the proposed remote sensing method is comparable to that of manual scoring. Implementation of this system has improved the scale, cost efficiency, and data quality of soybean maturity data collected via UAV remote sensing.
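One simple way to turn a per-plot greenness time series into a maturity date is to interpolate the day on which greenness drops below a chosen level. The sketch below illustrates that idea only; the threshold, the example values, and the assumption of a monotone decline near maturity are all placeholders, not the paper's fitted change-through-time model.

```python
def maturity_date(days, greenness, threshold):
    """Interpolate the day-of-year at which plot-mean canopy greenness
    falls below a threshold, from a time series of UAV flights.

    Assumes greenness declines through the bracketing flight pair;
    returns None if the series never crosses the threshold.
    """
    for (d0, g0), (d1, g1) in zip(zip(days, greenness),
                                  zip(days[1:], greenness[1:])):
        if g0 >= threshold > g1:
            # Linear interpolation between the two bracketing flights.
            return d0 + (g0 - threshold) * (d1 - d0) / (g0 - g1)
    return None

days = [240, 247, 254, 261]        # day of year of each flight
green = [0.62, 0.55, 0.35, 0.20]   # plot-mean greenness index (toy values)
print(round(maturity_date(days, green, 0.45), 1))  # 250.5
```

Because the crossing is interpolated between flights, the derived date can be finer-grained than the flight schedule itself, which is part of what makes the aerial method competitive with visual scoring.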
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080X (2019) https://doi.org/10.1117/12.2518725
Wheat, the third most important cereal in the world, is sensitive to nitrogen deficiency. Nitrogen (N) inputs are used to increase yield, but production costs may exceed returns if unnecessary applications are made, and the environment may become polluted. To improve N management, farmers in the mid-Atlantic generally apply N to wheat based on actual plant growth, by counting the number of tillers or measuring the N concentration in plant tissues. Both methods can be labor-intensive and time-consuming, and tissue testing also adds production costs. Remote sensing technologies, and more particularly unmanned aerial vehicle (UAV) systems, are now being used to extract new variables (spectral reflectance and vegetation indices) and to estimate plant growth and N requirements. Previous studies in Virginia have shown that spectral reflectance data collected with the ground-based GreenSeeker® system could be used to estimate the number of tillers and tissue nitrogen content. The objective of this project was to evaluate the accuracy of UAV-based wheat spectral reflectance for estimating tiller density in winter wheat. Tillers were counted regularly and simultaneously with ground (handheld GreenSeeker®) and aerial (UAV) NDVI measurements. Each UAV flight was performed with a red-green-blue (RGB) camera and a Tetracam near-infrared (NIR) camera to extract NDVI and color space indices. Our results showed significant correlations between the number of tillers and the aerial indices, but further analysis is needed to identify the best flight time for estimating wheat tiller density and early-season N requirements.
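The correlation analysis the abstract reports amounts to computing, per sampling date, the Pearson correlation between plot-level tiller counts and plot-level aerial NDVI. A minimal sketch with made-up plot values (not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series,
    e.g. field tiller counts vs. plot-mean aerial NDVI."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative plot-level values only.
tillers = [310, 420, 520, 610, 700]    # tillers per square meter
ndvi = [0.31, 0.38, 0.45, 0.52, 0.60]  # plot-mean aerial NDVI
print(round(pearson_r(tillers, ndvi), 3))
```

Repeating this by flight date is what would identify the timing at which aerial NDVI best predicts tiller density.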
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080Y (2019) https://doi.org/10.1117/12.2520252
Plant density estimation during the emergence phase is critical for early-season decision making, and estimates of both crop and weed density are needed to address early-season issues. Mapping weeds in crops can be useful at any stage; however, early competition from weeds is frequently the most detrimental to yield. The objectives of this study were to develop a set of algorithms that accurately estimate crop and weed density at emergence from sUAS imagery, using methods operationally feasible at production-field scale. The areas of interest were Mississippi cotton fields where weeds were present. The imagery was collected using the standard integrated camera on a DJI Phantom 4 Pro quadcopter. A Hough transform-based approach was used for crop and weed density estimation: all plants were first extracted from the soil background based on visible atmospherically resistant index (VARI) values, and crop was then discriminated from weed using the Hough line transform followed by connected component analysis. Algorithm development utilized five subsets of the collected image data, over which the overall accuracy was 83%. When the algorithm was applied to a different production cotton field the following year, overall accuracy remained the same, while commission error was reduced. Adding near-infrared reflectance could improve accuracies, as many errors were due to a lack of "greenness" in plants, which is the primary factor in the visible atmospherically resistant index.
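The plant-extraction step relies on the visible atmospherically resistant index, which needs only the RGB bands of a standard camera. A minimal per-pixel sketch (the toy reflectance values below are illustrative, not the study's threshold or data):

```python
def vari(r, g, b):
    """Visible atmospherically resistant index for one RGB pixel.

    VARI = (G - R) / (G + R - B); green vegetation scores positive,
    bare soil near zero or negative.
    """
    denom = g + r - b
    return (g - r) / denom if denom != 0 else 0.0

# Green canopy pixel vs. bare-soil pixel (0-1 scaled reflectance).
print(round(vari(0.25, 0.45, 0.20), 3))  # 0.4
print(round(vari(0.40, 0.35, 0.25), 3))  # -0.1
```

Thresholding this index yields the plant mask on which the Hough line transform then finds the planted rows, so pixels off the detected rows can be attributed to weeds.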
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 110080Z (2019) https://doi.org/10.1117/12.2519199
With the advent of sUAS, research scientists and plant managers can obtain unique, fast, and low-cost quantitative data with many repeatable survey options. Benefits of autonomous sUAS platforms include minimal training, reduced human safety concerns, and graphic outputs that can be readily viewed by any stakeholder not actively involved in the survey or management activity. Research conducted in the Wellington Region, New Zealand evaluated consumer-grade sUAS technologies for mapping and estimating the standing biomass of Manchurian wild rice (MWR), an exotic semi-aquatic grass that promotes flooding and displaces native flora and fauna. The goal of this research was to improve the speed and resolution of current survey strategies used to assess MWR on a lowland pasture site using unmanned systems and photogrammetry techniques. Image collection and data processing were conducted to provide a theoretical estimate of the MWR biomass remaining after seasonal growth and herbicide applications. The post-processing methods discussed attempt to identify and quantify MWR biomass using supervised image analysis, plant height modeling, and biomass collected in situ. The use of unmanned systems to map, monitor, and manage MWR is encouraged for future applications.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100811 (2019) https://doi.org/10.1117/12.2518762
The quality of grapes in wine production is highly influenced by vine water status, where an optimal water deficit or selective harvesting can improve berry quality. In this context, the rapid advancement of small unmanned aerial system (sUAS) technology has made the application of real-time, high-spatial-resolution hyperspectral imagery to vineyard moisture assessment tractable. This study sought to advance sUAS hyperspectral imagery as a tool to model water status in a commercial vineyard in upstate New York. High-spatial-resolution (2.5 cm ground sample distance) hyperspectral data were collected in the visible/near-infrared (VNIR; 400-1000 nm) regime on three flight days. A Scholander pressure chamber was used to directly measure the midday stem water potential (Ψstem) within the imaged vines at the time of flight. The high-spatial-resolution pixels enabled targeting of pure (sunlit) vine canopy despite vertically trained shoots and significant shadowing. We used partial least squares regression (PLS-R) to correlate the hyperspectral imagery with measured field water status and applied a wavelength band selection scheme to detect important wavelengths. We also evaluated spectral smoothing and band reduction approaches, given signal-to-noise ratio (SNR) concerns. Our regression results indicated that unsmoothed curves over the 450-1000 nm wavelength range provided the highest model performance, with R2 = 0.68 for cross-validation. Future work will include hyperspectral flight data that were also collected in the short-wave infrared (SWIR; 1000-2500 nm) regime. Ultimately, the models will need validation in different vineyards across a full range of plant stress.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, 1100812 (2019) https://doi.org/10.1117/12.2518955
Unmanned aerial systems (UAS) are becoming a popular choice for acquiring fine-spatial-resolution images for precision agriculture applications. Compared to other remote sensing platforms, a UAS can acquire image data at relatively low cost, with finer spatial resolution and a more flexible schedule. In recent years, multispectral sensors that capture near-infrared (NIR) and red-edge reflectance have been successfully integrated with UAS, offering more versatility in soil and field analysis, crop monitoring, and plant health assessment. In this study, we investigate the capability of a UAS-based crop monitoring system to determine best management practices for three tomato varieties, comparing different planting dates, plant densities, use of plastic mulch, and fertilization rates. The field and UAS data were acquired during Spring 2016, 2017, and 2018 in Weslaco, TX. To compare the effects of the various treatments, physiological parameters and vegetation indices (canopy cover, canopy height, canopy volume, and excess greenness) were extracted from red-green-blue (RGB) data and correlated with final yield data to identify the practices that maximize tomato yield. In Spring 2016, we observed the highest yield from the early March planting date with white plastic mulch; the highest-yielding variety also showed slow canopy decay toward the end of the season. In Spring 2017, yield differences among the three tomato varieties depended on fertilization rate: DRP-8551 performed better at the low nitrogen level, Mykonos performed better at the two higher nitrogen rates, and TAM-Hot-Ty showed no significant difference among treatments. Finally, in Spring 2018, the early March planting again produced the best yields, and varieties that slowed canopy decay toward the end of the season performed better. No significant difference was observed among plant densities. We expect the proposed system can be used to collect reliable data and to develop variety- and environment-specific management practices that increase marketable yield and reduce production cost.
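Of the indices named above, excess greenness is the one computed purely from the RGB channels. A common formulation, assumed here rather than quoted from the paper, first normalizes the channels into chromatic coordinates so the index tracks color balance rather than scene brightness:

```python
def excess_green(r, g, b):
    """Excess greenness index, ExG = 2g - r - b, on chromatic coordinates.

    r, g, b: raw channel values (any consistent scale). Normalizing by
    their sum makes the index insensitive to overall brightness; this
    normalized form is a common convention, assumed here.
    """
    total = r + g + b
    if total == 0:
        return 0.0
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

# Canopy pixel scores much higher than a mulch/soil background pixel.
print(round(excess_green(60, 140, 50), 3))   # 0.68
print(round(excess_green(120, 110, 90), 3))  # 0.031
```

Averaging ExG over each plot per flight date gives the canopy-decay curves the abstract uses to compare varieties late in the season.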