Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 1174701 (2021) https://doi.org/10.1117/12.2598648
This PDF file contains the front matter associated with SPIE Proceedings Volume 11747, including the Title Page, Copyright information, and Table of Contents.
Access to the requested content is limited to institutions that have purchased or subscribe to SPIE eBooks. You are receiving this notice because your organization may not have SPIE eBooks access. Shibboleth/OpenAthens users: please sign in to access your institution's subscriptions. To obtain this item, you may purchase the complete book in print or electronic format on SPIE.org.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 1174702 (2021) https://doi.org/10.1117/12.2597580
Introduction to SPIE Defense and Commercial Sensing Conference 11747: Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 1174703 (2021) https://doi.org/10.1117/12.2586694
Leaf area index (LAI) is an important phenotypic trait closely related to plant vigor and biomass, and a key parameter in crop growth modeling. However, manually measuring LAI in the field can be slow and labor intensive. High-resolution remote sensing, such as from unmanned aircraft systems (UAS), has been explored for LAI estimation, but with limited data sources, usually RGB and multispectral imagery. As UAS-based thermal infrared (TIR) imaging becomes readily available in agriculture, it is worth investigating its potential to improve LAI estimation. In this study we evaluated the importance of canopy temperature, measured by UAS-based TIR and multispectral imagery, for maize LAI quantification in a breeding context (23 genotypes). Five plot-level features (canopy temperature, canopy structure, and two common vegetation indices) were extracted from the images and used as inputs to machine learning models for LAI estimation. Estimation performance was evaluated with 5-fold cross validation with 30 random repeats on 162 samples. Results showed that canopy temperature, together with canopy structure, as model predictors slightly improved LAI estimation (root mean square error, RMSE, of 0.853 m2/m2 and coefficient of determination, R2, of 0.740) compared with models without temperature (RMSE of 0.917 m2/m2 and R2 of 0.706) for the genotypes included in this study. In addition, canopy temperature showed moderate but more stable importance in estimating LAI than plant height and image uniformity, and its contribution to the estimation was comparable to, or even higher than, that of the vegetation indices when modeled with random forest. These relationships may change with a single genotype or fewer genotypes, which can be explored in future studies.
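The evaluation protocol described in this abstract (5-fold cross validation with 30 random repeats over 162 samples) can be sketched as follows. The feature values and LAI targets below are synthetic stand-ins, not the authors' dataset, and the random-forest settings are illustrative assumptions.

```python
# Sketch of 5-fold CV with 30 random repeats for a random-forest LAI model.
# Synthetic data stand in for the five plot-level features from the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RepeatedKFold, cross_val_score

rng = np.random.default_rng(0)
n = 162  # sample count reported in the abstract
# five plot-level features: canopy temperature, two structure features,
# and two vegetation indices (synthetic here)
X = rng.normal(size=(n, 5))
y = 2.0 + X @ np.array([0.8, 0.4, 0.3, 0.6, 0.5]) + rng.normal(scale=0.3, size=n)

cv = RepeatedKFold(n_splits=5, n_repeats=30, random_state=0)
scores = cross_val_score(RandomForestRegressor(random_state=0),
                         X, y, cv=cv, scoring="neg_root_mean_squared_error")
print(f"mean RMSE over {len(scores)} folds: {-scores.mean():.3f}")
```

Averaging the RMSE over all 150 folds, as above, is what yields a single summary value like the 0.853 m2/m2 reported in the abstract.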
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 1174704 (2021) https://doi.org/10.1117/12.2587100
Automated and cost-effective phenotyping pipelines are needed to efficiently characterize new lines and hybrids developed in plant breeding programs. In this study, we employ deep neural networks (DNNs) to model individual maize plants, applying the PointNet network to 3D point cloud data derived from unmanned aerial systems (UAS) imagery. The experiment was performed at the Indiana Corn and Soybean Innovation Center at the Agronomy Center for Research and Education (ACRE) in West Lafayette, Indiana, USA. On June 17, 2020, a flight was carried out over maize trials using a custom-designed UAS platform with a Sony Alpha ILCE-7R photogrammetric sensor. RGB images were processed by a standard Structure from Motion (SfM) photogrammetric pipeline to reconstruct the study field into a final scaled 3D point cloud. Fifty individual maize plants were manually segmented from the point cloud to train the DNN, and individual plants were subsequently extracted over a test trial with more than 5,000 plants. Moreover, to reduce overfitting in the fully-connected layers, we employed data augmentation not only in translation but also in color intensity. Results show a success rate of 72.4% for the extraction of individual plants. Our test trial demonstrates the possibility of using deep learning to overcome the individual maize plant extraction challenge on the basis of UAS data.
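The translation and color-intensity augmentation mentioned above can be sketched for XYZRGB point clouds as below. This is a simplified illustration under assumed parameter ranges, not the authors' training code.

```python
# Sketch of point-cloud augmentation: a random global translation of the XYZ
# coordinates plus a random scaling of the RGB intensities.
import numpy as np

def augment_cloud(points, rng, max_shift=0.05, color_jitter=0.1):
    """points: (N, 6) array of XYZRGB, with colors in [0, 1]."""
    out = points.copy()
    out[:, :3] += rng.uniform(-max_shift, max_shift, size=3)   # global shift
    out[:, 3:] *= 1.0 + rng.uniform(-color_jitter, color_jitter)  # intensity
    out[:, 3:] = np.clip(out[:, 3:], 0.0, 1.0)                 # keep valid range
    return out

rng = np.random.default_rng(1)
cloud = rng.random((1024, 6))          # a synthetic 1024-point cloud
aug = augment_cloud(cloud, rng)
print(aug.shape)
```

Applying such augmentation at training time exposes the network to shifted and re-lit copies of each plant, which is the stated mechanism for reducing overfitting in the fully-connected layers.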
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 1174706 (2021) https://doi.org/10.1117/12.2585806
The common techniques used to estimate tree canopy coverage, such as line-intercept, spherical densiometer, moosehorn, and hemispherical photography, all demand intensive manual work both in data collection (typically underneath the trees) and in post-processing the results, calculations, and reports. These labor-intensive techniques result in high costs and are difficult to apply over large areas. We propose acquiring airborne images by flying a low-altitude drone with a built-in digital camera over a large-scale vineyard. The airborne images convey all necessary information, and image analysis techniques combined with a deep learning neural network can create a set of regression models for the anticipated calculations. Specifically, we can predict leaf area index (LAI) and percent canopy cover, which can guide the planting of intercrops or cover crops to prevent soil erosion and improve soil health; help determine the photosynthetic and transpirational surface of plant canopies; and support ecophysiology studies, water balance modeling, calculation of the correct amounts of foliar sprays of pesticides or fungicides, and characterization of vegetation-atmosphere interactions.
The Past and Future of Practical Remote Sensing in Agriculture
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470A (2021) https://doi.org/10.1117/12.2587694
In this paper, we document lessons learned from using ViSOAR Ag Explorer™ in the fields of Arkansas and Utah during the 2018-2020 growing seasons. Our insights come from creating software with fast reading and writing of 2D aerial image mosaics for platform-agnostic collaborative analytics and visualization. We currently enable stitching in the field on a laptop without the need for an internet connection. The full-resolution result is then available for instant streaming visualization and analytics via Python scripting. While our software, ViSOAR Ag Explorer™, removes the time and labor software bottleneck in processing large aerial surveys, enabling a cost-effective process to deliver actionable information to farmers, we learned valuable lessons regarding the acquisition, storage, viewing, analysis, and planning stages of aerial data surveys. Additionally, with the ultimate goal of stitching thousands of images in minutes on board a UAV at the time of data capture, we performed preliminary tests of on-board, real-time stitching and analysis on USU AggieAir sUAS using lightweight computational resources. This system is able to create a 2D map while flying and allow interactive exploration of the full-resolution data as soon as the platform has landed or has access to a network. This capability further speeds up the assessment process in the field and opens opportunities for new real-time photogrammetry applications. Flying and imaging over 1500-2000 acres per week provides up-to-date maps that give crop consultants a much broader view of the field in general, as well as better insight into planting and field preparation than could be observed from field level. Ultimately, our software and hardware could provide a much better understanding of weed presence and intensity, or lack thereof.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470B (2021) https://doi.org/10.1117/12.2588153
Unmanned aerial vehicles (UAVs) have been available for more than a decade and have been the subject of tests and efforts at incorporation into agricultural applications, though this was achieved only to a minor degree. Meanwhile, advancements in science and technology over the same period have reached a maturity that allows this knowledge to transition into farming decision-making agents. Challenges remain, however, around the “last mile problem”: the seamless integration of UAV products into farming activities. Moreover, having addressed the fundamental questions about UAV technology in the last decade, UAVs can now answer questions that have been difficult to address in agriculture and open opportunities to further advance the technification and automation of farming. This presentation provides a personal view of the challenges and opportunities for unmanned aerial vehicles as regulation, technology, and algorithms evolve over the next decade.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470D (2021) https://doi.org/10.1117/12.2587495
The vast majority of agricultural remote sensing applications that utilize multispectral imagery require several pre-processing steps in order to provide a basis for accurately analyzing data and delivering meaningful information to the grower. This research compresses these steps into a fully automated data processing pipeline, implemented as a BeamIO TileDriver workflow that converts raw digital counts to directly georectified reflectance, ready for further processing into a geolocated information product for the grower.
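The BeamIO TileDriver workflow itself is not described in detail here, but the core digital-count-to-reflectance conversion such a pipeline performs is commonly done with an empirical line fit against calibration panels of known reflectance. The panel digital numbers and reflectances below are illustrative assumptions, not values from the paper.

```python
# Sketch of empirical-line calibration: fit a per-band linear mapping from
# raw digital number (DN) to reflectance using two reference panels.
import numpy as np

# mean DN observed over a dark and a bright calibration panel, per band
panel_dn = np.array([[520.0, 3100.0],    # band 1
                     [480.0, 2950.0]])   # band 2
panel_refl = np.array([0.03, 0.48])      # known panel reflectances

# per-band linear fit: reflectance = gain * DN + offset
gains, offsets = [], []
for dn in panel_dn:
    gain, offset = np.polyfit(dn, panel_refl, 1)
    gains.append(gain)
    offsets.append(offset)

raw = np.array([[1800.0], [1700.0]])     # one pixel, two bands
refl = np.array(gains)[:, None] * raw + np.array(offsets)[:, None]
print(refl.ravel())                      # per-band reflectance in [0, 1]
```

In a full pipeline this per-band mapping would be applied to every pixel before orthorectification, so downstream vegetation indices are computed on physically meaningful reflectance rather than sensor counts.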
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470E (2021) https://doi.org/10.1117/12.2585866
Unmanned Aerial Vehicle (UAV)-based remote sensing techniques have significant potential in agriculture and smart farming applications for efficient monitoring of plant growth, the irrigation process, disease detection, etc. Most research on field phenotyping with remote sensing has been accomplished with a typical UAV equipped with an RGB camera or a multispectral camera flown over a large farm field. Because wind disturbances affect point-cloud generation from single-camera images captured from a UAV, precise field phenotyping measurement for crop breeding and agricultural production requires the simultaneous collection of images by multiple cameras that are far enough apart to support structure-from-motion calculations. To improve digital surface models by minimizing measurement errors caused by the motion of the UAV and plants during a flight mission, a cooperative operation system of multiple UAVs was proposed to enable the simultaneous collection of images from different perspectives. A coordinated navigation system based on the Robot Operating System was constructed to compute control commands that stabilize the pose and location of the UAVs. Based on a leader-follower formation control algorithm over a wireless network, a follower UAV coordinates with a leader UAV to maintain the desired constant speed, direction, and percentage of image overlap in synchronized motion, ultimately enabling task completion in a short time and improvement of target models based on 3D reconstruction. To validate the performance of the proposed method, field phenotyping measurement errors obtained from synchronized multi-UAV image collection were compared with those from single-UAV image collection in simulation and field tests.
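The leader-follower idea described above can be sketched in its simplest form: the follower tracks a fixed offset from the leader using a proportional controller plus a feedforward of the leader's velocity. The gain, offset, and speeds below are illustrative assumptions, not the authors' controller parameters.

```python
# Minimal 2D leader-follower formation sketch: the follower converges to
# leader position + offset while matching the leader's velocity.
import numpy as np

def follower_cmd(leader_pos, leader_vel, follower_pos, offset, kp=1.2):
    """Velocity command driving the follower toward leader_pos + offset."""
    target = leader_pos + offset
    return leader_vel + kp * (target - follower_pos)

leader_pos = np.array([0.0, 0.0])
leader_vel = np.array([2.0, 0.0])      # leader flies a transect at 2 m/s
follower_pos = np.array([-6.0, 4.5])
offset = np.array([-5.0, 5.0])         # lateral spacing for image overlap

dt = 0.1
for _ in range(100):                   # simulate 10 s
    cmd = follower_cmd(leader_pos, leader_vel, follower_pos, offset)
    follower_pos = follower_pos + cmd * dt
    leader_pos = leader_pos + leader_vel * dt

print(np.round(follower_pos - leader_pos, 2))  # converges toward the offset
```

Because the tracking error decays geometrically (each step multiplies it by 1 - kp*dt), the relative position settles at the commanded offset, which is what keeps image overlap constant during synchronized collection.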
Ground-based and Robotic Systems in Agricultural Sensing and Phenotyping
Calvin Coopmans, Stephen Brimhall, Ryan Goodman, Steve Petruzza
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470I (2021) https://doi.org/10.1117/12.2588009
Aerial remote sensing continues its progression into wider applications and functionality. At the same time, the field of robotics has generated many useful technologies such as the Robot Operating System 2 (ROS 2), which allows researchers and designers to implement autonomous systems on a meta-operating system. In this paper, such a holistic autonomous data collection system (STARDOS) is presented, with the goal of integrating remote sensing data collection (the payload) and the robotic platform (in this case, unmanned aerial vehicles). The architecture of STARDOS is shown, along with example payload configurations and analysis use cases such as the triangular greenness index (TGI) and feature extraction and matching for real-time 2D photogrammetry.
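The triangular greenness index mentioned above has a commonly used RGB approximation, TGI = G - 0.39*R - 0.61*B; this is the standard simplified formulation, not necessarily the exact variant computed by STARDOS.

```python
# Sketch of the RGB approximation of the triangular greenness index (TGI).
import numpy as np

def tgi(rgb):
    """rgb: (..., 3) array of reflectance or normalized intensity."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return g - 0.39 * r - 0.61 * b

img = np.array([[[0.10, 0.35, 0.08],    # vegetated pixel: green-dominant
                 [0.30, 0.28, 0.25]]])  # bare-soil pixel: flat spectrum
print(np.round(tgi(img), 3))
```

Because TGI needs only the three visible bands, it is a natural index for payloads carrying a plain RGB camera, which fits the real-time on-platform analysis use case described.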
UAV Remote Sensing for Measuring Evapotranspiration and Moisture Factors in Crops
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470J (2021) https://doi.org/10.1117/12.2591895
Agroecosystems compose large economic sectors in dominantly agriculture-based societies. Availability and management of water resources have a huge influence on the sustainability of agroecosystems. Low soil moisture is a major constraint on crop growth because soil moisture plays a vital role in providing crops with sufficient nutrition for root uptake. Current methodologies in precision agriculture are insufficient for direct soil moisture sensing, since reflected shortwave solar radiation and infrared long-wave emission can only provide information about surface characteristics. While microwave signals are known to be highly sensitive to water within plants and soil, their implementation from small Unmanned Aircraft System (UAS) platforms is at a relatively low technological readiness level compared with shortwave/longwave optical sensors. In this paper, we summarize our efforts to apply radio frequency (RF)/microwave remote sensing from UAS for water utilization in agroecosystems. Recently, we developed a comprehensive UAS-based RF testbed, including a microwave radiometer, a scatterometer, a wideband ground penetrating radar system, and Signals of Opportunity (SoOp) receivers. These instruments operate from UAS platforms and use the microwave/radio wave portions of the spectrum. The testbed is accompanied by proximal sensing via autonomous unmanned ground vehicles that acquire in-situ soil moisture and vegetation geophysical parameters, providing appropriate datasets for training and testing physics-aware, machine learning-based models. In this paper, we introduce the RF sensing framework, which can enable non-intrusive, high-resolution soil moisture estimates at multiple soil depths via UAS-based active/passive/SoOp RF instruments.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470K (2021) https://doi.org/10.1117/12.2587763
sUAS (small Unmanned Aircraft System) platforms and advanced surface energy balance models allow detailed assessment and monitoring, at plant scale, of different agricultural, urban, and natural environments. Significant progress has been made in understanding and modeling atmosphere-plant-soil interactions and in the numerical quantification of the internal processes at plant scale. Similarly, progress has been made in ground truth comparison and validation models. An example of this progress is the application of sUAS information using the Two-Source Energy Balance (TSEB) model in commercial vineyards by the Grape Remote sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX) Project in California. With advances in frequent sUAS data collection for larger areas, sUAS information processing becomes computationally expensive on local computers. Additionally, the fragmentation of the different models and tools necessary to process the data and validate the results is a limiting factor. For example, in the GRAPEX project, commercial software (ArcGIS and MS Excel) as well as Python and Matlab code are needed to complete the analysis. There is a need to assess and integrate research conducted with sUAS and surface energy balance models in a sharing platform that can easily be migrated to high performance computing (HPC) resources. This research, sponsored by the National Science Foundation FAIR Cyber Training Fellowships, integrates disparate software and code under a unified language (Python). The Python code for estimating surface energy fluxes using the TSEB2T model, as well as the EC footprint analysis code for ground truth comparison, was hosted on the myGeoHub site (https://mygeohub.org/) to be reproducible and replicable.
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470L (2021) https://doi.org/10.1117/12.2587577
Leaf stomata regulate gas exchange between the plant and the atmosphere and therefore play an important role in plant growth and water use. Thermal infrared (IR) sensing of leaf surface temperature has proved to be an indirect but effective approach to estimating leaf stomatal conductance, and shows potential for rapidly differentiating genotypes for water-use-related traits. The objective of this study was to estimate leaf stomatal conductance from thermal IR images of crops and relevant environmental parameters. The experiment was conducted in the NU-Spidercam field phenotyping facility near Mead, NE. Leaf stomatal conductance was measured on soybean, sorghum, maize, and sunflower using a leaf porometer. Thermal IR images of the crop canopies were captured by a thermal IR camera and then processed to extract crop canopy temperature (Tc). In addition, weather variables including solar radiation, air temperature, relative humidity, and wind speed were obtained from a nearby weather station. Correlation analysis was used to explore the relationships between these variables. Multiple linear regression (MLR), random forest (RF), and gradient boosting machine (GBM) models were applied to predict stomatal conductance from Tc and weather variables. The Pearson correlation coefficients between predicted and measured stomatal conductance were 0.495 for MLR, 0.591 for RF, and 0.878 for GBM when Tc was not used as an input variable. After adding Tc as an input, the Pearson correlation coefficients improved to 0.584 for MLR, 0.593 for RF, and 0.896 for GBM. The mean absolute errors for the three models were 225, 237, and 129 mmol/(m2·s) when Tc was included as a model input. This research could lead to rapid assessment of leaf stomatal conductance and crop water status using thermal IR imaging.
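The GBM modeling setup described above can be sketched as follows. The feature layout matches the abstract (Tc plus four weather variables), but the data are synthetic stand-ins for the porometer and weather-station measurements, and the assumed Tc-conductance relationship is only for illustration.

```python
# Sketch of gradient boosting regression of stomatal conductance from canopy
# temperature (Tc) and weather variables, evaluated by Pearson correlation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
# columns: Tc (deg C), solar radiation, air temperature, RH, wind speed
X = np.column_stack([
    rng.uniform(20, 38, n), rng.uniform(100, 1000, n),
    rng.uniform(15, 35, n), rng.uniform(20, 90, n), rng.uniform(0, 8, n),
])
# synthetic conductance in mmol/(m^2*s): cooler canopies -> higher conductance
y = 900 - 18 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(scale=40, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gbm = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
r = np.corrcoef(gbm.predict(X_te), y_te)[0, 1]
print(f"Pearson r on held-out data: {r:.3f}")
```

Dropping the Tc column from X and refitting would reproduce the abstract's with/without-Tc comparison for the GBM case.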
Proceedings Volume Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI, 117470N (2021) https://doi.org/10.1117/12.2586259
Accurate quantification of the partitioning of evapotranspiration (ET) into transpiration and evaporation fluxes is necessary for understanding ecosystem interactions among carbon, water, and energy flux components. ET partitioning can also support the description of atmosphere-land interactions and provide unique insights into vegetation water status. Previous studies have identified leaf area index (LAI) estimation as a key descriptor of the biomass conditions needed for estimating transpiration and evaporation. LAI estimation in clumped vegetation systems, such as vineyards and orchards, has proven challenging and is strongly related to crop phenological status and canopy management. In this study, a feature extraction model based on previous research was built to generate a total of 202 preliminary variables at a 3.6-by-3.6-meter grid scale from submeter-resolution information acquired by a small Unmanned Aerial Vehicle (sUAV) in four commercial vineyards across California. Using these variables, a machine learning model, eXtreme Gradient Boosting (XGBoost), was successfully built for LAI estimation. Using XGBoost's built-in feature selection, only six variables relating to vegetation indices and temperature were needed to produce high-accuracy LAI estimation for the vineyard. Using the six-variable XGBoost-based LAI map, two versions of the Two-Source Energy Balance (TSEB) model, TSEB-PT and TSEB-2T, were used for energy balance estimation and ET partitioning. Comparison of these results with Eddy-Covariance (EC) tower data showed that TSEB-PT outperforms TSEB-2T in estimating sensible heat flux (within 13% relative error) and soil heat flux (within 34% relative error), while TSEB-2T outperforms TSEB-PT in estimating net radiation (within 14% relative error) and latent heat flux (within 2% relative error).
For the mature vineyard (north block), TSEB-2T performs better than TSEB-PT in partitioning the canopy latent heat flux with 6.8% relative error and soil latent heat flux with 21.7% relative error; however, for the younger vineyard (south block), TSEB-PT performs better than TSEB-2T in partitioning the canopy latent heat flux with 11.7% relative error and soil latent heat flux with 39.3% relative error.
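The feature-reduction step described above (202 candidate variables whittled down to six via the boosted-tree model's built-in importances) can be sketched as below. scikit-learn's GradientBoostingRegressor stands in for XGBoost, and the data are synthetic, not the vineyard dataset; the variable counts are scaled down for brevity.

```python
# Sketch of importance-based feature reduction: fit a boosted-tree model on
# many candidate variables, then keep only the most important six for LAI.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n, p = 400, 40                  # p stands in for the 202 candidate variables
X = rng.normal(size=(n, p))
# only six variables actually drive the synthetic LAI signal
true_idx = [0, 1, 2, 3, 4, 5]
y = (X[:, true_idx] @ np.array([1.0, 0.9, 0.8, 0.7, 0.6, 0.5])
     + rng.normal(scale=0.2, size=n))

model = GradientBoostingRegressor(random_state=0).fit(X, y)
top6 = np.argsort(model.feature_importances_)[-6:]
print(sorted(top6.tolist()))    # indices of the six most important variables
```

A reduced model refit on just those six columns is then cheap enough to apply across a whole vineyard grid, which is what makes the downstream TSEB runs practical.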