Open Access
6 March 2021

Use of high-resolution unmanned aerial systems imagery and machine learning to evaluate grain sorghum tolerance to mesotrione
Isaac Barnhart, Sushila Chaudhari, Balaji A. A. Pandian, P. V. Vara Prasad, Ignacio A. Ciampitti, Mithila Jugulam
Abstract

Manual evaluation of crop injury from herbicides is time-consuming. Unmanned aerial systems (UAS), high-resolution multispectral sensors, and machine learning classification techniques have the potential to save time and improve precision in the evaluation of herbicide injury in crops, including grain sorghum (Sorghum bicolor L. Moench). The objectives of our research were to (1) evaluate three supervised classification algorithms [support vector machine (SVM), maximum likelihood, and random forest] for categorizing high-resolution UAS imagery to aid in data extraction and (2) evaluate the use of vegetative indices (VIs) derived from UAS imagery as an alternative to traditional methods of visual herbicide injury assessment in mesotrione-tolerant grain sorghum breeding trials. An experiment was conducted in a randomized complete block design using a factorial treatment arrangement of three genotypes by four mesotrione doses. Herbicide injury was rated visually on a scale of 0 (no injury) to 100 (complete plant mortality). The UAS flights were flown at 9, 15, 21, 27, and 35 days after treatment. Results show the SVM algorithm to be the most consistently accurate, and high correlations (r = −0.83 to −0.94; p < 0.0001) were observed between the normalized difference VI and ground-measured herbicide injury. Therefore, we conclude that VIs collected with UAS, coupled with machine learning image classification, have the potential to be an effective method of evaluating mesotrione injury in grain sorghum.

1.

Introduction

Grain sorghum (Sorghum bicolor L. Moench) is an important crop with diverse uses throughout the world.1 Sorghum originated in sub-Saharan Africa, where it has been used as a major food crop.2 Grain sorghum has vast genetic diversity, with many genotypes possessing agronomically desirable traits.3 Traditionally, sorghum is used in the Western Hemisphere as animal feed; other uses include biofuel, forage production, and as a gluten-free alternative for human consumption.3,4 In the United States, grain sorghum ranks as the fifth-most important grain crop, with 1.9 million hectares harvested in 2019.5 Because sorghum is adapted to semi-arid regions of the world, it has been shown to have distinct yield advantages over corn (Zea mays L.) in such environments.6

Weed competition is a common biotic stress that interferes with grain sorghum production. Prior research has demonstrated that weed infestations can reduce grain sorghum yields by 8% to 56%, depending on the weed species present.7 Although other methods of weed control, such as crop rotation and row cultivation, are available, herbicides are the most common method of weed control in grain sorghum in the United States.7,8 Several herbicides are registered for use in grain sorghum as pre-emergence (PRE) or post-emergence (POST) treatments. For example, atrazine and mesotrione can be used as PRE, and 2,4-D and dicamba as POST treatments.9 Despite the several herbicides available for PRE treatments, POST options are limited in grain sorghum, especially for grass weed control.

The 4-hydroxyphenylpyruvate dioxygenase (HPPD)-inhibiting herbicides (e.g., mesotrione and tembotrione) are widely used for POST grass weed control in corn10 but are not registered for POST use in grain sorghum because of crop injury.11 These herbicides can be applied to soil or foliage and control a broad spectrum of grass and broadleaf species while providing soil residual activity for extended protection.10,12,13 They inhibit the HPPD enzyme in the plastoquinone biosynthesis pathway, depleting plastoquinone levels. This in turn inhibits carotenoid biosynthesis, and plants subsequently die by photo-oxidation of chloroplasts. Because of this photo-oxidation, the main symptom after treatment is bleaching of plant tissue, although additional symptoms including stunted growth, leaf chlorosis, and necrosis are also common in susceptible plants.8,10 HPPD-inhibiting herbicides are widely used POST in corn, which can actively metabolize these herbicides into non-phytotoxic compounds.10

Mesotrione is an HPPD-inhibitor in the triketone chemical family14 and can be used as a PRE treatment in grain sorghum. However, it has been shown to cause damage, including 20% chlorosis, in sorghum when applied POST.8,15 Recent research has identified grain sorghum genotypes with elevated tolerance to POST applications of mesotrione with minimal crop damage.16 These genotypes are valuable for the development of mesotrione-tolerant varieties in breeding programs. Although herbicide-tolerant traits could be quite useful for growers, implementing herbicide tolerance in grain sorghum breeding operations can be difficult, because many combinations of herbicide rates and genetic lines must be investigated to quantify herbicide injury levels.17 In addition, previous methods of assessing plant responses to herbicides, including visual scoring, portable chlorophyll sensors, and biomonitoring,18–20 are labor-intensive and time-consuming. New methods of evaluating herbicide damage are needed to support and hasten the screening of the large numbers of genotypes and populations involved in breeding programs for the development of herbicide-tolerant grain sorghum.

The use of high-resolution remote sensing coupled with machine learning image classification algorithms is a promising method for quantifying plant responses to various biotic and abiotic stresses.21,22 Image classification via remote sensing has been explored extensively, with most platforms using satellites, piloted aircraft, or unmanned aerial systems (UAS).19,23 Images collected from these platforms can be separated into distinct classes through various algorithms, allowing data from individual classes to be extracted for external analysis.24–26 A highly accurate family of image classification methods is the supervised approach, which uses user-defined training samples to assign features to classes.23,27 In addition to image classification, vegetative indices (VIs) such as the normalized difference vegetation index (NDVI) can be computed; these have been shown to be useful in predicting plant parameters such as biomass, nitrogen status, and chlorophyll content.28–30 Data quality improves with image resolution, making UAS a useful platform for collecting high-resolution remote sensing imagery in agricultural settings.31–34 In weed science, proposed uses of UAS imagery include weed pressure mapping, quantifying herbicide damage on non-target crops, herbicide applications, and site-specific weed management.32,33,35–38

The remainder of this paper is as follows. Section 2 expands upon related work focusing on investigating herbicide injury with remote sensing methods. In Sec. 3, we present our research methodologies, with descriptions of the field experiment, data collection methods, image processing, and statistical analyses. Section 4 highlights our results, and a discussion of our results and proposed limitations are presented in Sec. 5. The final conclusions and suggested steps for future work are given in Sec. 6.

2.

Related Work

Remote sensing has previously been used to identify herbicide injury symptoms in many studies, with crop damage due to off-target movement of commercial herbicides being a leading area of interest. Such injury has been detected with both multispectral and hyperspectral data.39–42 Using hyperspectral data, Henry et al.36 successfully classified injured versus uninjured plants after applications of the non-selective herbicides glyphosate and paraquat. Thelen et al.43 found that both digital aerial imagery and ground-based optical remote sensing equipment could determine herbicide injury to soybeans, but neither type of equipment could estimate specific herbicide application rates. Dicke et al.35 found reductions in NDVI values in plots treated with sulfonylurea herbicides, as well as significant correlations between NDVI values and corn yield, suggesting that low NDVI values in herbicide-damaged plots can be used as a predictor of final yield. Huang et al.44 found that NDVI values were highly related to cotton (Gossypium hirsutum L.) injury from aerial applications of glyphosate, with very strong correlations observed between increased frequencies of spray drift droplets and NDVI values. The majority of these studies rely on ground-based or aerial sensors; however, Pause et al.45 found that NDVI data obtained from satellite imagery successfully detected vegetation injury from glyphosate herbicide products. In addition to crop damage, sensors have been used to detect and quantify herbicide damage to weeds such as waterhyacinth (Eichhornia crassipes Mart.), with the best injury detection observed 2 weeks after herbicide application.46 Another study by Henry et al.47 found that soybeans (Glycine max L. Merr.) could be differentiated from common cocklebur (Xanthium strumarium L.), hemp sesbania (Sesbania exaltata Raf.), pitted morning glory (Ipomoea lacunosa L.), and sicklepod (Senna obtusifolia L.) after pre- and post-emergence herbicide applications.

In addition to these applications, remote sensing has been investigated as an alternative to visual herbicide injury ratings by Duddu et al.48 In this study, the authors reported strong correlations between the optimized soil-adjusted vegetation index and visual injury ratings of faba beans (Vicia faba L.) treated with nine different herbicide tank mixtures. The faba bean visual injury ratings were based on visible growth reduction and tissue chlorosis. These results suggest that high-resolution remote sensing data could replace visual injury ratings in future work.48 Despite these findings, and to the best of our knowledge, few studies have investigated the exclusive use of remote sensing to screen for herbicide-tolerance traits in large-scale breeding operations,49 and very few have used UAS imagery to quantify herbicide tolerance in grain sorghum. Therefore, the objectives of this study were to (1) specify a methodology to aid in the image processing, data extraction, and statistical analysis of mesotrione-tolerance herbicide trials, including evaluating three algorithms for image classification, and (2) investigate the use of UAS imagery as an alternative to traditional methods of visually assessing mesotrione injury in grain sorghum. Compared with existing methodologies for evaluating herbicide injury, which can be labor-intensive and time-consuming,50 this methodology outlines steps for image processing and machine learning image classification that allow sorghum breeders to screen large numbers of plots in herbicide breeding trials. Plots of interest can be located from the aerial imagery and evaluated for traits of interest, reducing the need to physically evaluate the entire trial. With UAS and high-resolution remote sensing technologies, thousands of plots can be photographed in a relatively short time;49,51,52 therefore, this methodology could decrease time and labor costs for breeders selecting for mesotrione tolerance.

3.

Materials and Methods

3.1.

Field Experiment

A field study was conducted in 2019 at the Kansas State University Ashland Bottoms Research Farm in Manhattan, Kansas (Fig. 1). The soil at the study location was a Reading silt loam with 2.42% organic matter and a pH of 6.07. A grain sorghum genotype (G-1) identified as tolerant to mesotrione16 was planted along with a known mesotrione-susceptible genotype (S-1) and a Pioneer® (Corteva Agriscience, Wilmington, Delaware) sorghum hybrid (84G62) for comparison. The experiment was arranged in a randomized complete block design (with three replications) with a two-way factorial arrangement of treatments, consisting of three genotypes and four herbicide rates. The genotypes were planted on June 8, 2019, at a rate of 173,000 seeds ha⁻¹, with a row spacing of 76 cm and a planting depth of 5 cm. The experimental plots were 2 m wide and 6.5 m long, consisting of four sorghum rows. The two middle rows contained the test entry (G-1, S-1, or 84G62), and the two outside border rows were planted to Pioneer 84G62 to act as a buffer between treatments. Mesotrione (Callisto® SC; Syngenta Crop Protection, LLC, Greensboro, North Carolina) rates of 0, 105, 420, and 840 g ai ha⁻¹ were applied at the 3- to 4-leaf growth stage using a CO2-pressurized backpack sprayer with a four-nozzle boom fitted with 110-02 flat-fan nozzles (TeeJet Spraying Systems Co., Wheaton, Illinois) calibrated to deliver 187 L ha⁻¹ at 276 kPa. Fertilizer applications followed the recommendations of a pre-plant soil test, and pests and diseases were controlled on an as-needed basis.
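The treatment structure above (three genotypes by four mesotrione rates, replicated in three blocks) can be sketched as a randomized complete block layout. The following Python sketch is purely illustrative of the design, not the randomization actually used in the trial; the `rcbd_layout` helper and the seed are hypothetical.

```python
import random

def rcbd_layout(genotypes, rates, n_blocks=3, seed=42):
    """Randomize a full genotype x rate factorial within each block (RCBD)."""
    rng = random.Random(seed)
    treatments = [(g, r) for g in genotypes for r in rates]
    layout = []
    for block in range(1, n_blocks + 1):
        order = treatments[:]
        rng.shuffle(order)  # independent randomization within each block
        layout += [(block, g, r) for g, r in order]
    return layout

# Three genotypes x four mesotrione rates x three blocks = 36 plots
plots = rcbd_layout(["G-1", "S-1", "84G62"], [0, 105, 420, 840])
print(len(plots))  # → 36
```

Each block contains every genotype-rate combination exactly once, which is what allows the two-way factorial ANOVA described in Sec. 3.7.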

Fig. 1

Location of the field trial in Manhattan, Kansas, evaluating 84G62, S-1, and G-1 spectral responses to mesotrione. The genotypes were planted in two rows, as denoted by the red rectangles. Plot numbers are denoted below each rectangle.


3.2.

Image Acquisition

Prior to seedling emergence, 10 ground control points (GCPs) were placed within the experiment, and real-time kinematic points were collected to aid in image processing. In addition to GCPs, each plot received two diagonally placed ground targets indicating the sub-plots where ground-truth measurements were taken. Dimensions of the sub-plots were approximately 1.5 m × 1 m. After processing, geo-centered border polygons could then be drawn over these sub-plots to allow for data extraction (Fig. 2).

Fig. 2

A view of an individual plot consisting of sorghum rows and a sub-plot. Sub-plots were indicated by two ground targets placed within the north end of each plot to identify the plants on which ground-truth injury ratings were taken. Sub-plot dimensions were approximately 1.5 m × 1 m. Border polygons were drawn over each sub-plot to allow for data extraction.


The imagery was collected with a DJI Matrice 200 (DJI Inc., Shenzhen, China) multi-rotor aircraft equipped with a Micasense RedEdge-MX multispectral camera (Micasense Inc., Seattle, Washington).53 The camera captures five spectral bands in the visible and near-infrared portions of the electromagnetic spectrum [blue: 465 to 485 nm; green: 550 to 570 nm; red: 663 to 673 nm; red edge: 712 to 722 nm; near-infrared (NIR): 820 to 860 nm].54 Each band is captured independently with a separate camera lens, with a spatial resolution of 8 cm/pixel at an altitude of 120 m. To measure fluctuations in ambient lighting conditions, a downwelling light sensor was mounted on top of the aircraft.

Flights were flown 9, 15, 21, 27, and 35 days after treatment (DAT) for data collection, under clear, sunny conditions with no cloud cover. These conditions were selected to standardize lighting across all measurement dates, minimizing differences in the data due to lighting. Flight dates were chosen to obtain data as close to weekly intervals as possible, as permitted by the required lighting conditions. All flights were conducted within ±2.5 h of solar noon, as recommended by the camera manufacturer. Radiometric calibration was performed using the calibration panel before and after each flight to ensure image quality. Each flight was flown at an altitude of 15 m using a pre-programmed GPS waypoint mapping mission in the DJI Pilot application, with 80% front overlap and 75% side overlap to ensure image quality. The camera was set to capture images every 2 s to ensure an adequate number of images for processing. Images collected in flight were stored on an SD card and transferred to a desktop computer for further processing.
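For a fixed camera, ground sample distance (GSD) scales linearly with flight altitude. A quick sketch (the helper name is ours, not from the study) shows why a sensor rated at 8 cm/pixel at 120 m (Sec. 3.2) yields roughly 1 cm/pixel imagery at the 15-m altitude flown here, consistent with the 1.1 to 1.3 cm/pixel orthomosaics reported in Sec. 3.3.

```python
def ground_sample_distance(gsd_ref_cm: float, alt_ref_m: float, alt_m: float) -> float:
    """GSD scales linearly with altitude for a fixed sensor and lens."""
    return gsd_ref_cm * alt_m / alt_ref_m

# RedEdge-MX reference: 8 cm/pixel at 120 m; mission altitude: 15 m
gsd = ground_sample_distance(8.0, 120.0, 15.0)
print(round(gsd, 2))  # → 1.0
```

The slight difference from the reported 1.1 to 1.3 cm/pixel reflects resampling during orthomosaic generation.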

3.3.

Image Processing

Data processing followed a workflow involving orthomosaic generation in Agisoft Metashape (Agisoft LLC, St. Petersburg, Russia) and data extraction in ArcGIS Pro (ESRI, Redlands, California) (Fig. 3). UAS images were processed in Metashape with the slight modifications discussed in Holman et al.55 When aligning images and generating the sparse point cloud, alignment accuracy was set to high, as opposed to ultrahigh, to reduce processing time and decrease noise. When generating the dense point cloud, depth filtering was disabled to prevent smoothing of the sorghum canopy, and the quality of the dense point cloud was likewise set to high instead of ultrahigh to reduce processing time. The resulting orthomosaic spatial resolution was between 1.1 and 1.3 cm/pixel for each of the five flight dates.

Fig. 3

Image processing and data extraction workflow.


The image orthomosaic was exported to ArcGIS Pro for further analysis. Ground targets within each plot were located, and sub-plots were defined. The sub-plots were then clipped from the original image using the “Clip Raster” tool, and VIs were computed for each sub-plot. The VIs computed were the NDVI, enhanced normalized difference vegetation index (ENDVI), simple ratio (SR), and enhanced vegetation index 2 (EVI2) (Table 1). These VIs were chosen for their close relationships to plant greenness and canopy chlorophyll content.59

Table 1

Equations and common uses for VIs used in this study.

VI    | Formula                                     | Use                          | Reference
NDVI  | (NIR − R) / (NIR + R)                       | Green biomass, chlorophyll   | 54, 56–58
ENDVI | [(NIR + G) − (2 × R)] / [(NIR + G) + (2 × R)] | Chlorophyll content        | 58, 59
SR    | NIR / R                                     | Chlorophyll, leaf area index | 60
EVI2  | 2.5 × (NIR − R) / (NIR + 2.4 × R + 1)       | Chlorophyll, canopy cover    | 57, 59, 61, 62
Note: G, green; R, red; NIR, near-infrared.
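For reference, the four index formulas in Table 1 can be computed directly from per-band reflectance arrays. A minimal NumPy sketch follows; the function name and the sample reflectance values are illustrative, not part of the study's workflow.

```python
import numpy as np

def compute_vis(nir, r, g):
    """Compute the four VIs of Table 1 from per-band reflectance arrays."""
    nir, r, g = (np.asarray(b, dtype=float) for b in (nir, r, g))
    return {
        "NDVI":  (nir - r) / (nir + r),
        "ENDVI": ((nir + g) - 2 * r) / ((nir + g) + 2 * r),
        "SR":    nir / r,
        "EVI2":  2.5 * (nir - r) / (nir + 2.4 * r + 1),
    }

# A healthy-vegetation pixel: high NIR, low red reflectance (values illustrative)
vis = compute_vis(nir=[0.50], r=[0.05], g=[0.10])
print(round(float(vis["NDVI"][0]), 3))  # → 0.818
```

Because the formulas are element-wise, the same function applies unchanged to full raster arrays.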

3.4.

Machine Learning Classification

A supervised classification approach was chosen because of its accuracy advantage over unsupervised methods.26,27 Due to the high image resolution and standardized lighting conditions on each measurement date, we chose pixel-based classification algorithms, as opposed to other methods such as object-based classification. After computing the VI imagery, each sub-plot was classified into three categories: sorghum leaves, shadows, and soil. To test for differences in classification accuracy, we chose three algorithms: support vector machine (SVM), random forest (RF), and maximum likelihood (ML); these are all of the options offered for pixel-based classification in ArcGIS Pro. To classify the imagery, we followed an approach similar to that described by Tay et al.62 and Makanza et al.63 First, a classification scheme consisting of the three categories was created. Twenty representative training samples were taken at random from each sub-plot, grouping similar pixels based on visual similarity and assigning them to each class. The training dataset and classification schema were then saved as a signature file and used to classify each sub-plot with each machine learning algorithm. To achieve maximum accuracy, this process was repeated with new representative training samples for each measurement day. All algorithms were run with the default settings provided by ArcGIS Pro.
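The study ran these classifiers inside ArcGIS Pro; an equivalent pixel-based workflow can be sketched with scikit-learn. The per-band reflectance means below are invented for illustration (they are not measurements from this trial), while the 20 samples per class mirror the training sample size used in the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

# Illustrative 5-band reflectances for the three classes of the study:
# 0 = sorghum leaves, 1 = shadows, 2 = soil (band means are invented).
rng = np.random.default_rng(0)
leaves  = rng.normal([0.04, 0.08, 0.05, 0.30, 0.50], 0.02, (20, 5))
shadows = rng.normal([0.02, 0.02, 0.02, 0.05, 0.08], 0.01, (20, 5))
soil    = rng.normal([0.15, 0.20, 0.25, 0.30, 0.32], 0.02, (20, 5))

# 20 training samples per class, matching the study's sample size
X = np.vstack([leaves, shadows, soil])
y = np.repeat([0, 1, 2], 20)

svm = SVC(kernel="rbf").fit(X, y)                                         # SVM
rf  = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)  # RF

# Classify a (rows, cols, bands) image pixel by pixel
image = rng.normal([0.04, 0.08, 0.05, 0.30, 0.50], 0.02, (10, 10, 5))
labels = svm.predict(image.reshape(-1, 5)).reshape(10, 10)
```

Reshaping the image to a (pixels, bands) matrix is what makes the classification pixel-based: each pixel is scored on its spectral signature alone, with no spatial context.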

3.5.

Ground Truth Measurements

Herbicide injury was assessed visually on a scale of 0 (no visual injury) to 100 (complete plant mortality). Injury ratings were based on the presence of foliar bleaching, growth stunting, leaf chlorosis, and tissue necrosis.8,10,64 Ratings were taken within the sub-plots so that flight data could be extracted from the same plants to which the ratings were assigned. Each rating was taken within ±1 day of the flight measurements.

Accuracy assessments were conducted on the classification methods by computing confusion matrices.65,66 For each measurement day, 300 accuracy assessment points were generated via stratified random sampling. Each point was manually designated to the class to which it belonged, relative to the original orthomosaic image, creating ground reference points. When overlaid onto each classified raster, overall accuracy (OA) percentages were then computed by dividing the total number of correctly classified pixels by the total number of reference pixels. Accuracies at or above 85% were considered to be accurate classifications, which is a commonly used target accuracy when classifying imagery.65 For this study, we selected the most consistently accurate algorithm across all five treatment dates to use for further data extraction (see Sec. 4).
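Overall accuracy from a confusion matrix is simply the diagonal sum divided by the total number of reference points. A small sketch with a hypothetical 3-class matrix follows; the counts are invented for illustration and are not the study's matrices.

```python
import numpy as np

def overall_accuracy(confusion: np.ndarray) -> float:
    """OA = correctly classified points (diagonal) / total reference points."""
    confusion = np.asarray(confusion)
    return confusion.trace() / confusion.sum()

# Hypothetical 3-class matrix (rows = reference, columns = predicted;
# classes: leaves, shadows, soil) for 300 assessment points
cm = np.array([[120,  4,  6],
               [  3, 70,  7],
               [  5,  9, 76]])
print(round(overall_accuracy(cm), 2))  # → 0.89
```

Against the 85% target used in the study, this hypothetical matrix would count as an accurate classification.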

3.6.

Data Extraction

The process of data extraction from each sub-plot is shown in Fig. 4. To prevent extraction from background features, data from each sub-plot were extracted using a conditional statement built with the “Con” tool. Using the selected classified image, the conditional statement allowed data from each computed VI to be extracted only from the sorghum plants. The average value of each VI was then extracted and exported for further analysis.
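This conditional extraction is equivalent to masking the VI raster with the classified raster and averaging over the retained pixels. A minimal NumPy sketch follows; the function name and array values are illustrative.

```python
import numpy as np

def mean_vi_over_class(vi, classified, target):
    """Average a VI raster over pixels of one class only, mimicking the
    conditional ("Con") extraction used to keep sorghum-leaf pixels."""
    vi = np.asarray(vi, dtype=float)
    mask = np.asarray(classified) == target
    return float(vi[mask].mean())

vi = np.array([[1.0, 0.1],
               [0.5, 0.2]])
classified = np.array([[0, 2],
                       [0, 1]])  # 0 = leaves, 1 = shadows, 2 = soil
print(mean_vi_over_class(vi, classified, target=0))  # → 0.75
```

Soil and shadow pixels never enter the average, so their low VI values cannot dilute the plant signal.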

Fig. 4

Methodology for VIs data extraction based on the results of the image classification. (a) Original red-green-blue (RGB) sub-plot image; (b) each VI was computed from the original RGB images, creating a new raster layer for each VI; (c) RGB images were classified via supervised, pixel-based algorithms into leaves, shadows, and soil; and (d) each VI raster was combined with the classified image to create a new raster layer, allowing for VI data from only the sorghum leaves to be extracted for statistical analysis.


3.7.

Statistical Analysis

All data were exported for analysis in the R statistical program (R Core Team, Vienna, Austria).67 Data analysis was completed in two stages. First, statistical differences between mean overall classification accuracies (per algorithm) were examined; the Shapiro–Wilk test was used to verify the assumptions of analysis of variance (ANOVA).68 The classification accuracy dataset failed to meet the ANOVA assumptions and was therefore analyzed using the Kruskal–Wallis test.69 Differences between algorithms were then identified with a post-hoc analysis using the Dunn test,70 which revealed that the RF algorithm was significantly less accurate than the SVM and ML algorithms in this study. Second, relationships between VIs and ground truth injury ratings were investigated and tested for significance with Pearson correlation coefficients.54,63,71–73 Correlations were then used to detect differences in spectral values between genotypes and application rates. To examine statistical differences in the spectral responses of genotypes, a two-way ANOVA was conducted comparing the effects of genotype/hybrid and rate on VI values, with means separated using a Tukey honest significant difference test. All significance levels were set at α = 0.05.
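The analysis above was run in R; an equivalent sequence can be sketched in Python with SciPy. The accuracy values are those reported in Table 2, while the NDVI/injury pairs are invented for illustration. The Dunn post-hoc test is not in SciPy; the third-party scikit-posthocs package provides one.

```python
import numpy as np
from scipy import stats

# Overall accuracy percentages per algorithm across the five flight dates (Table 2)
svm = np.array([92, 91, 91, 90, 91])
rf  = np.array([88, 82, 87, 87, 82])
ml  = np.array([85, 92, 93, 92, 90])

# 1) Shapiro-Wilk check of the ANOVA normality assumption
w_stat, p_norm = stats.shapiro(np.concatenate([svm, rf, ml]))

# 2) Kruskal-Wallis test when ANOVA assumptions fail
h_stat, p_kw = stats.kruskal(svm, rf, ml)  # significant at p < 0.05

# 3) Pearson correlation between a VI and visual injury (values illustrative)
ndvi   = np.array([0.85, 0.70, 0.55, 0.40])
injury = np.array([5.0, 30.0, 60.0, 90.0])
r, p_r = stats.pearsonr(ndvi, injury)  # r is negative: injury up, NDVI down
```

Running the Kruskal-Wallis step on the Table 2 accuracies reproduces the significant between-algorithm difference reported in Sec. 4.1.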

4.

Results

4.1.

Classification Accuracy

Overall classification accuracies ranged from 82% to 93% across algorithms (Table 2). The SVM was the most consistent across all dates, with accuracies of 90% (27 DAT) to 92% (9 DAT). The algorithm with the most variability was ML, whereas the lowest accuracies were achieved with RF (82%, 15 DAT; 88%, 9 DAT). The Kruskal–Wallis test revealed significant differences between the overall accuracies generated from the confusion matrices (χ² = 7.41, p = 0.025), and the RF algorithm was found to be less accurate than SVM and ML (Table 3). Because SVM was consistently the most accurate classification algorithm, it was chosen to extract data for further analysis. When comparing the mean OA of each algorithm, all algorithms accurately classified the imagery into the three established classes (leaves, shadows, and soil) relative to the 85% threshold. However, RF was significantly less accurate than SVM and ML, suggesting that RF may not be the best choice in similar supervised, pixel-based classifications.

Table 2

Overall accuracies for each algorithm on each measurement date.

DAT | SVM | RF | ML
9   | 92  | 88 | 85
15  | 91  | 82 | 92
21  | 91  | 87 | 93
27  | 90  | 87 | 92
35  | 91  | 82 | 90
Note: DAT, days after treatment; SVM, support vector machine; RF, random forest; ML, maximum likelihood. Accuracies are expressed in percentages.

Table 3

Results of the Dunn test indicating significant differences between mean overall classification accuracy percentages. No significant differences were found between the SVM and ML algorithms, but the RF algorithm was significantly less accurate than the others.

Algorithm pair | p-value | Significance
ML–RF  | 0.02 | *
ML–SVM | 0.94 | ns
RF–SVM | 0.02 | *
Note: ML, maximum likelihood; RF, random forest; SVM, support vector machine. Asterisks indicate significant differences (p < 0.05); “ns” indicates non-significance.

4.2.

Relation of Vegetative Indices to Ground Truth Injury

Significant negative correlations were observed across all measurement dates for each VI. High negative correlations were observed between UAS-based VI data and ground truth herbicide injury across all measurement dates (Table 4). The NDVI coefficients (r = −0.94 to −0.83) consistently displayed the strongest correlations with visual injury symptoms on every measurement day. ENDVI (r = −0.92 to −0.82) and EVI2 (r = −0.94 to −0.82) correlations were very similar to the NDVI values and thus also very highly correlated with injury symptoms. The SR showed the most variability across measurement dates (r = −0.92 to −0.70). In addition to the VI data, individual spectral bands were tested for correlation with ground-measured injury scores; none were more strongly correlated than the VIs (data not shown).

Table 4

Pearson correlation coefficients between VIs and ground truth visual injury ratings.

VI    | 9 DAT | 15 DAT | 21 DAT | 27 DAT | 35 DAT
NDVI  | −0.94 | −0.91  | −0.85  | −0.83  | −0.83
ENDVI | −0.92 | −0.90  | −0.84  | −0.83  | −0.82
EVI2  | −0.94 | −0.90  | −0.84  | −0.81  | −0.82
SR    | −0.92 | −0.83  | −0.79  | −0.68  | −0.70
Note: all correlations were significant at p < 0.0001.

As NDVI was the most consistently related to mesotrione injury, it was chosen for the ANOVA. Across all treatment dates, there was no significant interaction between genotype and rate (data not shown). The main effect models, however, showed significant differences, with the results of the post-hoc test shown in Fig. 5. In the genotype main effect model, the NDVI response of the G-1 genotype across all mesotrione doses was not statistically different from the commercial hybrid's response until 35 DAT. However, in all cases, the NDVI values for the G-1 genotype and the commercial hybrid were statistically different from those of the S-1 genotype, as expected given the S-1 susceptibility to mesotrione. In the rate main effect model, plants treated with 0 and 105 g ai ha⁻¹ were not statistically different from one another on any measurement date, indicating that a low dose of mesotrione did not significantly injure the lines used in this study. Interestingly, at 27 and 35 DAT, plots treated with the higher rates of mesotrione (420 and 840 g ai ha⁻¹) appeared to begin recovering from their injuries, but their NDVI values remained significantly lower than those of the control and the 105 g ai ha⁻¹ dose.
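The genotype main-effect comparison can be sketched in Python with SciPy's `tukey_hsd` (SciPy ≥ 1.8); the study itself ran a two-way ANOVA with Tukey HSD in R. The NDVI values below are invented for illustration, with S-1 depressed to mimic its susceptibility.

```python
from scipy import stats

# Hypothetical per-replicate mean NDVI values (invented, not trial data)
g1  = [0.85, 0.84, 0.86]  # tolerant genotype G-1
s1  = [0.70, 0.72, 0.71]  # susceptible genotype S-1
hyb = [0.85, 0.83, 0.84]  # commercial hybrid 84G62

res = stats.tukey_hsd(g1, s1, hyb)
# res.pvalue[i][j] holds the pairwise p-value: with these inputs,
# G-1 vs S-1 separates clearly, while G-1 vs 84G62 does not.
```

This mirrors the pattern reported above: the tolerant genotype grouped with the commercial hybrid while both separated from the susceptible genotype.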

Fig. 5

ANOVA results for main effect models, genotype, and rate. Means were separated using Tukey’s honest significant difference test at the 0.05 significance level, and error bars denote standard error of the mean.


5.

Discussion

UAS imagery has been used to monitor various agronomic traits, including plant response to stress.63 Supervised machine learning algorithms are commonly used to classify remote sensing imagery and have been shown to be highly accurate in terms of overall classification accuracies.24,74 Our current research shows that machine learning algorithms, most notably the SVM algorithm, could be an effective method of extracting VI values from grain sorghum treated with different rates of mesotrione (Table 3).

The key to this finding is the use of high-resolution UAS imagery collected under uniform lighting conditions. Within each plot, three classes were identified in each classification schema: leaves, shadows, and soil. A stark contrast in pixel brightness values among the classes allowed the SVM algorithm to segregate each class throughout the plots with high accuracy. As SVM and ML were not statistically different in terms of OA, relationships between the VI data and ground-measured injury would not be expected to differ statistically if ML were used instead of SVM for data extraction. We expect that this classification schema could be used in grain sorghum to assess injury from other HPPD-inhibitor herbicides, such as tembotrione, because similar classes would be expected regardless of the study location or genotypes. As only pixel-based classifications were studied, we did not compare these results with other methods of segregating vegetation from background noise, such as the Otsu algorithm75 for binary thresholding or object-based image analysis. Further research on vegetation classification in herbicide-tolerance screening should compare the classification accuracies of pixel-based, object-based, and binary-thresholding algorithms.

The classification method chosen for this study was supervised, which is often more accurate than unsupervised classification.27 One of the main benefits of unsupervised classification is that large areas of land can be classified in a very short time frame.76 In addition, as supervised classification requires a training dataset to be collected from the imagery itself, unsupervised algorithms bypass this step and can hasten the classification process.77 However, as demonstrated by our study, segregating sorghum vegetation in herbicide breeding trials does not require many classes or training samples, owing to the relative homogeneity of features within the field. Twenty training samples were used here, the recommended minimum sample size for parametric classifiers such as ML.78 It is entirely possible that an even smaller number of samples could be used with SVM, which is much less sensitive to background interference and can be applied when few training samples are available.79 As supervised classification algorithms performed well in this study, more research is needed comparing supervised with unsupervised classifications in herbicide-tolerance trials to determine whether statistically significant differences exist between their overall accuracies.

Using the NDVI values and an ANOVA, we observed that spectral values differed among genotypes with respect to mesotrione application rates. The ability to detect the spectral responses of plots treated with mesotrione could aid in quickly assessing the variable responses of multiple sorghum genotypes to mesotrione treatment. To be useful for sorghum breeders, differences in plant responses to mesotrione must be detectable by the camera and recoverable after data extraction. Even though EVI2 and ENDVI produced correlations similar to those of NDVI, we evaluated only NDVI, as it is one of the most widely used VIs.80 Future studies could improve on this work by testing other indices. Our data demonstrate that a multispectral camera can detect differences among genotype responses, which could tell breeders which genotypes are responding to herbicide treatment. This avoids a large-scale assessment of the entire operation, reducing the time, labor, and resources needed to achieve the desired objective.

It should be noted that all VIs showed the strongest correlations at 9 and 15 DAT and slightly weaker correlations on the remaining measurement days. This is most likely because reductions in chlorophyll due to herbicide injury were clearly visible on the canopy early in the experiment but were obscured by new growth as the growing season progressed. As a result, injury symptoms were still visible during ground truth ratings but were no longer visible to the camera. Because healthy plants absorb light in the visible spectrum and reflect NIR energy, higher NDVI values are observed for healthier plants, whereas lower NDVI values are recorded as plant health deteriorates.36 This explains the negative correlations, as plants injured by mesotrione applications experienced a loss of green pigmentation.81 In addition, bleaching symptoms caused by mesotrione were not directly classified with the machine learning algorithms because their spectral values were similar to those of soil. Instead, the reduction in leaf greenness proved sufficient for establishing correlations with ground truth injury ratings.
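The negative NDVI-injury relationship described above can be sketched directly: NDVI is computed from red and NIR reflectance, then correlated with visual injury ratings. The reflectance and injury values below are simulated assumptions (injured plants are assumed to reflect more red and less NIR), not data from this study.

```python
# Hypothetical sketch: computing NDVI and correlating it with visual
# herbicide injury ratings (0 = no injury, 100 = plant mortality).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Simulated plots: greater injury -> more red reflectance (pigment
# loss) and less NIR reflectance (canopy deterioration)
injury = rng.uniform(0, 80, 24)  # visual ratings per plot
red = 0.05 + 0.002 * injury + rng.normal(0, 0.005, 24)
nir = 0.50 - 0.004 * injury + rng.normal(0, 0.010, 24)

values = ndvi(nir, red)
r, p = pearsonr(values, injury)
print(f"r = {r:.2f}, p = {p:.3g}")  # expect a strong negative r
```

Under these assumptions the correlation is strongly negative, consistent with the sign of the r values (−0.83 to −0.94) reported for NDVI in this study.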

VIs have been successfully correlated with herbicide injury involving chlorophyll pigmentation loss in previous research.35,44,45,48 Our data are consistent with these findings: increases in foliar herbicide injury were strongly related to changes in VI scores. As a final recommendation for future studies, relationships between UAS data and other herbicide modes of action should be determined to investigate how useful UAS would be for assessing grain sorghum tolerance to other herbicides.

Herbicide injury to both weeds and crops in breeding trials is traditionally assessed with visual injury ratings, plant mortality, and biomass reductions, which can be prone to error due to variation among evaluators or too time-consuming for large-scale evaluations.50 Alternatively, UAS imagery can be collected in as little as 20 to 25 min, and although the data extraction methodology may appear lengthy, it can be completed in a short time once the process is established.

Limitations of this study that could be addressed in future work center on the limited number of herbicides and genotypes tested and on geographic restrictions. Because only one herbicide was used in this study, it remains to be determined whether tolerance or susceptibility to other herbicides could be detected with UAS. Mesotrione injury was readily detected due to foliar bleaching symptoms resulting from photo-oxidative destruction of chloroplasts. This produced stark differences between treated and untreated plants, which the camera could detect through differences in chlorophyll pigmentation. Because other herbicides (e.g., auxins) do not directly cause such stark pigmentation losses, the ability to detect sorghum responses to these herbicides remains to be determined.

An additional limitation involves the use of genotypes that were not bred similarly for cultivation. While Pioneer 84G62 is a commercially grown hybrid with desirable agronomic traits, the genotypes G-1 and S-1 come from a sorghum diversity panel and do not possess agronomically desirable traits. Another limitation is the use of only one year of data at one geographic location. This experiment was conducted in one field (Manhattan, Kansas) and has been replicated across only one year. The methodology remains to be tested across multiple years and locations to determine the robustness of detecting differences in mesotrione tolerance. Further studies should characterize UAS-based injury assessment across multiple locations and growing conditions.

Although significant spectral differences existed between certain genotypes and herbicide application rates, the differences were not as pronounced as would be expected between mesotrione-tolerant and -susceptible genotypes. A likely explanation lies in precipitation patterns. Rainfall was plentiful during the 2019 growing season in central Kansas, enabling adequate growth and development for all sorghum lines studied. This further demonstrates the need for additional replications across different geographic regions, as greater spectral differences between tolerant and susceptible genotypes could potentially be observed under different growing conditions.

6.

Conclusion

This study suggests a methodology to aid in image processing, data extraction, and statistical analysis for future research on multispectral evaluation of herbicide tolerance in grain sorghum. Additionally, we suggest that VIs (notably NDVI in this study) collected from high-resolution imagery, coupled with image classification, may be a useful tool for sorghum breeders to quantify the degree of mesotrione injury in grain sorghum. Across all measurement dates, a supervised, pixel-based SVM classifier was accurate and consistent in classifying imagery into sorghum leaves, shadows, and soil. The high correlation between VI data and ground-measured injury scores indicates that high-resolution multispectral imagery can detect reductions in chlorophyll due to mesotrione injury. As NDVI values were the most strongly correlated with herbicide injury, we have demonstrated that the most widely used VI in agriculture is capable of detecting these changes. Therefore, cost-effective NDVI sensors of varying types could potentially be used to measure mesotrione injury, or possibly other quantifiable herbicide injury symptoms. Further research is necessary to determine the ability of different sensors to measure such injury, as well as to compare other means of vegetation segmentation such as the Otsu algorithm or object-based image analysis. Additional research is also needed to determine whether high-resolution imagery can detect damage symptoms of herbicides with differing modes of action.

References

1. 

A. Stefoska-Needham et al., “Sorghum: an underutilized cereal whole grain with the potential to assist in the prevention of chronic disease,” Food Rev. Int., 31 (4), 401 –437 (2015). https://doi.org/10.1080/87559129.2015.1022832 FRINEL 8755-9129 Google Scholar

2. 

P. Ongom, “Association mapping of gene regions for drought tolerance and agronomic traits in sorghum,” Purdue University, West Lafayette, Indiana, (2015). https://docs.lib.purdue.edu/dissertations/AAI10190817/ Google Scholar

3. 

L. W. Rooney and J. M. Awika, “Overview of products and health benefits of specialty sorghums,” Cereal Food World, 50 (3), 109 –115 (2005). Google Scholar

4. 

J. R. N. Taylor et al., “Novel food and non-food uses for sorghum and millets,” J. Cereal Sci., 44 (3), 252 –271 (2006). https://doi.org/10.1016/j.jcs.2006.06.009 JCSCDA 1095-9963 Google Scholar

6. 

S. Staggenborg et al., “Grain sorghum and corn comparisons: yield, economic, and environmental responses,” Agron. J., 100 (6), 1 –21 (2008). https://doi.org/10.2134/agronj2008.0129 AGJOAT 0002-1962 Google Scholar

7. 

D. W. Brown et al., “Safening of metsulfuron grain sorghum injury with growth regulator herbicides,” Weed Sci., 52 (3), 319 –325 (2004). https://doi.org/10.1614/P2002-074 WEESA6 0043-1745 Google Scholar

8. 

M. J. M. Abit and K. Al-Khatib, “Absorption, translocation, and metabolism of mesotrione in grain sorghum,” Weed Sci., 57 (6), 563 –566 (2009). https://doi.org/10.1614/WS-09-041.1 WEESA6 0043-1745 Google Scholar

9. 

D. Regehr, “Weed control,” Grain Sorghum Production Handbook, 10 –11 Kansas State University Agricultural Experiment Station and Cooperative Extension Service, Manhattan, Kansas (1998). Google Scholar

10. 

G. Mitchell et al., “Mesotrione: a new selective herbicide for use in maize,” Pest Manage. Sci., 57 (2), 120 –128 (2001). https://doi.org/10.1002/1526-4998(200102)57:2<120::AID-PS254>3.0.CO;2-E Google Scholar

11. 

M. J. M. Abit et al., “Effect of postemergence mesotrione application timing on grain sorghum,” Weed Technol., 24 (2), 85 –90 (2010). https://doi.org/10.1614/WT-09-067.1 WETEE9 Google Scholar

12. 

V. P. Singh et al., “Bioefficacy of tembotrione against mixed weed complex in maize,” Indian J. Weed Sci., 44 (1), 1 –5 (2012). Google Scholar

13. 

D. O. Stephenson et al., “Weed management in corn with postemergence applications of tembotrione or thiencarbazone: tembotrione,” Weed Technol., 29 (3), 350 –358 (2015). https://doi.org/10.1614/WT-D-14-00104.1 WETEE9 Google Scholar

14. 

M. M. Williams and J. K. Pataky, “Genetic basis of sensitivity in sweet corn to tembotrione,” Weed Sci., 56 (3), 364 –370 (2008). https://doi.org/10.1614/WS-07-149.1 WEESA6 0043-1745 Google Scholar

15. 

K. T. Horky and A. R. Martin, “Evaluation of preemergence weed control programs in grain sorghum,” 30 (2005) http://ncwss.org/proceed/2005/ResRep05/table_contents.html Google Scholar

16. 

M. Jugulam et al., “Characterization of HPPD-inhibitor-tolerant sorghum genotypes,” in Sorghum in the 21st Century, (2017). Google Scholar

17. 

R. G. Leon and B. L. Tillman, “Postemergence herbicide tolerance variation in peanut germplasm,” Weed Sci., 63 (2), 546 –554 (2015). https://doi.org/10.1614/WS-D-14-00128.1 WEESA6 0043-1745 Google Scholar

18. 

A. S. Felsot et al., “Biomonitoring with sentinel plants to assess exposure of nontarget crops to atmospheric deposition of herbicide residues,” Environ. Toxicol. Chem., 15 (4), 452 –459 (1996). https://doi.org/10.1002/etc.5620150407 ETOCDK 0730-7268 Google Scholar

19. 

W. B. Sea, N. Sykes and P. O. Downey, “An assessment of the physiological response of weeds to herbicide application,” Plant Prot. Q., 28 132 –138 (2013). PPQUE8 Google Scholar

20. 

J. F. Weber et al., “Utilization of chlorophyll fluorescence imaging technology to detect plant injury by herbicides in sugar beet and soybean,” Weed Technol., 31 (4), 523 –535 (2017). https://doi.org/10.1017/wet.2017.22 WETEE9 Google Scholar

21. 

C. V. M. Barton, “Advances in remote sensing of plant stress,” Plant Soil, 354 41 –44 (2012). https://doi.org/10.1007/s11104-011-1051-0 Google Scholar

22. 

A. Singh et al., “Machine learning for high-throughput stress phenotyping in plants,” Trends Plant Sci., 21 (6), 110 –124 (2016). https://doi.org/10.1016/j.tplants.2015.10.015 Google Scholar

23. 

J. M. Peña et al., “Object-based image classification of summer crops with machine learning methods,” Remote Sens., 6 (6), 5019 –5041 (2014). https://doi.org/10.3390/rs6065019 Google Scholar

24. 

J. R. Otukei and T. Blaschke, “Land cover change assessment using decision trees, support vector machines and maximum likelihood classification algorithms,” Int. J. Appl. Earth Obs., 12 (1), S27 –S31 (2010). https://doi.org/10.1016/j.jag.2009.11.002 Google Scholar

25. 

A. Smith, “Image segmentation scale parameter optimization and land cover classification using the random forest algorithm,” J. Spatial Sci., 55 (1), 69 –79 (2010). https://doi.org/10.1080/14498596.2010.487851 Google Scholar

26. 

L. Thai, T. S. Hai and N. T. Thuy, “Image classification using support vector machine and artificial neural network,” Int. J. Inf. Technol. Comput. Sci., 4 (5), 32 –38 (2012). https://doi.org/10.5815/ijitcs.2012.05.05 Google Scholar

27. 

M. D. Bah, A. Hafiane and R. Canals, “Deep learning with unsupervised data labeling for weed detection in line crops in UAV images,” Remote Sens., 10 (11), 1690 (2018). https://doi.org/10.3390/rs10111690 Google Scholar

28. 

M. Kulbacki et al., “Survey of drones for agriculture automation from planting to harvest,” in Proc. 2018 IEEE 22nd Int. Conf. Intell. Eng. Syst., 353 –358 (2018). https://doi.org/10.1109/INES.2018.8523943 Google Scholar

29. 

A. S. Milas et al., “The importance of leaf area index in mapping chlorophyll content of corn under different agricultural treatments using UAV images,” Int. J. Remote Sens., 39 (15–16), 5415 –5431 (2018). https://doi.org/10.1080/01431161.2018.1455244 IJSEDK 0143-1161 Google Scholar

30. 

R. Näsi et al., “Assessment of various remote sensing technologies in biomass and nitrogen content estimation using an agricultural test field,” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-3/W3 137 –141 (2017). https://doi.org/10.5194/isprs-archives-XLII-3-W3-137-2017 1682-1750 Google Scholar

31. 

A. I. de Castro et al., “An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery,” Remote Sens., 10 (2), 285 (2018). https://doi.org/10.3390/rs10020285 Google Scholar

32. 

F. López-Granados et al., “Using remote sensing for identification of late-season grass weed patches in wheat,” Weed Sci., 54 (2), 346 –353 (2006). https://doi.org/10.1614/WS-05-54.2.346 WEESA6 0043-1745 Google Scholar

33. 

J. Rasmussen et al., “Potential uses of small unmanned aircraft systems (UAS) in weed research,” Weed Res., 53 (4), 242 –248 (2013). https://doi.org/10.1111/wre.12026 WEREAT 1365-3180 Google Scholar

34. 

J. Torres-Sánchez et al., “Configuration and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management,” PLoS One, 8 (3), e58210 (2013). https://doi.org/10.1371/journal.pone.0058210 POLNCL 1932-6203 Google Scholar

35. 

D. Dicke, J. Jacobi and A. Buchse, “Quantifying herbicide injuries in maize by use of remote sensing,” in Proc. 25th German Conf. Weed Biol. and Weed Control, 199 –205 (2012). https://doi.org/10.5073/jka.2012.434.024 Google Scholar

36. 

W. B. Henry et al., “Remote sensing to detect herbicide drift on crops,” Weed Technol., 18 (2), 358 –368 (2004). https://doi.org/10.1614/WT-03-098 WETEE9 Google Scholar

37. 

M. Louargant et al., “Weed detection by UAV: simulation of the impact of spectral mixing in multispectral images,” Precis. Agric., 18 (6), 932 –951 (2017). https://doi.org/10.1007/s11119-017-9528-3 Google Scholar

38. 

J. M. Prince Czarnecki et al., “Applications of unmanned aerial vehicles in weed science,” in Proc. 11th Eur. Conf. Precis. Agric., 807 –811 (2017). https://doi.org/10.1017/S2040470017001339 Google Scholar

39. 

Y. Huang et al., “Assessment of soybean injury from glyphosate using airborne multispectral remote sensing,” Pest Manage. Sci., 71 (4), 545 –552 (2015). https://doi.org/10.1002/ps.3839 Google Scholar

40. 

Y. Huang et al., “Detection of crop herbicide injury through plant hyperspectral remote sensing of chlorophyll fluorescence,” in IEEE Int. Geosci. and Remote Sens. Symp., 5069 –5072 (2017). https://doi.org/10.1109/IGARSS.2017.8128142 Google Scholar

41. 

Y. Huang et al., “Ground-based hyperspectral remote sensing for weed management in crop production,” Int. J. Agric. Biol. Eng., 9 (2), 98 –109 (2016). https://doi.org/10.3965/j.ijabe.20160902.2137 Google Scholar

42. 

Y. Huang et al., “In-situ plant hyperspectral sensing for early detection of soybean injury from dicamba,” Biosyst. Eng., 149 51 –59 (2016). https://doi.org/10.1016/j.biosystemseng.2016.06.013 Google Scholar

43. 

K. D. Thelen, A. N. Kravchenko and C. D. Lee, “Use of optical remote sensing for detecting herbicide injury in soybean,” Weed Technol., 18 (2), 292 –297 (2004). https://doi.org/10.1614/WT-03-049R2 WETEE9 Google Scholar

44. 

Y. Huang et al., “Airborne remote sensing assessment of the damage to cotton caused by spray drift from aerially applied glyphosate through spray deposition measurements,” Biosyst. Eng., 107 (3), 212 –220 (2010). https://doi.org/10.1016/j.biosystemseng.2010.08.003 Google Scholar

45. 

M. Pause et al., “Monitoring glyphosate-based herbicide treatment using Sentinel-2 time series—a proof-of-principle,” Remote Sens., 11 (21), 2541 (2019). https://doi.org/10.3390/rs11212541 Google Scholar

46. 

W. Robles, J. D. Madsen and R. M. Wersal, “Potential for remote sensing to detect and predict herbicide injury on waterhyacinth (Eichhornia crassipes),” Invas. Plant Sci. Mana., 3 (4), 440 –450 (2010). https://doi.org/10.1614/IPSM-D-09-00040.1 Google Scholar

47. 

W. B. Henry et al., “Remote sensing to distinguish soybean from weeds after herbicide application,” Weed Technol., 18 (3), 594 –604 (2004). https://doi.org/10.1614/WT-03-097R WETEE9 Google Scholar

48. 

H. S. Duddu et al., “High-throughput UAV image-based method is more precise than manual rating of herbicide tolerance,” Plant Phenom., 2019 6036453 (2019). https://doi.org/10.34133/2019/6036453 Google Scholar

49. 

I. H. Barnhart, “High-resolution UAS multispectral imaging for cultivar selection in grain sorghum breeding trials,” Kansas State University, Manhattan, Kansas, (2020). https://krex.k-state.edu/dspace/handle/2097/40848 Google Scholar

50. 

A. R. da Silva et al., “Proximal sensing estimation of glyphosate injury on weeds in central Brazil,” J. Appl. Remote Sens., 13 (4), 044524 (2019). https://doi.org/10.1117/1.JRS.13.044524 Google Scholar

51. 

M. Reynolds and P. Langridge, “Physiological breeding,” Curr. Opin. Plant Biol., 31 162 –171 (2016). https://doi.org/10.1016/j.pbi.2016.04.005 Google Scholar

52. 

A. Mahlein, “Plant disease detection by imaging sensors—parallels and specific demands for precision agriculture and plant phenotyping,” Plant Dis., 100 (2), 241 –251 (2016). https://doi.org/10.1094/PDIS-03-15-0340-FE Google Scholar

53. 

Micasense, https://micasense.com/ (accessed February 2020). Google Scholar

54. 

J. Li et al., “Elucidating sorghum biomass, nitrogen and chlorophyll contents with spectral and morphological traits derived from unmanned aircraft system,” Front. Plant Sci., 9 1406 (2018). https://doi.org/10.3389/fpls.2018.01406 FOPSAC 0016-2167 Google Scholar

55. 

F. H. Holman et al., “High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing,” Remote Sens., 8 (12), 1031 (2016). https://doi.org/10.3390/rs8121031 Google Scholar

56. 

J. Gago et al., “UAVs challenge to assess water stress for sustainable agriculture,” Agric. Water Manage., 153 9 –19 (2015). https://doi.org/10.1016/j.agwat.2015.01.020 AWMADF 0378-3774 Google Scholar

57. 

J. Rouse et al., “Monitoring vegetation systems in the great plains with ERTS,” in Proc. NASA Goddard Space Flight Center 3d ERTS-1 Symp., 309 –317 (1974). Google Scholar

58. 

J. Rasmussen et al., “Are vegetation indices derived from consumer-grade cameras mounted on UAVs sufficiently reliable for assessing experimental plots?,” Eur. J. Agron., 74 75 –92 (2016). https://doi.org/10.1016/j.eja.2015.11.026 EJAGET Google Scholar

59. 

I. A. Garcia, “UAS multispectral imaging for detecting plant stress due to iron chlorosis in grain sorghum,” Texas A&M University – Corpus Christi, Corpus Christi, Texas, (2018). https://tamucc-ir.tdl.org/handle/1969.6/87102 Google Scholar

60. 

P. Stenberg et al., “Reduced simple ratio better than NDVI for estimating LAI in Finnish pine and spruce stands,” Silva Fenn., 38 (1), 3 –14 (2004). https://doi.org/10.14214/sf.431 Google Scholar

61. 

Z. Jiang et al., “Development of a two-band enhanced vegetation index without a blue band,” Remote Sens. Environ., 112 (10), 3833 –3845 (2008). https://doi.org/10.1016/j.rse.2008.06.006 Google Scholar

62. 

J. Y. Tay, A. Erfmeier and J. M. Kalwij, “Reaching new heights: can drones replace current methods to study plant population dynamics?,” Plant Ecol., 219 1139 –1150 (2018). https://doi.org/10.1007/s11258-018-0865-8 Google Scholar

63. 

R. Makanza et al., “High-throughput phenotyping of canopy cover and senescence in maize field trials using aerial digital canopy imaging,” Remote Sens., 10 (2), 330 (2018). https://doi.org/10.3390/rs10020330 Google Scholar

64. 

T. M. Tate et al., “Evaluation of mesotrione tolerance levels and [C14]mesotrione absorption and translocation in three fine fescue species,” Weed Sci., 67 (5), 497 –503 (2019). https://doi.org/10.1017/wsc.2019.39 WEESA6 0043-1745 Google Scholar

65. 

G. M. Foody, “Local characterization of thematic classification accuracy through spatially constrained confusion matrices,” Int. J. Remote Sens., 26 (6), 1217 –1228 (2005). https://doi.org/10.1080/01431160512331326521 IJSEDK 0143-1161 Google Scholar

66. 

H. G. Lewis and M. Brown, “A generalized confusion matrix for assessing area estimates from remotely sensed data,” Int. J. Remote Sens., 22 (16), 3223 –3235 (2001). https://doi.org/10.1080/01431160152558332 IJSEDK 0143-1161 Google Scholar

67. 

R Core Team, “R: A language and environment for statistical computing,” (2020) https://www.R-project.org/ Google Scholar

68. 

S. D. Wright et al., “Glufosinate safety in WideStrike® Acala cotton,” Weed Technol., 28 (1), 104 –110 (2014). https://doi.org/10.1614/WT-D-13-00039.1 WETEE9 Google Scholar

69. 

W. H. Kruskal and W. A. Wallis, “Use of ranks in one-criterion variance analysis,” J. Am. Stat. Assoc., 47 583 –621 (1952). https://doi.org/10.1080/01621459.1952.10483441 Google Scholar

70. 

O. J. Dunn, “Multiple comparisons using rank sums,” Technometrics, 6 (3), 241 –252 (1964). https://doi.org/10.1080/00401706.1964.10490181 TCMTA2 0040-1706 Google Scholar

71. 

L. Han et al., “Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data,” Plant Methods, 15 10 (2019). https://doi.org/10.1186/s13007-019-0394-z Google Scholar

72. 

M. A. Hassan et al., “Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat,” Remote Sens., 10 (6), 809 (2018). https://doi.org/10.3390/rs10060809 Google Scholar

73. 

M. Schirrmann et al., “Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery,” Remote Sens., 8 (9), 706 (2016). https://doi.org/10.3390/rs8090706 Google Scholar

74. 

P. T. Noi and M. Kappas, “Comparison of random forest, k-nearest neighbor, and support vector machine classifiers for land cover classification using Sentinel-2 imagery,” Sensors (Basel), 18 (1), 18 (2017). https://doi.org/10.3390/s18010018 Google Scholar

75. 

N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Trans. Syst. Man. Cybern. Syst., 9 (1), 62 –66 (1979). https://doi.org/10.1109/TSMC.1979.4310076 Google Scholar

76. 

D. I. M. Enderle and R. C. Weih, “Integrating supervised and unsupervised classification methods to develop a more accurate land cover classification,” J. Ark. Acad. Sci., 59 65 –73 (2005). Google Scholar

80. 

H. Fang, S. Liang, “Leaf area index models,” Reference Module in Earth Systems and Environmental Sciences, 2139 –2148 Elsevier, Amsterdam, Netherlands (2008). Google Scholar

81. 

J. D. McCurdy et al., “Effects of mesotrione on perennial ryegrass (Lolium perenne L.) carotenoid concentrations under varying environmental conditions,” J. Agric. Food Chem., 56 (19), 9133 –9139 (2008). https://doi.org/10.1021/jf801574u JAFCAU 0021-8561 Google Scholar

Biography

Isaac Barnhart received his MS degree in agronomy from Kansas State University (KSU), focusing on using unmanned aerial vehicles for cultivar selection in grain sorghum breeding trials. He is currently a PhD student in agronomy (weed science) at Kansas State, focusing on using deep learning object detection algorithms for weed detection in cropping systems.

Sushila Chaudhari is an assistant professor in the Department of Horticulture at Michigan State University. Her program consists of developing weed management programs for edible specialty crops. Before joining MSU, she worked in a joint position with North Carolina State University as a post-doctoral research scholar and with KSU as a visiting scholar. She has authored/co-authored 40 refereed journal articles and more than 35 abstracts and presentations.

Balaji A. Pandian obtained his PhD in agronomy (weed science) from KSU with research on understanding the genetic basis of HPPD-inhibitor resistance in grain sorghum. He is currently a postdoctoral researcher at KSU. His research interests include the genetic and physiological basis of herbicide resistance in crops and weeds.

P. V. Vara Prasad is a university distinguished professor and director of the Sustainable Intensification Innovation Lab at KSU. His research focuses on understanding responses of crops to changing environments; developing best management strategies to improve and protect yields; and improving livelihoods of people and providing food and nutritional security to smallholder farmers. He obtained his BS and MS degrees from Andhra Pradesh Agricultural University (India) and PhD from the University of Reading (United Kingdom).

Ignacio A. Ciampitti received his BS and MS degrees from the University of Buenos Aires, Argentina, and his PhD in agronomy from Purdue University in 2012. Currently, he is an associate professor in the Department of Agronomy and director for the Digital Tools, Geospatial and Farming Systems Consortium, SIIL at KSU. His research focuses on the integration of the main mechanisms linked to plant responses to the environment with remote sensing and modeling.

Mithila Jugulam is currently a professor at KSU with research and teaching responsibilities in weed physiology. She received her PhD and postdoctoral training at the University of Guelph, Canada. Her current research focuses toward understanding the mechanisms and genetic basis of herbicide resistance in weeds. She teaches courses related to herbicide interactions and weed physiology at KSU. She serves as an associate editor of Pest Management Science and Weed Science journals.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Isaac Barnhart, Sushila Chaudhari, Balaji A. A. Pandian, P. V. Vara Prasad, Ignacio A. Ciampitti, and Mithila Jugulam "Use of high-resolution unmanned aerial systems imagery and machine learning to evaluate grain sorghum tolerance to mesotrione," Journal of Applied Remote Sensing 15(1), 014516 (6 March 2021). https://doi.org/10.1117/1.JRS.15.014516
Received: 30 November 2020; Accepted: 22 February 2021; Published: 6 March 2021
