The temperature of the plant canopy is closely related to its transpiration status and, therefore, to its stomatal conductance and cooling capacity. Ear and leaf temperatures can provide useful information for monitoring crop water status, irrigation management and yield assessment. Previous studies have shown differences in temperature between ears and leaves, with ears typically warmer than leaves. By employing a high-resolution thermal radiometric camera for proximal imaging, these temperature differences can be used for segmentation as well as for temperature estimation. This work uses thermal images taken from above the canopy at distances of between 0.8 and 1 m. Measurements were acquired after solar noon. The field trials were carried out at three experimental sites over two crop seasons in Spain: Aranjuez (2016/2017), Sevilla (2015/2016) and Valladolid (2016/2017). A set of 24 varieties of durum wheat grown under two conditions, irrigated and rainfed, was used to build the thermal imagery database. The algorithm uses a pipeline that filters out low temperatures and enhances local contrast in order to segment the ear regions in each thermal image. Finally, using the full thermal radiometric information, the algorithm provides the temperature of each automatically detected ear. The results show high correlations between ear temperatures measured manually (using the thermal camera software) and those obtained automatically by the image processing pipeline.
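The core idea, that ears stay warmer than transpiring leaves and can therefore be segmented by temperature, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the percentile-based floor, the median-filter cleanup and the function name are assumptions of the sketch.

```python
import numpy as np
from scipy import ndimage

def segment_ears(thermal, temp_floor=None, percentile=75):
    """Segment warm, ear-like regions in a radiometric thermal image.

    thermal: 2-D array of temperatures (degrees C), one value per pixel.
    Returns a label image of candidate ear regions and the mean
    temperature of each region.
    """
    # 1. Filter out low temperatures: transpiring leaves stay cooler
    #    than ears, so keep only the warm tail of the distribution.
    floor = np.percentile(thermal, percentile) if temp_floor is None else temp_floor
    warm = thermal >= floor
    # 2. Clean the binary mask with a small median filter (stand-in for
    #    the local-contrast enhancement step described in the text).
    warm = ndimage.median_filter(warm, size=3)
    # 3. Label connected warm regions: each is a candidate ear.
    labels, n = ndimage.label(warm)
    # 4. Mean temperature per detected region, from the full
    #    radiometric information.
    if n == 0:
        return labels, np.array([])
    temps = ndimage.labeled_comprehension(
        thermal, labels, range(1, n + 1), np.mean, float, np.nan)
    return labels, temps
```

On a synthetic frame with a 25 °C canopy and a 32 °C warm patch, `segment_ears(thermal, temp_floor=30.0)` returns a single region with a mean of 32 °C.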
The number of ears per unit ground area, or ear density, is in most cases the main agronomic yield component of wheat. A fast evaluation of this attribute may contribute to crop monitoring and improve the efficiency of crop management practices as well as breeding programs. Currently, the number of ears is counted manually, which is time consuming. This work uses zenithal RGB images taken from above the crop canopy in natural light and field conditions. Wheat trials were carried out at two sites (Aranjuez and Valladolid, Spain) during the 2014/2015 crop season. A set of 24 varieties of durum wheat in two growing conditions, with three dates of measurement, was used to create the image database. The algorithm for ear counting uses three steps: (i) Laplacian frequency filter, (ii) median filter and (iii) Find Maxima. Although the image database was collected at ground level, we simulated images at lower resolutions in order to test potential applications with lower-resolution cameras, such as mobile phones, action cameras (5–12 megapixels), or even aerial platforms (e.g. UAVs flying at 25–50 m). Images were resized to five different resolutions with no interpolation techniques applied. The results demonstrate high agreement between the algorithm counts and the manual (image-based) ear counts, with a success rate higher than 90%, a decrease of less than 1% when images were reduced to half their original size, and success rates decreasing by 2.29%, 7.32%, 17.32% and 38.82% for images downscaled by factors of four, eight, 16 and 32, respectively.
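The three-step counting pipeline can be sketched as below. This is a hedged approximation: the paper's Find Maxima step is ImageJ's plugin, which is emulated here with a maximum-filter peak test, and the filter sizes and noise threshold are assumptions of the sketch, not the published parameters.

```python
import numpy as np
from scipy import ndimage

def count_ears(gray, filter_size=3, min_peak=0.01):
    """Count bright blob-like objects (ears) in a grayscale image.

    Sketch of the three-step pipeline:
    (i)   Laplacian (high-pass) frequency filter to highlight small
          bright blobs such as ears,
    (ii)  median filter to suppress high-frequency noise,
    (iii) local-maxima detection ("Find Maxima") to count one peak
          per candidate ear.
    """
    # (i) Laplacian filter; negated so bright blobs become positive peaks.
    response = -ndimage.laplace(gray.astype(float))
    # (ii) Median filter removes salt-and-pepper noise in the response.
    response = ndimage.median_filter(response, size=filter_size)
    # (iii) A pixel is a peak if it equals the local maximum and exceeds
    # a noise threshold (analogous to ImageJ's noise tolerance).
    local_max = ndimage.maximum_filter(response, size=filter_size) == response
    peaks = local_max & (response > min_peak)
    # Connected peak pixels (plateaus) are merged and counted once.
    _, n = ndimage.label(peaks)
    return n
```

On a synthetic image containing two Gaussian blobs, `count_ears` returns 2.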
Canopy cover is an important agronomical component for determining grain yield in cereals. Estimates of the canopy cover area of crops may contribute to improving the efficiency of crop management practices and breeding programs. Conventional high-resolution RGB cameras can be used to acquire zenithal images at ground level or from a UAV (Unmanned Aerial Vehicle). Canopy-image segmentation is complicated in field conditions by numerous factors, including soil, shadows and unexpected objects. Spatial resolution is a key factor for estimating canopy cover area because low spatial resolution may introduce artifacts into the digital image. We propose a comparison of canopy cover segmentation at different spatial resolutions to test the scalability potential of these techniques. Field trials were carried out during the 2015/2016 crop season at the Arazuri experimental station of INTIA in Navarra, Spain. Three barley genotypes, 10 different N fertilization regimens and three replicates were used in this study. This work uses zenithal RGB images taken at ground level, 1 m above the canopy, and images taken from a UAV at 2 s intervals during flights at altitudes of 25, 50 and 100 m. The CerealScanner plugin for FIJI (Fiji Is Just ImageJ) was used to calculate the BreedPix RGB vegetation indices. The comparative results demonstrate the algorithm's effectiveness in scaling through high correlation values between images with different spatial resolutions taken from the UAV and images taken from the ground.
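A canopy cover fraction of the kind produced by the BreedPix indices can be approximated by classifying pixels by hue, as sketched below. The 60–180° green hue band is an assumption of this sketch (BreedPix's Green Area index uses a fixed green band, but the exact limits should be checked against the software); the function name is hypothetical.

```python
import numpy as np

def green_area(rgb, hue_min=60.0, hue_max=180.0):
    """Fraction of green (canopy) pixels in an RGB image.

    Approximates a Green Area (GA)-style index: the proportion of
    pixels whose hue falls inside a green band (degrees).
    rgb: H x W x 3 array with channel values in [0, 1].
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    maxc = np.maximum(np.maximum(r, g), b)
    minc = np.minimum(np.minimum(r, g), b)
    delta = maxc - minc
    hue = np.zeros_like(maxc)
    # Standard RGB -> hue conversion, vectorised over the whole image;
    # achromatic pixels (delta == 0) keep hue 0 and count as non-green.
    chrom = delta > 0
    rmax = chrom & (maxc == r)
    gmax = chrom & (maxc == g) & ~rmax
    bmax = chrom & ~rmax & ~gmax
    hue[rmax] = (60 * (g - b)[rmax] / delta[rmax]) % 360
    hue[gmax] = 60 * (b - r)[gmax] / delta[gmax] + 120
    hue[bmax] = 60 * (r - g)[bmax] / delta[bmax] + 240
    return float(np.mean((hue >= hue_min) & (hue <= hue_max)))
```

A pure-green image (hue 120°) gives a cover fraction of 1.0, while a pure-red (soil-like, hue 0°) image gives 0.0.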
Extreme and abnormal weather events, as well as the more gradual meteorological changes associated with climate change, often coincide with not only increased abiotic risks (such as increases in temperature and decreases in precipitation), but also increased biotic risks due to environmental conditions that favor the rapid spread of crop pests and diseases. Durum wheat is by far the most cultivated cereal in the southern and eastern margins of the Mediterranean Basin. It is of strategic importance for Mediterranean agriculture to develop new varieties of durum wheat with greater production potential, better adaptation to increasingly adverse environmental conditions (drought) and better grain quality. Similarly, maize is the top staple crop for low-income populations in Sub-Saharan Africa and is currently suffering from the appearance of new diseases, which, together with increased abiotic stresses from climate change, are challenging the very sustainability of African societies. Current constraints in field phenotyping remain a major bottleneck for future breeding advances, but RGB-based High-Throughput Phenotyping Platforms (HTPPs) have shown promise for rapidly developing both disease-resistant and weather-resilient crops. RGB cameras have proven cost-effective in studies assessing the effects of abiotic stresses, but have yet to be fully exploited to phenotype disease resistance. Recent analyses of durum wheat in Spain have shown RGB vegetation indices to consistently outperform multispectral indices such as NDVI in disease and yield prediction. Towards HTPP development for breeding maize disease resistance, some of the same RGB vegetation indices outperformed NDVI (Normalized Difference Vegetation Index), with R<sup>2</sup> values up to 0.65, compared to 0.56 for NDVI.
Specifically, hue, a*, u*, and Green Area (GA), as produced by the FIJI and BreedPix open source software, performed similarly to or better than NDVI in predicting yield and disease severity in wheat and maize. Studies using UAVs (Unmanned Aerial Vehicles) have produced similar results, demonstrating the robust strengths, as well as the limitations, of the more cost-effective RGB indices.
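Colour-space indices such as a* are simple scalar summaries of leaf colour (negative a* indicates a greener, healthier canopy). A minimal sketch of how a mean a* index can be derived from an RGB image is shown below; it assumes sRGB input and the D65 white point, and the function name is hypothetical, not BreedPix's API.

```python
import numpy as np

def mean_a_star(rgb):
    """Mean CIELAB a* over an image (negative a* = greener canopy).

    rgb: H x W x 3 array in [0, 1], assumed to be sRGB-encoded.
    """
    # sRGB -> linear RGB (inverse gamma)
    lin = np.where(rgb <= 0.04045, rgb / 12.92,
                   ((rgb + 0.055) / 1.055) ** 2.4)
    # linear RGB -> XYZ (standard sRGB/D65 matrix)
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ m.T
    # Normalise by the D65 reference white
    xyz = xyz / np.array([0.95047, 1.0, 1.08883])
    # XYZ -> Lab, with the standard linear segment near zero
    eps, kappa = 216 / 24389, 24389 / 27
    f = np.where(xyz > eps, np.cbrt(xyz), (kappa * xyz + 16) / 116)
    a_star = 500 * (f[..., 0] - f[..., 1])
    return float(a_star.mean())
```

As a sanity check, a pure-green image yields a strongly negative mean a*, while a pure-red image yields a positive one.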