Low-cost multispectral imaging for remote sensing of lettuce health (11 January 2017)
J. of Applied Remote Sensing, 11(1), 016006 (2017). doi:10.1117/1.JRS.11.016006
In agricultural remote sensing, unmanned aerial vehicle (UAV) platforms offer many advantages over conventional satellite and full-scale airborne platforms. One of the most important advantages is their ability to capture high spatial resolution images (1–10 cm) on-demand and at different viewing angles. However, UAV platforms typically rely on the use of multiple cameras, which can be costly and difficult to operate. We present the development of a simple low-cost imaging system for remote sensing of crop health and demonstrate it on lettuce (Lactuca sativa) grown in Hong Kong. To identify the optimal vegetation index, we recorded images of both healthy and unhealthy lettuce, and used them as input in an expectation maximization cluster analysis with a Gaussian mixture model. Results from unsupervised and supervised clustering show that, among four widely used vegetation indices, the blue wide-dynamic range vegetation index is the most accurate. This study shows that it is readily possible to design and build a remote sensing system capable of determining the health status of lettuce at a reasonably low cost (<US$100). When combined with recent advances in UAV technology, this system could lead to yield increases for lettuce growers.



Introduction

To ensure that world food production can rise by 70% by 2050 to match our growing population, it is crucial that advances in crop-monitoring technology be made accessible and affordable to as many farmers as possible.1 Satellite imaging methods have become increasingly sophisticated over the last few decades, but they still tend to suffer from inadequate image resolution, long satellite revisit intervals, and obstructions from the local weather.2 Full-scale airborne imaging via airplanes and helicopters typically relies on multicamera/multispectral systems, which can be costly and difficult to operate, inhibiting their widespread adoption, particularly by farmers in developing countries.3 In this paper, a simple low-cost alternative is proposed in the form of a commercially available digital camera converted for use in remote sensing of crop health. This system works by detecting spectral reflectance changes in crops and using these as input in an expectation maximization (EM) cluster analysis (with a Gaussian mixture model) to generate early warning indicators of crop stress.4 The system is tested on lettuce grown in Hong Kong and is shown to be capable of discerning between healthy and unhealthy lettuce, thus providing lettuce growers with potential access to precision agriculture at a reasonably low cost (<US$100).

Increased crop stress is known to decrease photosynthetic activity in the vegetative canopy.5 Chlorophyll pigments are the main determinants of vegetation reflectance.6 For many green vegetables, a deterioration of plant health leads to reflectance decreases in the near-infrared (NIR) range of the electromagnetic spectrum and to reflectance increases in the visible range.7 Vegetation indices that capture these contrasting NIR and visible wavelengths, such as the normalized difference vegetation index (NDVI), have been successfully used to discern between healthy and unhealthy crops.8 However, surprisingly little work has been done on lettuce (Lactuca sativa), with previous studies having relied on labor-intensive and costly spectroradiometers9 and artificial neural networks,10 or having focused on green coverage rather than health.11

Recent advances in unmanned aerial vehicle (UAV) technology allow for remote sensing of vegetation at high spatial resolutions (1 to 10 cm) throughout much of the growing season.12,13 The high spatial resolution and weather insensitivity of these UAV platforms make them well suited for diagnosing the health of individual plants.14 When equipped with autonomous navigation systems, these UAV platforms can operate independently with minimal pilot or user input, transmitting data back to a central base station for real-time processing and analysis.15,16 This enables ameliorating measures to be customized on a plant-by-plant basis and then deployed at specific times, reducing in-field crop spoilage.17 Similarly, the knowledge that a particular plant is sufficiently healthy prevents over-irrigation and the overspray of fertilizers and pesticides. Thus, with UAV-based platforms, human interactions with the land (i.e., irrigation, fertilization, and pesticides application) would not only be better optimized in terms of volume, frequency, and costs, but it would also be possible to target individual plants rather than the field as a whole.18

In this study, a low-cost imaging system is developed and tested for low-altitude UAV-based remote sensing of lettuce health. The hardware is built around a commercially available digital camera that uses a charge-coupled device to capture crop-reflectance images via an optical filter. Once captured, the images are analyzed using an unsupervised EM clustering algorithm with a Gaussian mixture model. Section 2 describes the data collection process, camera conversion, selection of optical filters, and selection of vegetation indices for detection of lettuce health. Section 3 compares the performance of four vegetation indices at two camera angles (ground-level versus oblique views) and evaluates the accuracy of the clustering algorithm used to discern healthy from unhealthy lettuce. Section 4 summarizes the key findings of this study and provides an outlook for future improvements and potential applications.


Experimental Methodology


Field and Crop Selection

Experiments were conducted under outdoor field conditions at Eco-Park, Tseung Kwan O, Hong Kong, from January to June 2016 (Fig. 1). Lettuce (Lactuca sativa) was selected as the test crop because of its prevalence throughout Asia, its relative ease of growth, and its potential role as a high-yield food source.19 Lettuce was also selected for its ability to showcase the high spatial resolution of the proposed imaging system. This plant has a broad leaf area when viewed from above, with growers tending to maintain a distinct separation between individual lettuce heads.20

Fig. 1

Satellite view of the region around Eco-Park in Hong Kong, with the red circle indicating the precise location of the lettuce plot. The inset shows the position of Eco-Park within Hong Kong. Photo credit: Google Earth.


Lettuce seedlings were germinated for 4 weeks and then planted into a 1.5 m × 1 m sandy loam plot at Eco-Park, with a planting density of 9 plants per square meter. The lettuce grew under natural illumination, facing natural weather conditions in an outdoor environment, reproducing the conditions under which real crops grow. The lettuce was subjected to two conditions: optimal healthiness (8 plants) and suboptimal healthiness (11 plants). The former condition, which will be referred to as “healthy,” was achieved with an irrigation level of 100% (i.e., watering every 2 days) and fertilization every 10 days with a peanut-based fertilizer containing nitrogen, protein, phosphorus, potassium, and calcium. The latter condition, which will be referred to as “unhealthy,” was achieved with an irrigation level of 50% (i.e., watering every 4 days) and no fertilizer. The first images were collected 120 days after initial planting.


Camera Conversion and Optical Filter Selection

Lettuce reflectance measurements were obtained by converting a commercially available point-and-shoot digital camera (Canon PowerShot A2200: 1/2.3-inch CCD sensor with 4320 × 3240 pixels) into a multispectral imager sensitive to both visible and NIR light. For agricultural health monitoring, red wavelengths (640 nm) are generally preferred over blue because of (i) their sensitivity to changes in plant health, (ii) the stronger absorption of blue light by chlorophyll and anthocyanin pigments, which weakens the blue reflectance signal, and (iii) the scattering of blue light by the atmosphere (i.e., Rayleigh scattering), which tends to reduce the signal-to-noise ratio.21

However, even when equipped with a red-pass filter, the red channel of most image sensors is usually contaminated with both red (640 nm) and NIR (700 to 1000 nm) light. This can be resolved with multicamera systems in which one camera is assigned to receive only red light and another is assigned to receive only NIR light. However, for low-cost single-camera setups, such as the one proposed here, a blue-pass filter can be used to separate red light from NIR light. Blue-pass filters are particularly suited for low-altitude imaging because the crop-to-camera distance can be made sufficiently short that signal contamination due to atmospheric Rayleigh scattering becomes negligible. For these reasons, the camera’s built-in infrared-reflecting hot mirror was removed, and a Roscolux #2007 blue-pass filter was installed in its place to separate blue transmission (450 nm) from NIR transmission (700 to 1000 nm). As Fig. 2 shows, this optical filter has low transmittance in the red wavelengths (600 to 700 nm) but high transmittance in the NIR (700 to 1000 nm) and blue (450 to 500 nm) wavelengths. With these modifications, the red channel of the image sensor becomes an NIR channel, which is used in Sec. 2.5 to compute various vegetation indices.

Fig. 2

Transmittance of the Roscolux #2007 blue-pass filter, as measured in situ using a USB-650 VIS-NIR Red Tide spectrometer connected to SpectraSuite, both produced by Ocean Optics. A tungsten-filament lamp was used as the white-light source for reference illumination.



White Balance Calibration

Most digital cameras have white balance settings that are optimized for red, green, and blue (RGB) inputs.22 With the hot mirror removed and the Roscolux #2007 blue-pass filter installed, the blue channel now receives both blue and NIR light. To compensate for this, a blue calibration image was used as the white-balance reference, causing the camera to digitally attenuate the blue-channel intensities while amplifying the red-channel intensities. The white balance was calibrated under in-situ lighting conditions.


Image Sampling

Spectral images of lettuce were taken under natural illumination, with the camera 0.5 m away from the plant leaf surface. The white balance was recalibrated before each measurement using a blue reference image, as per Sec. 2.3. The compact nature of this imaging system makes it ideal for top-down sweeps with low-altitude UAVs or side-on panoramas conducted on foot or on autonomous vehicles. To assess the sensitivity of this system, both top-down and side-on views of each plant were captured, as shown in Fig. 3. The side-on images were taken with the camera held horizontally, level with the lettuce. The top-down images, representing typical UAV-based measurements, were taken with the camera tilted at 30 deg from the vertical.

Fig. 3

Diagram of the experimental setup showing two different camera angles.



Selection of Vegetation Indices

As explained in Sec. 2.2, the proposed imaging system cannot distinguish between red and NIR light. This restricted our choice of possible candidates for the optimal vegetation index to those that use either blue or green light, as well as NIR light. Four candidate indices were selected from the literature, as shown in Table 1. Typical reflectance spectra for healthy and unhealthy lettuce are shown in Fig. 4. It can be seen that the blue reflectance changes little between health statuses, but that the NIR reflectance changes markedly. One of the most widely used vegetation indices, the NDVI, has in its numerator the intensity difference (NIR − RED). Therefore, the NDVI would be large for healthy lettuce but small for unhealthy lettuce, making it a particularly sensitive indicator of plant health.23 By contrast, the blue-normalized difference vegetation index (BNDVI) uses blue light and the green-normalized difference vegetation index (GNDVI) uses green light, which means that neither would be as sensitive as the NDVI. Nevertheless, the NDVI cannot be used here because it requires a pure red channel, which is unavailable from the proposed blue-green imaging system. Consequently, only vegetation indices that use blue or green light were tested (see Table 1).

Table 1

Vegetation indices selected for evaluation on lettuce.

Vegetation index | Equation
Blue-normalized difference vegetation index (BNDVI)24 | (NIR − BLUE) / (NIR + BLUE)
Blue wide-dynamic range vegetation index (BWDRVI)25 | (αNIR − BLUE) / (αNIR + BLUE)
Green chlorophyll index (GCI)26 | NIR / GREEN − 1
Green-normalized difference vegetation index (GNDVI)24 | (NIR − GREEN) / (NIR + GREEN)
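As an illustration, the four indices in Table 1 can be computed per pixel from the camera’s NIR, green, and blue channel arrays. The sketch below is not the authors’ code; in particular, the value of the BWDRVI weighting coefficient α is not stated here, so the default of 0.1 is an assumption (a common choice for wide-dynamic range indices).

```python
import numpy as np

def vegetation_indices(nir, green, blue, alpha=0.1):
    """Compute the four candidate indices from per-pixel channel arrays.

    nir, green, blue: arrays of reflectance intensities.
    alpha: BWDRVI weighting coefficient (assumed value, not from the paper).
    """
    nir, green, blue = (np.asarray(a, dtype=float) for a in (nir, green, blue))
    return {
        "BNDVI":  (nir - blue) / (nir + blue),
        "BWDRVI": (alpha * nir - blue) / (alpha * nir + blue),
        "GCI":    nir / green - 1.0,
        "GNDVI":  (nir - green) / (nir + green),
    }
```

Each index contrasts the NIR channel against a visible channel, so healthier lettuce (higher NIR reflectance) yields higher index values.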

Fig. 4

Typical reflectance spectra for healthy and unhealthy lettuce (Lactuca sativa).10



Image Processing

Figure 5 shows the steps in the image processing procedure. The first step was to automatically crop the lettuce from its background so that the calculated vegetation index pertained only to the lettuce itself [Figs. 5(i)→5(ii)]. Next, the images were processed through a MATLAB script that calculated the vegetation index of each pixel and produced a spatially averaged index for the whole image [Figs. 5(ii)→5(iii)]. These steps were then repeated for each of the four vegetation indices listed in Table 1.
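A minimal Python/NumPy sketch of these two steps is given below. The paper does not specify how the automatic cropping was performed (and its MATLAB script is not published), so masking the background with a simple per-pixel threshold on the index itself is an assumption made purely for illustration.

```python
import numpy as np

def mean_index_over_plant(nir, blue, alpha=0.1, thresh=0.0):
    """Mask out background pixels, then spatially average BWDRVI over
    the remaining (plant) pixels.

    The thresholding mask is a stand-in for the paper's automatic
    cropping step; alpha is an assumed BWDRVI weighting coefficient.
    """
    nir = np.asarray(nir, dtype=float)
    blue = np.asarray(blue, dtype=float)
    bwdrvi = (alpha * nir - blue) / (alpha * nir + blue)
    mask = bwdrvi > thresh          # keep vegetation-like pixels only
    return float(bwdrvi[mask].mean())
```

The same pattern applies to the other three indices: compute the index per pixel, discard background pixels, and average what remains into a single value per image.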

Fig. 5

Steps in the image processing procedure: (i) → (ii) cropping of the lettuce plant and (ii) → (iii) computation of the vegetation index. In this example, BWDRVI values are shown on the colorbar to the right.



Data Analysis: Supervised and Unsupervised Clustering

The vegetation indices computed in Sec. 2.6 were analyzed for plant health using an unsupervised EM clustering algorithm and a Gaussian mixture model.27 The algorithm iteratively analyzes a pooled set of healthy and unhealthy plant data to find two distinct clusters without any knowledge of the labels.28 Under the assumption that the data in each cluster follow a normal distribution, Gaussian fits were assigned to the two clusters. The data were used as a training set for the EM Gaussian mixture model. This method is a generalizable procedure that accepts processed images of any given plant and returns a probability of it being healthy, provided that a training set for this particular species exists.28 This process eliminates bias and offers quantitative measures of plant health.9

The vegetation indices were passed into fitgmdist (a built-in function in MATLAB) to produce two Gaussian curves without any knowledge of the health status of each index value. The two curves enable probabilistic assignment of any index value to a cluster, as well as providing a measure of the contrast between healthy and unhealthy lettuce. For evaluation of the clustering quality, the healthy and unhealthy data for each vegetation index were also used in a supervised cluster analysis to manually create two Gaussian curves, providing a known baseline reference against which the unsupervised clustering results could be compared.
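The paper fits the mixture with MATLAB’s fitgmdist; a plain-NumPy analogue of the same EM procedure, restricted to the two-component, one-dimensional case used here, might look as follows. This is an illustrative sketch, not the authors’ implementation, and initializing the means at the data extremes is an assumption.

```python
import numpy as np

def em_gmm_1d(x, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by expectation maximization.

    Returns (mixing weights, means, variances). The means are initialized
    at the data extremes (an assumption; fitgmdist offers several schemes).
    """
    x = np.asarray(x, dtype=float)
    mu = np.array([x.min(), x.max()])   # initial cluster means
    var = np.full(2, x.var())           # initial variances
    w = np.array([0.5, 0.5])            # initial mixing weights
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = w * pdf
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

On well-separated index values, the two fitted Gaussians correspond to the healthy and unhealthy clusters, mirroring the curves in Figs. 6–9.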

In summary, our objective is to determine the most accurate vegetation index for discerning optimally healthy lettuce from suboptimally healthy lettuce using the multispectral images captured with our low-cost camera system (Sec. 2.2). As will be discussed in Sec. 3, by selecting the index that provides the sharpest contrast between healthy and unhealthy lettuce, we were able to create an unsupervised EM Gaussian model for objective diagnosis of lettuce health.


Results and Discussion


Comparison of Vegetation Indices

Comparing the unsupervised and supervised clusters (specifically, their means and variances) and the errors between them provides an objective indicator of the most accurate vegetation index to be paired with the proposed imaging system. This comparison is shown in Figs. 6–9. Of the four indices tested, the blue wide-dynamic range vegetation index (BWDRVI) is the most capable of discerning healthy from unhealthy lettuce, as indicated by (i) the close match between the unsupervised and supervised clustering curves, at both camera angles [Figs. 7(a) and 7(b) versus Figs. 7(c) and 7(d)]; and (ii) the distinct separation between the Gaussian peaks of the healthy and unhealthy lettuce [Figs. 7(a) and 7(c) versus Figs. 7(b) and 7(d)].

Fig. 6

Supervised and unsupervised clustering of BNDVI values. (a and b) The supervised clustering, (c and d) the unsupervised clustering, and (e and f) a histogram of bootstrapped means. The left column is for the side-on view, and the right column is for the top-down view.


Fig. 7

Supervised and unsupervised clustering of BWDRVI values. (a and b) The supervised clustering, (c and d) the unsupervised clustering, and (e and f) a histogram of bootstrapped means. The left column is for the side-on view, and the right column is for the top-down view.


Fig. 8

Supervised and unsupervised clustering of green chlorophyll index values. (a and b) The supervised clustering, (c and d) the unsupervised clustering, and (e and f) a histogram of bootstrapped means. The left column is for the side-on view, and the right column is for the top-down view.


Fig. 9

Supervised and unsupervised clustering of GNDVI values. (a and b) The supervised clustering, (c and d) the unsupervised clustering, and (e and f) a histogram of bootstrapped means. The left column is for the side-on view, and the right column is for the top-down view.


To further demonstrate the superiority of BWDRVI, we define an accuracy index (AI) that quantifies the ability of a vegetation index to correctly identify any lettuce data point as belonging to either a healthy or an unhealthy cluster:


AI = (Number of correctly assigned data points) / (Total number of data points).  (1)

For each of the four vegetation indices (Table 1), AI values were calculated by classifying the data as belonging to either healthy or unhealthy clusters on the basis of the maximum posterior probability.29 In this way, an AI value was generated for each vegetation index, as shown in Fig. 10. It can be seen that images taken from the top-down perspective and processed through BWDRVI produce the most distinct clusters and can therefore be used to determine, with the highest accuracy, the health status of lettuce. Furthermore, when mixing top-down and side-on camera angles, BWDRVI still emerges as the most accurate vegetation index, which is consistent with the observations seen in Fig. 7.
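Given a fitted mixture, the maximum-posterior classification and the resulting AI of Eq. (1) can be sketched as follows. Because unsupervised cluster identities carry no inherent health meaning, the better of the two possible cluster-to-label mappings is taken; that mapping step is an assumption, since the paper does not spell it out.

```python
import numpy as np

def accuracy_index(x, labels, w, mu, var):
    """Eq. (1): fraction of points assigned to the correct cluster by
    maximum posterior probability under a fitted two-component mixture.

    labels: known health statuses (0/1). The cluster-to-label mapping is
    resolved by taking the better of the two assignments (an assumption).
    """
    x = np.asarray(x, dtype=float)
    labels = np.asarray(labels)
    w, mu, var = (np.asarray(a, dtype=float) for a in (w, mu, var))
    # Unnormalized posteriors suffice for an argmax decision.
    pdf = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    assigned = (w * pdf).argmax(axis=1)
    acc = (assigned == labels).mean()
    return max(acc, 1.0 - acc)
```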

Fig. 10

Accuracy index [Eq. (1)] for the four vegetation indices listed in Table 1, at two different camera angles (side-on and top-down views).



Bootstrap Resampling

A bootstrap resampling analysis was performed with 20,000 resamples in order to gain further statistical confidence in the results. Subfigures (e) and (f) of Figs. 6–9 show the histograms of the means of the bootstrapped resamples. The close correspondence of the histograms in Figs. 7(e) and 7(f) to their respective Gaussian curves in Figs. 7(c) and 7(d) confirms that BWDRVI is indeed well suited for unsupervised cluster analysis of lettuce health.
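The resampling step itself is straightforward; a sketch follows, under the standard assumption that each resample is drawn with replacement at the full sample size.

```python
import numpy as np

def bootstrap_means(values, n_resamples=20000, seed=0):
    """Bootstrap distribution of the mean vegetation index.

    Draws n_resamples datasets with replacement, each the size of the
    original sample, and returns their means (histogram-ready).
    """
    values = np.asarray(values, dtype=float)
    rng = np.random.default_rng(seed)
    # Each row of idx indexes one resampled dataset.
    idx = rng.integers(0, len(values), size=(n_resamples, len(values)))
    return values[idx].mean(axis=1)
```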



Conclusions

A simple low-cost imaging system was developed and tested for low-altitude remote sensing of lettuce health. Built with off-the-shelf components, the system was designed with the strategic aim of increasing the adoption rate of precision agriculture, by making crop-monitoring technology more affordable and accessible, particularly to farmers in developing countries.

Despite its simplicity and low cost, the proposed imaging system was shown to be capable of discerning between healthy and unhealthy lettuce grown in a field plot in Hong Kong. Unsupervised cluster analysis of the BWDRVI produced the most accurate Gaussian curves, as indicated by a high accuracy index of 88.89%. The method outlined here is effective at low altitudes and with different camera viewing angles, making it particularly well suited for deployment on UAV platforms. This study shows that accurate remote sensing of lettuce health can be achieved through relatively simple and affordable means (<US$100). When combined with recent advances in UAV technology, the proposed system could lead to yield increases for lettuce growers.


Acknowledgments

We would like to thank Animesh Jha, Colman Chan and Gursimran Sethi for their assistance via the Undergraduate Research Opportunities Program (UROP). We would also like to thank Professor Rhea Liem and Mr. Paul Melsom for fruitful discussions.



References

1. N. Alexandratos et al., “World agriculture towards 2030/2050: the 2012 revision,” ESA Working Paper No. 12-03, FAO, Rome (2012).

2. A. Matese et al., “Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture,” Remote Sens. 7(3), 2971–2990 (2015). http://dx.doi.org/10.3390/rs70302971

3. P. Mondal and M. Basu, “Adoption of precision agriculture technologies in India and in some developing countries: scope, present status and strategies,” Prog. Nat. Sci. 19(6), 659–666 (2009). http://dx.doi.org/10.1016/j.pnsc.2008.07.020

4. I. Colomina and P. Molina, “Unmanned aerial systems for photogrammetry and remote sensing: a review,” ISPRS J. Photogramm. Remote Sens. 92, 79–97 (2014). http://dx.doi.org/10.1016/j.isprsjprs.2014.02.013

5. M. Ashraf and P. Harris, “Photosynthesis under stressful environments: an overview,” Photosynthetica 51(2), 163–190 (2013). http://dx.doi.org/10.1007/s11099-013-0021-6

6. S. Liaghat et al., “A review: the role of remote sensing in precision agriculture,” Am. J. Agric. Biol. Sci. 5(1), 50–55 (2010). http://dx.doi.org/10.3844/ajabssp.2010.50.55

7. A. A. Gitelson, Y. Gritz and M. N. Merzlyak, “Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves,” J. Plant Physiol. 160(3), 271–282 (2003). http://dx.doi.org/10.1078/0176-1617-00887

8. P. C. Doraiswamy et al., “Crop yield assessment from remote sensing,” Photogramm. Eng. Remote Sens. 69(6), 665–674 (2003). http://dx.doi.org/10.14358/PERS.69.6.665

9. Kizil et al., “Lettuce (Lactuca sativa L.) yield prediction under water stress using artificial neural network (ANN) model and vegetation indices,” Zemdirbyste Agric. 99, 409–418 (2012).

10. H. Gao, H. Mao and X. Zhang, “Inspection of lettuce water stress based on multi-sensor information fusion technology,” in Int. Conf. on Computer and Computing Technologies in Agriculture, pp. 53–60, Springer (2010).

11. N. Kosaka, S. Miyazaki and U. Inoue, “Vegetable green coverage estimation from an airborne hyperspectral image,” in Geoscience and Remote Sensing Symp. (IGARSS 2002), pp. 1959–1961, IEEE (2002).

12. R. Dunford et al., “Potential and constraints of unmanned aerial vehicle technology for the characterization of Mediterranean riparian forest,” Int. J. Remote Sens. 30(19), 4915–4935 (2009). http://dx.doi.org/10.1080/01431160903023025

13. J. Bendig et al., “Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley,” Int. J. Appl. Earth Obs. Geoinf. 39, 79–87 (2015). http://dx.doi.org/10.1016/j.jag.2015.02.012

14. A. Rango et al., “Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and management,” J. Appl. Remote Sens. 3(1), 033542 (2009). http://dx.doi.org/10.1117/1.3216822

15. S. Nebiker et al., “A light-weight multispectral sensor for micro UAV: opportunities for very high resolution airborne remote sensing,” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 37(B1), 1193–1199 (2008).

16. E. Salam, C. Barrado and E. Pastor, “UAV flight experiments applied to the remote sensing of vegetated areas,” Remote Sens. 6(11), 11051–11081 (2014). http://dx.doi.org/10.3390/rs61111051

17. J. Das et al., “Devices, systems, and methods for automated monitoring enabling precision agriculture,” in IEEE Int. Conf. on Automation Science and Engineering (CASE 2015), pp. 462–469, IEEE (2015).

18. M. Zaman-Allah et al., “Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize,” Plant Methods 11(1), 1 (2015). http://dx.doi.org/10.1186/s13007-015-0078-2

19. Agriculture, Fisheries and Conservation Department, “Hong Kong: agriculture and fisheries,” Tech. Rep., HKSAR Government (2016).

20. G. H. James and D. Brainard, “How to grow lettuce,” Tech. Rep., Michigan State University (2011).

21. R. D. Fiete and T. Tantalo, “Comparison of SNR image quality metrics for remote sensing systems,” Opt. Eng. 40(4), 574–585 (2001). http://dx.doi.org/10.1117/1.1355251

22. Y.-C. Liu, W.-H. Chan and Y.-Q. Chen, “Automatic white balance for digital still camera,” IEEE Trans. Consum. Electron. 41(3), 460–466 (1995). http://dx.doi.org/10.1109/30.468045

23. B. Govaerts and N. Verhulst, “The normalized difference vegetation index (NDVI) GreenSeeker (TM) handheld sensor: toward the integrated evaluation of crop management. Part A: concepts and case studies,” CIMMYT, Mexico (2010).

24. J. Rouse et al., “Monitoring vegetation systems in the Great Plains with ERTS,” NASA Spec. Publ. 351, 309 (1974).

25. A. A. Gitelson, “Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation,” J. Plant Physiol. 161(2), 165–173 (2004). http://dx.doi.org/10.1078/0176-1617-01176

26. A. A. Gitelson et al., “Remote estimation of canopy chlorophyll content in crops,” Geophys. Res. Lett. 32(8), L08403 (2005). http://dx.doi.org/10.1029/2005GL022688

27. T. K. Moon, “The expectation-maximization algorithm,” IEEE Signal Process. Mag. 13(6), 47–60 (1996). http://dx.doi.org/10.1109/79.543975

28. C. Bishop, Pattern Recognition and Machine Learning (Information Science and Statistics), 1st ed., Springer, New York (2007).

29. M. H. DeGroot, Optimal Statistical Decisions, Vol. 82, John Wiley & Sons, Hoboken, New Jersey (2005).


David D. W. Ren, Siddhant Tripathi, Larry K. B. Li, "Low-cost multispectral imaging for remote sensing of lettuce health," Journal of Applied Remote Sensing 11(1), 016006 (11 January 2017). http://dx.doi.org/10.1117/1.JRS.11.016006
Submission: Received 8 September 2016; Accepted 13 December 2016


Keywords: remote sensing, multispectral imaging, near infrared, unmanned aerial vehicles

