Towards Collaboration between Unmanned Aerial and Ground Vehicles for Precision Agriculture
Abstract
This paper presents work being conducted at Cal Poly Pomona on collaboration between unmanned aerial and ground vehicles for precision agriculture. The unmanned aerial vehicles (UAVs), equipped with multispectral/hyperspectral and RGB cameras, take images of the crops while flying autonomously. The images are post-processed or can be processed onboard, and the processed images are used to detect unhealthy plants. The aerial data can be used by the UAVs and unmanned ground vehicles (UGVs) for various purposes, including care of crops and harvest estimation. The images can also support optimized harvesting by isolating low-yielding plants. These vehicles can be operated autonomously with limited or no human intervention, thereby reducing cost and limiting human exposure to agricultural chemicals. The paper discusses the autonomous UAV and UGV platforms used for the research, sensor integration, and experimental testing. Methods for ground truthing the results obtained from the UAVs are also described, as is the plan to equip the UGV with a robotic arm for removing unhealthy plants and/or weeds.

1. INTRODUCTION

Autonomous unmanned vehicles are assuming greater roles in many applications, including search & rescue and precision agriculture. Unmanned vehicles are simpler than manned vehicles, pose no threat to human operators, and can be cheaper. UAVs have been used for remote sensing, search & rescue missions, vegetation growth analysis, crop dusting, environmental gas monitoring, traffic monitoring, etc. UGVs are easy to deploy, fast to respond, and can provide closer and more detailed observations of the environment [1]. Their agricultural applications include care of crops and harvesting, and UGVs have the potential to replace human labor in many agricultural tasks.

While UAVs and UGVs are individually advantageous to the agricultural industry and have been used in many applications [2-4], the teaming of the two can be more beneficial for precision agriculture [5, 6]. For example, the UAVs can provide necessary information about the crops such as unhealthy and stressed plants to the UGVs, which can then be used for care of crops such as removal of diseased plants, extraction of tissue samples, water and fertilizer applications, etc. The images can also be useful for optimized harvesting by isolating low yielding plants. These vehicles can be operated autonomously with limited or no human intervention, thereby reducing cost and limiting human exposure to dangerous agricultural chemicals.

The main contribution of the ongoing research at Cal Poly Pomona will be the demonstration of the collaborative use of UAVs and UGVs for precision agriculture and care of crops. Very limited work exists on the collaboration between UAVs and UGVs for agricultural use. Such collaboration can be used for site-specific crop management and the development of treatment plans, as well as for crop stress detection and scouting for disease and insects [7]. UAVs equipped with multispectral/hyperspectral and RGB cameras take images of the crops. These images are then used to understand the crop response to environmental effects such as weather, water, insects, and diseases, as well as to management practices including irrigation, fertilizer, pesticide, and herbicide applications.

The rest of the paper is organized as follows. Section 2 presents the UAV platforms and UAV based sensors being used for this research. UGV platforms are discussed in Section 3. Test plot design for the research is discussed in the fourth section. Section 5 explains the methods and sensors used for ground truthing. Some preliminary results are presented in Section 6 followed by the conclusion and future work in the last section.

2. UAV PLATFORMS AND SENSORS

We have been using or considering several UAV platforms for this research. The UAVs are equipped with multispectral or hyperspectral sensors, RGB cameras, or sprayers. The paragraphs below describe each of these platforms and the sensors.

2.1 DJI S900 Hexacopter

One of the UAV platforms used in this work is the S900 hexacopter from DJI, shown in Figure 1. It has a rotor diameter of 35 inches, an empty weight of 7.3 pounds, and a maximum takeoff weight of 18 pounds.

Figure 1. DJI S900 hexacopter equipped with the ADC-Lite multispectral camera from Tetracam.

The hexacopter is equipped with a 3DR Pixhawk autopilot for autonomous flight. An ADC-Lite multispectral camera and a GoPro HERO5 camera, shown in Figure 2, are integrated into the hexacopter. A Canon digital camera is also used for RGB images.

Figure 2. GoPro HERO5 4K Ultra HD camera (left) and ADC-Lite multispectral camera (right).

As discussed below, the RGB images are used for training machine learning classifiers [8, 9], and the multispectral camera is used to determine the normalized difference vegetation index (NDVI) and the photochemical reflectance index (PRI) [10, 11].
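To make the index definitions concrete, the minimal sketch below computes per-pixel NDVI and PRI from co-registered reflectance bands using NumPy; the array shapes, the use of the 531 nm and 570 nm reflectances for PRI, and the small denominator guard are illustrative assumptions rather than part of the camera software.

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - RED) / (NIR + RED) from co-registered reflectance bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)  # guard against division by zero

def pri(r531, r570):
    """Per-pixel PRI from narrow-band reflectances near 531 nm and 570 nm."""
    r531 = r531.astype(np.float64)
    r570 = r570.astype(np.float64)
    return (r531 - r570) / np.clip(r531 + r570, 1e-6, None)

# Example with random rasters standing in for the exported camera bands.
nir_band = np.random.rand(480, 640)
red_band = np.random.rand(480, 640)
print(ndvi(nir_band, red_band).mean())
```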

2.2 Lancaster UAV

The second platform that will be used is the Lancaster fixed-wing UAV from PrecisionHawk, which is shown in Figure 3. The empty weight of the vehicle is 5.3 pounds, with a maximum takeoff weight of 7.8 pounds. It has a wingspan of 4.9 feet and flies at a cruise speed of 40 to 52 feet/sec for up to 45 minutes.

Figure 3. Lancaster UAV from PrecisionHawk.

The UAV can be equipped with different plug-and-play sensors such as a hyperspectral sensor, a multispectral sensor, or an RGB camera. The sensors can easily be swapped in the field without reconfiguration. Figure 4 shows the Nano-Hyperspec hyperspectral sensor from Headwall with which the UAV is equipped.

Figure 4. Nano-Hyperspec hyperspectral sensor from Headwall.

Compared to multispectral imaging, the higher spectral resolution of hyperspectral imaging helps detect stresses at an earlier stage of development [12, 13]. Multispectral sensors measure reflected energy in 3 to 10 spectral bands per pixel, whereas hyperspectral sensors measure reflected energy in more than 200 narrower, contiguous spectral bands, providing an essentially continuous spectral measurement. Hyperspectral images therefore have a greater potential to detect stresses due to water and nutrients.

The Nano-Hyperspec provides a spectral range of 400-1000 nm. It has 640 spatial bands, 270 spectral bands, and a frame rate of 300 Hz, and thus provides a much more detailed measurement than the multispectral sensor.
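Since the hyperspectral cube is essentially a three-dimensional array of reflectances, individual bands can be pulled out by wavelength. The sketch below assumes the cube has already been loaded as a NumPy array with shape (lines, 640, 270) and that the band centers are roughly evenly spaced over 400-1000 nm; in practice the exact band centers come from the sensor's header file.

```python
import numpy as np

# Assumed cube layout: (along-track lines, 640 cross-track pixels, 270 spectral bands).
lines, samples, bands = 1000, 640, 270
cube = np.random.rand(lines, samples, bands).astype(np.float32)  # stand-in for real data

# Approximate band centers; the actual centers come from the sensor's header file.
wavelengths = np.linspace(400.0, 1000.0, bands)

def band_image(cube, wavelengths, target_nm):
    """Return the single-band image closest to the requested wavelength."""
    idx = int(np.argmin(np.abs(wavelengths - target_nm)))
    return cube[:, :, idx], wavelengths[idx]

red, red_nm = band_image(cube, wavelengths, 670.0)   # red band, e.g. for NDVI
nir, nir_nm = band_image(cube, wavelengths, 800.0)   # NIR band, e.g. for NDVI
print(f"Using {red_nm:.1f} nm and {nir_nm:.1f} nm bands, image shape {red.shape}")
```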

2.3 Aibot X6 UAV

The Aibot X6 is a hexacopter from Aibotix. It has an empty weight of 7.5 pounds and can carry a payload of up to 4.5 pounds. It is 42 inches long and 18 inches high, and comes equipped with an inertial measurement unit, a GPS receiver, a barometer, and a magnetometer. The custom-built autopilot with AiProFlight software provides autonomous flight capability. Figure 5 shows the UAV equipped with the Nano-Hyperspec sensor.

Figure 5. Aibot X6 multicopter integrated with autopilot and hyperspectral sensor.

The Nano-Hyperspec sensor comes with the Hyperspec III software, which is used to sort the pixels and determine plant characteristics and environmental conditions. The data from each wavelength band are assembled into a three-dimensional hyperspectral 'data cube' (hypercube) for processing and analysis [14]. Each layer of the cube represents data at a specific wavelength. Multiple display options exist within the Hyperspec III software, including raw data display. Post-processing tasks such as airborne orthorectification can also be performed using GPS and INS data [15, 16].

2.4 DJI AGRAS MG-1

The AGRAS MG-1 octocopter from DJI, shown in Figure 6, can be used for precision variable-rate application of liquid pesticides, fertilizers, and herbicides.

Figure 6. AGRAS MG-1 from DJI.

The vehicle measures 58 x 58 x 19 inches in the arms-unfolded configuration without propellers, and the propeller diameter is 21 inches. It can carry up to 22 pounds of payload (fertilizers, pesticides) and can cover an area of 1 to 1.5 acres in about 10 minutes. The empty weight of the vehicle without batteries is 20 pounds, and the liquid tank holds up to 10 liters. The maximum operating speed is 26 feet/second, and the vehicle can hover for 10 minutes at its maximum takeoff weight of 50 pounds. The spray system has four nozzles with a maximum spray rate of 0.47 L/min per nozzle for water.
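For planning purposes, the tank endurance and an approximate application rate follow directly from these figures; the short calculation below assumes all four nozzles run at their maximum rate, which is our simplification.

```python
# Back-of-the-envelope spraying figures derived from the specifications above;
# the assumption that all four nozzles run at their maximum rate is ours.
tank_liters = 10.0
nozzles = 4
max_rate_l_per_min = 0.47                      # per nozzle, for water

total_rate = nozzles * max_rate_l_per_min      # 1.88 L/min
tank_endurance_min = tank_liters / total_rate  # ~5.3 minutes per tank

acres_per_min = 1.25 / 10.0                    # midpoint of 1 to 1.5 acres in 10 minutes
liters_per_acre = total_rate / acres_per_min   # ~15 L/acre at full rate

print(f"Tank lasts ~{tank_endurance_min:.1f} min at full spray rate")
print(f"Approximate application rate: {liters_per_acre:.1f} L/acre")
```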

3. UGV PLATFORMS

The UGVs that we are considering for the project are the Husky UGV from Clearpath Robotics and the Mastiff HD2-S robot from SuperDroid Robots. The vehicles are shown in Figure 7 and Figure 8, respectively.

Figure 7. Husky UGV from Clearpath Robotics at CPP's Spadra Farm.

Figure 8. Mastiff HD2-S UGV from SuperDroid Robots.

The Husky is a medium-sized UGV that can be used for a variety of applications. It is equipped with stereo cameras, a PTZ camera, LIDAR, GPS, and IMUs, and its plug-and-play capability allows the integration of a wide range of sensors. It has external dimensions of 39 x 26.4 x 14.6 inches, an empty weight of 110 pounds, and a maximum payload of 165 pounds. Its maximum speed is 2.3 mph with a run time of 3 hours. It comes pre-installed with a mini-ITX computer running Linux and the Robot Operating System (ROS).
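Because the Husky runs ROS, velocity setpoints can be sent to it from a short Python node. The minimal sketch below assumes the standard geometry_msgs/Twist interface on a /cmd_vel topic (the exact topic name depends on the robot's launch configuration) and is an illustration only, not part of the vendor software.

```python
#!/usr/bin/env python
# Minimal ROS node that drives the UGV forward slowly; assumes the standard
# /cmd_vel Twist interface exposed by the robot's velocity controller.
import rospy
from geometry_msgs.msg import Twist

def creep_forward():
    rospy.init_node('row_creep', anonymous=True)
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)          # 10 Hz command rate
    cmd = Twist()
    cmd.linear.x = 0.3             # m/s, well under the 2.3 mph (~1 m/s) limit
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    try:
        creep_forward()
    except rospy.ROSInterruptException:
        pass
```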

The SuperDroid HD2-S Mastiff robot with a 5 degrees-of-freedom (5-DOF) arm is a large, rugged, treaded robot that can be equipped with many different sensors. The arm can carry a maximum payload of 20 pounds, and the robot can run for up to 8 hours at speeds of up to 2.7 mph. The robot's manipulator, which can sense force and pressure, can be used to remove stressed plants and weeds. It is equipped with an Adaptive Robot Gripper Hand composed of 3 articulated fingers with 4 grasping modes, allowing it to grasp a wide variety of objects of different sizes and shapes. The user can specify the grasping type, closing speed, and force, and the gripper returns information about the gripping state to the controller.

3.1 Communication between the UAV and UGV

One of the main challenges for multi-vehicle coordination is the communication system among the vehicles and between the vehicles and the ground control station [1]. The communication subsystem is one of the most critical components of unmanned system operation. Also, because UAVs are limited in size, weight, and power, solutions are required that increase range and reliability without adding weight, complexity, or power consumption. The link must support transfer of both high-bandwidth surveillance data and command/control messages [17].

The communication system can include cloverleaf or circularly polarized antennas for telemetry, data, and video/image transmission at 900 MHz and 5.8 GHz, respectively. An XBee Pro 900 MHz radio is used for communication between the autopilot and the ground station. The radios provide the required bandwidth over long distances. Figure 9 shows the antennas and radios used [1].
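Because the Pixhawk autopilot communicates over MAVLink, the ground-station side of the telemetry link can be prototyped with pymavlink. In the sketch below, the serial device path and baud rate are typical values for a 900 MHz telemetry radio and are assumptions, not settings taken from our hardware.

```python
# Minimal ground-station sketch that listens to the autopilot's MAVLink stream
# over the 900 MHz serial radio; device path and baud rate are assumptions.
from pymavlink import mavutil

link = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
link.wait_heartbeat()                       # block until the autopilot is heard
print("Heartbeat from system %d" % link.target_system)

while True:
    msg = link.recv_match(type='GLOBAL_POSITION_INT', blocking=True, timeout=5)
    if msg is None:
        continue
    # Positions arrive in 1e7 degrees and millimeters per the MAVLink message spec.
    lat, lon, alt = msg.lat / 1e7, msg.lon / 1e7, msg.relative_alt / 1000.0
    print(f"UAV at {lat:.6f}, {lon:.6f}, {alt:.1f} m AGL")
```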

Figure 9. Communication system hardware.

4. TEST PLOT

An experiment has been designed to grow lettuce in a small controlled test plot [18]. The test plot has a total of 24 rows, and each row is 300 feet long. Each row is divided into three segments of 90 feet each, with a 15-foot gap between segments. The rows of lettuce are assigned different levels of nitrogen and water treatments, fixing the level of water treatment and varying the level of nitrogen treatment or vice versa. For example, Row 1 receives 100% water, Row 3 receives 50% water, and Row 5 receives 25% water while the nitrogen level is held constant. No nitrogen treatment is applied to the even rows (Row 2, Row 4, etc.), and these rows are not used for data, to avoid errors resulting from leaching.
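For bookkeeping, the layout can be enumerated in a few lines of Python; only Rows 1, 3, and 5 have water levels stated above, so the repeating pattern assumed for the remaining odd rows in the sketch below is purely illustrative.

```python
# Sketch of the plot-layout bookkeeping; the repeating water-level pattern for
# odd rows beyond Row 5 is an assumption made for illustration only.
ROW_LENGTH_FT, SEGMENT_FT, GAP_FT, N_ROWS = 300, 90, 15, 24
WATER_LEVELS = [1.00, 0.50, 0.25]   # fraction of full irrigation, cycled over odd rows

plot = []
for row in range(1, N_ROWS + 1):
    if row % 2 == 0:
        treatment = {'water': None, 'nitrogen': 0.0, 'use_for_data': False}  # buffer row
    else:
        water = WATER_LEVELS[((row - 1) // 2) % len(WATER_LEVELS)]
        treatment = {'water': water, 'nitrogen': 'constant', 'use_for_data': True}
    for seg in range(3):                                  # three 90-ft segments per row
        start_ft = seg * (SEGMENT_FT + GAP_FT)
        plot.append({'row': row, 'segment': seg + 1,
                     'start_ft': start_ft, 'end_ft': start_ft + SEGMENT_FT,
                     **treatment})

print(len(plot), "row segments;", sum(p['use_for_data'] for p in plot), "used for data")
```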

Lettuce data are collected from all segments of each row on each measurement date. This subsampling helps reduce error because the results from different segments are averaged. Figure 10 shows the test plot with lettuce being grown for this research.

Figure 10. Lettuce test plot at Cal Poly Pomona's Spadra Farm.

Prior to beginning the study, the soil moisture (volumetric soil water content) is determined using a TDR soil moisture meter [19]. The soil nitrogen content is determined by sending soil samples to a laboratory, and the soil field capacity (from the soil moisture release curve) is also determined by a laboratory. Figure 11 shows the TDR soil moisture meter being used.

Figure 11. TDR soil moisture meter.

5. GROUND TRUTHING

Ground truthing is important for verifying the remote sensing data used to determine plant stresses due to water and nutrient deficiency; the remote sensing data must be sufficiently validated against ground truth data. We are using several methods for ground truthing [20]. The remote sensing data and ground truth data are collected nearly simultaneously to reduce the error between the data sets. For ground truthing, we are using a handheld spectroradiometer, a chlorophyll content meter, and a leaf water potential meter [21].

The handheld spectroradiometer we are using is a visible-near infrared (VNIR) device from ASD, shown in Figure 12. It provides spectral data over the 325-1070 nm range. Because it is itself a hyperspectral sensor, it provides an excellent source of data for ground truthing.

Figure 12. VNIR handheld spectroradiometer.

The chlorophyll content meter we are using is the SPAD 502DL Plus chlorophyll meter from Spectrum Technologies [22, 23], shown in Figure 13. It measures chlorophyll content in less than 2 seconds and can detect subtle changes or trends in plant health long before they are visible to the human eye. It provides excellent information on leaf nitrogen content [24].

Figure 13. SPAD 502DL Plus chlorophyll meter.

Figure 14 shows the WP4C water potential meter that we are using to measure leaf water potential. Samples of lettuce leaves are brought to the laboratory, where the instrument accurately measures water potential in a short amount of time by determining the relative humidity of the air above the sample in a closed chamber.

Figure 14. WP4C water potential meter from Decagon.

6. RESULTS AND EVALUATION

Figure 15 shows the DJI S900 hexacopter being flown over the lettuce plot for multispectral and RGB data collection.

Figure 15. UAV flight over the lettuce plot.

Figure 16 shows the NDVI image of the lettuce field produced from the Tetracam multispectral camera data [10] using the PixelWrench2 software. The values on the right of the figure are NDVI values calculated according to the following formula:

NDVI = (NIR - RED) / (NIR + RED)

where NIR and RED are the reflectances in the near-infrared and red bands, respectively. NDVI ranges from -1 to 1. Higher positive values indicate healthy plants, lower positive values indicate stressed or unhealthy plants, and values near zero or below indicate little or no vegetation.

Figure 16. NDVI image of the lettuce plot.

The aerial NDVI data from the UAV are being verified against the handheld spectroradiometer measurements discussed above. Figure 17 shows the reflectance plots obtained with the handheld spectroradiometer for two different lettuce plants. Based on the formula above, the NDVI value for the left plot is 0.92, indicating a very healthy plant, whereas the NDVI for the right plot is 0.56, indicating a less healthy plant.
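To compare the handheld spectra with the camera-derived NDVI, each spectrum can be reduced to a single NDVI value by averaging the reflectance over a red window and an NIR window before applying the formula; the window limits in the sketch below are common choices and are not values prescribed by the instrument.

```python
import numpy as np

def ndvi_from_spectrum(wavelengths_nm, reflectance,
                       red_window=(640.0, 680.0), nir_window=(780.0, 900.0)):
    """NDVI from a single reflectance spectrum by averaging over band windows.
    The window limits are typical choices, not values specified by the sensor."""
    wl = np.asarray(wavelengths_nm, dtype=float)
    refl = np.asarray(reflectance, dtype=float)
    red = refl[(wl >= red_window[0]) & (wl <= red_window[1])].mean()
    nir = refl[(wl >= nir_window[0]) & (wl <= nir_window[1])].mean()
    return (nir - red) / (nir + red)

# Synthetic example: low red reflectance and high NIR reflectance, as expected
# for a healthy leaf, yields a high NDVI.
wl = np.arange(325, 1071, 1.0)                 # handheld device's 325-1070 nm range
refl = np.where(wl < 700, 0.05, 0.45)          # toy red-edge step, not real data
print(round(ndvi_from_spectrum(wl, refl), 2))  # ~0.8
```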

Figure 17. Reflectance plots of a healthy lettuce plant (left) and a less healthy plant (right).

We have also developed an algorithm that automatically localizes a lettuce plant in an RGB image, as shown in Figure 18. We use the algorithm to crop the image down to the plant of interest, which prevents noise from the surrounding plants from being fed to our machine learning algorithm.
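One plausible implementation of such a localization step, using OpenCV with a green-hue threshold and the bounding box of the largest contour, is sketched below; both the choice of technique and the HSV limits are assumptions rather than a description of our exact algorithm.

```python
import cv2
import numpy as np

def crop_largest_plant(bgr_image):
    """Crop an image (BGR order, as OpenCV loads it) to the largest green region.
    The HSV thresholds are rough values for green foliage and may need tuning."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (30, 40, 40), (90, 255, 255))       # green-ish hues
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]    # works in OpenCV 3 and 4
    if not contours:
        return bgr_image                                        # nothing found; return as-is
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return bgr_image[y:y + h, x:x + w]

# Usage: cropped = crop_largest_plant(cv2.imread('lettuce.jpg'))
```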

Figure 18. Automatically localized lettuce image using image processing.

The machine learning algorithm we are developing uses a convolutional neural network (CNN) that takes an image of a plant as input and produces a real-valued output representing the estimated NDVI of the plant in the image. For the convolutional layers of the network, pre-trained convolutional layers from InceptionV3 are utilized [25]. The activations from the convolutional layers are run through two FC blocks, where an FC block is composed of a fully connected layer with 512 nodes, a batch normalization layer [26], and a dropout layer with p = 0.5 [27]. After these FC blocks, a final fully connected layer with a single node produces the output NDVI.
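A minimal Keras sketch of this architecture is shown below. The ImageNet weights, input size, global average pooling, ReLU activations, and mean-squared-error loss are our assumptions for illustration; the two FC blocks, dropout rate, single-node output, and the Adam learning rate described in the next paragraph follow the text.

```python
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras import layers, models, optimizers

# Pre-trained convolutional base, frozen so only the appended layers are trained;
# the input size and global-average pooling are assumptions for this sketch.
base = InceptionV3(include_top=False, weights='imagenet',
                   input_shape=(299, 299, 3), pooling='avg')
base.trainable = False

x = base.output
for _ in range(2):                        # two FC blocks
    x = layers.Dense(512)(x)              # fully connected layer with 512 nodes
    x = layers.BatchNormalization()(x)
    x = layers.Activation('relu')(x)      # activation choice is our assumption
    x = layers.Dropout(0.5)(x)            # p = 0.5
ndvi_output = layers.Dense(1)(x)          # single real-valued NDVI estimate

model = models.Model(inputs=base.input, outputs=ndvi_output)
model.compile(optimizer=optimizers.Adam(learning_rate=0.01), loss='mse')
model.summary()
```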

To train the network, we use a fine-tuning approach: only the additional layers appended to the pre-trained convolutional layers are trained, while the convolutional weights remain frozen. These layers are trained using the Adam optimizer with a learning rate of 0.01 [28].

Our training data consist of images of plants paired with each plant's associated NDVI value obtained from the handheld spectroradiometer. To reduce overfitting, we use data augmentation consisting of horizontal flipping and shifting along the x and y axes. Figure 19 shows a preliminary comparison between the predicted NDVI values and those obtained using the handheld spectroradiometer. A large error is present in some instances, which we attribute to the limited number of training samples; with the collection of more data, this error is expected to decrease.
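The flipping and shifting augmentation mentioned above can be expressed, for example, with Keras's ImageDataGenerator; the 10% shift fractions below are illustrative choices, since the magnitudes are not stated above.

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Horizontal flips plus shifts along both axes; the 10% shift range is an
# illustrative value, not one reported in the paper.
augmenter = ImageDataGenerator(horizontal_flip=True,
                               width_shift_range=0.1,
                               height_shift_range=0.1)

# Typical use: augmenter.flow(images, ndvi_targets, batch_size=32) feeds the
# CNN augmented batches during model.fit(...).
```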

Figure 19. Measured vs. predicted NDVI.

7. CONCLUSION AND FUTURE WORK

An architecture for collaboration between unmanned aerial and ground vehicles is being designed for precision agriculture and care of crops. The UAV and UGV platforms and sensors have been identified for the project. A test plot has been designed, and the lettuce grown in the plot is being, or will be, subjected to varied rates of water and nitrogen treatments. The goal is to determine plant stresses due to water and nitrogen deficiency. Multispectral and RGB images are being collected from multicopter and fixed-wing UAVs. The multispectral images are used to calculate the normalized difference vegetation index (NDVI), which provides information on plant nitrogen content. The RGB images are being used to develop machine learning classifiers that will predict plant quality from UAV imagery. A handheld spectroradiometer, a chlorophyll meter, and a leaf water potential meter are being used for ground truthing.

Future work will involve collecting hyperspectral images from the UAVs. We will develop a system for isolating the areas of the lettuce field that need more attention or are performing more poorly than others, and a treatment plan will be developed accordingly. This will be helpful for site-specific nitrogen and water management.

Once the individual UAVs and UGVs are tested, the collaboration between the vehicles will be tested for precision agriculture and care of crops. Multi-vehicle collaboration will also be used for automated stress detection and crop dusting: one UAV will detect the crop stresses and send that information to a second UAV as well as the UGV, and the vehicles will collaboratively execute the crop dusting mission.

ACKNOWLEDGEMENT

This project is supported by the California State University Agricultural Research Institute Grant number 17-04-235. We would like to thank Dr. David Still, Dr. Reza Chaichi, and Prof. Mon Yee in the Plant Sciences Department for their valuable inputs and expert opinions, Antonio Espinas in the Apparel Merchandizing Department for help with the project management and data collection, and Farm Coordinator Adam Mason for help with the preparation of the lettuce plot.

REFERENCES

[1] Bhandari, S., Tang, D., Boskovich, S., Zekeriya, A., Demonteverde, R., et al., "Collaboration between multiple unmanned vehicles for increased mission efficiency," Proceedings of the Infotech@Aerospace Conference, San Diego, CA, (2016).

[2] Aguera, F., Carvajal, F., et al., "Measuring sunflower nitrogen status from an unmanned aerial vehicle-based system and an on the ground device," Conference on Unmanned Aerial Vehicles in Geomatics, UAV-g-2011, Zurich, Switzerland, (2011).

[3] Herwitz, S.R., Johnson, L.F., et al., "Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support," Computers and Electronics in Agriculture, Vol. 44, pp. 49–61, (2004).

[4] Abuleli, A.M., Taylor, G.W., and Moussa, M., "An integrated system for mapping red clover ground cover using unmanned aerial vehicles: a case study in precision agriculture," IEEE Conference on Computer and Robot Vision, Halifax, Nova Scotia, (2015).

[5] Tokekar, P., Hook, J. V., et al., "Sensor planning for a symbiotic UAV and UGV system for precision agriculture," IEEE Transactions on Robotics, Vol. 32, No. 6, (2016).

[6] Vasudevan, A., Kuma, A., and Bhuvaneswari, N. S., "Precision farming using unmanned aerial and ground vehicles," IEEE International Conference on Technological Innovations in ICT for Agriculture and Rural Development.

[7] Casady, W.W., "Precision agriculture: remote sensing and ground truthing," MU Extension, University of Missouri-Columbia.

[8] Dimitriadis, S. and Goumopoulos, C., "Applying machine learning to extract new knowledge in precision agriculture applications," IEEE Panhellenic Conference on Informatics, Samos, Greece, 28-30 Aug. (2008).

[9] Ding, K., Raheja, A., and Bhandari, S., "Application of machine learning for the evaluation of turfgrass plots using aerial images," Proceedings of the SPIE Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping Conference, Baltimore, MD, (2016).

[10] Sona, G., Passoni, D., et al., "UAV multispectral survey to map soil and crop for precision farming applications," XXIII ISPRS Congress, Prague, Czech Republic, (2016).

[11] Rascher, U., Nichols, C. J., and Small, C., "Hyperspectral imaging of photosynthesis from the single leaf to the complex canopy – understanding the spatio-temporal variations of photosynthesis within a drought-stressed tropical canopy," Imaging Spectroscopy: New Quality in Environmental Studies, pp. 709–719, (2006).

[12] Franke, J., Mewes, T., and Menz, G., "Airborne hyperspectral imaging for the detection of powdery mildew in wheat," Proceedings of SPIE Optics and Photonics, Vol. 7086, San Diego, CA, (2008).

[13] Govender, M., Chetty, K., et al., "A comparison of satellite hyperspectral and multispectral remote sensing imagery for improved classification and mapping of vegetation," Water SA, Vol. 34, No. 2, pp. 147–154, (2008).

[15] Habib, A., Xiong, W., et al., "Improving orthorectification of UAV-based push-broom scanner imagery using derived orthophotos from frame cameras," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol. 10, No. 1, (2016).

[16] Habib, A., Han, Y., et al., "Automated ortho-rectification of UAV-based hyperspectral data over an agricultural field using frame RGB imagery," Remote Sensing, Vol. 8, No. 10, (2016).

[17] Heid, M., Bettadapura, A., Ito, E., Bhandari, S., and Tang, D., "A ground control station for multivehicular control and data visualization," Proceedings of the AIAA Infotech@Aerospace Conference, Kissimmee, FL, (2015).

[18] Lelong, C.C.D., Burger, P., et al., "Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots," Sensors, Vol. 8, pp. 3557–3585, (2008).

[19] Al-Jabri, S. A., Lee, J., et al., "A dripper-TDR method for in situ determination of hydraulic conductivity and chemical transport properties of surface soils," Advances in Water Resources, Vol. 29, No. 2, pp. 239–249, (2006).

[20] Congalton, R.G. and Green, K., "Assessing the Accuracy of Remotely Sensed Data: Principles and Practices," Taylor and Francis, Boca Raton, FL, USA, (2009).

[21] Govender, M., Dye, P.J., et al., "Review of commonly used remote sensing and ground-based technologies to measure plant water stress," Water SA, Vol. 35, No. 5, pp. 741–752, (2009).

[22] Lindsey, A. J., Steinke, K., et al., "Relationship between DGCI and SPAD values to corn grain yield in the eastern corn belt," Crop, Forage, and Turf Management, (2016).

[23] Hunt, E. R. and Daughtry, C. S. T., "Chlorophyll meter calibrations for chlorophyll content using measured and simulated leaf transmittances," Agronomy Journal, Vol. 106, No. 3, pp. 931–939, (2014).

[24] Fox, R. H. and Walthall, C. L., "Crop monitoring technology to assess nitrogen status," Nitrogen in Agricultural Systems, American Society of Agronomy, Agronomy Monograph No. 49, pp. 647–674, (2008).

[25] Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., and Wojna, Z., "Rethinking the Inception architecture for computer vision," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, (2016).

[26] Ioffe, S. and Szegedy, C., "Batch normalization: accelerating deep network training by reducing internal covariate shift," Proceedings of the 32nd International Conference on Machine Learning, Lille, France, (2015).

[27] Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R., "Dropout: a simple way to prevent neural networks from overfitting," Journal of Machine Learning Research, Vol. 15, pp. 1929–1958, (2014).

[28] Kingma, D. and Ba, J., "Adam: a method for stochastic optimization," International Conference on Learning Representations, (2015).

© 2017 Society of Photo-Optical Instrumentation Engineers (SPIE).
Subodh Bhandari, Amar Raheja, Robert L. Green, Dat Do, "Towards collaboration between unmanned aerial and ground vehicles for precision agriculture", Proc. SPIE 10218, Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II, 1021806 (8 May 2017); doi: 10.1117/12.2262049; http://dx.doi.org/10.1117/12.2262049
KEYWORDS: Unmanned aerial vehicles, Sensors, Agriculture, Cameras, RGB color model, Nitrogen, Soil science
