Thermal cameras have recently seen wide use on small Unmanned Aerial Systems (sUAS). By translating thermal energy into visible images and temperature readings, they allow a particular object to be analyzed remotely. Thermal imaging has great potential in agricultural applications: it can be used to estimate soil water status, schedule irrigation, estimate almond yields, assess crop water stress, and evaluate crop maturity. Despite this ability to measure temperature, some concerns remain about uncooled thermal cameras. Unstable outdoor environmental factors can cause serious measurement drift during flight missions, and post-processing steps such as mosaicking may introduce further measurement error. To address these two concerns, we conducted three experiments to identify best practices for thermal image collection. The thermal cameras used in this paper were ICI 9640 P-Series models, which are commonly used in many study areas; an Apogee MI-220 served as the ground truth. The first experiment determined how long the thermal camera needs to warm up to reach (or approach) thermal equilibrium and produce accurate data. The second tested different view angles to determine whether view angle has any effect on a thermal camera's measurements. The third examined whether stitching thermal images in Agisoft PhotoScan has any effect on the temperature data.
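The warm-up experiment can be illustrated with a minimal sketch. Everything here is an assumption for illustration: `camera_temps` stands in for uncooled-camera readings of a stable target taken at one-minute intervals, and `reference_temp` for the simultaneous reading of a reference radiometer such as the Apogee MI-220; the tolerance and window values are placeholders, not the study's actual criteria.

```python
def minutes_to_equilibrium(camera_temps, reference_temp, tol=0.5, window=5):
    """Return the first minute after which the camera stays within
    `tol` deg C of the reference for `window` consecutive readings,
    or None if it never stabilizes within the record."""
    run = 0
    for i, temp in enumerate(camera_temps):
        if abs(temp - reference_temp) <= tol:
            run += 1
            if run >= window:
                return i - window + 1  # first minute of the stable run
        else:
            run = 0  # drift exceeded tolerance; restart the run
    return None

# Hypothetical record: the camera drifts toward a 25.0 deg C target as it warms up.
readings = [27.4, 26.8, 26.3, 25.9, 25.6, 25.4, 25.3, 25.2, 25.1, 25.1, 25.0, 25.0]
print(minutes_to_equilibrium(readings, 25.0))  # → 5
```

Requiring a sustained run of in-tolerance readings, rather than a single one, guards against declaring equilibrium during a momentary dip in the drift curve.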
Thanks to the development of camera technologies and small unmanned aerial systems (sUAS), it is now possible to collect aerial images of fields with more flexible revisits, higher resolution, and much lower cost. Furthermore, the performance of object detection based on deep convolutional neural networks (CNNs) has improved significantly. In this study, we applied these technologies to melon production, using high-resolution aerial images to count melons in the field and predict the yield. A CNN-based object detection framework, Faster R-CNN, was applied to detect melons. Our results showed that sUAS combined with CNNs were able to detect melons accurately in the late harvest season.
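A hedged sketch of how detector output turns into a field count: the structure below mimics the boxes-plus-scores output of a Faster R-CNN-style model, but the data, threshold, and function name are illustrative assumptions, not the study's actual pipeline (which would also apply non-maximum suppression before counting).

```python
def count_melons(detections, score_threshold=0.8):
    """Count detections whose confidence score meets the threshold.

    `detections` is a list of ((x1, y1, x2, y2), score) pairs, as a
    Faster R-CNN-style detector might return for one aerial image.
    """
    return sum(1 for _box, score in detections if score >= score_threshold)

# Hypothetical output for a single image tile.
detections = [
    ((10, 10, 50, 50), 0.95),    # confident melon
    ((60, 20, 95, 60), 0.88),    # confident melon
    ((120, 40, 140, 70), 0.42),  # low confidence, filtered out
]
print(count_melons(detections))  # → 2
```

Summing such per-image counts across tiles of an orthomosaic (minus duplicates in tile overlaps) would give the field-level count used for yield prediction.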