Flight data recorders (FDRs) play a critical role in determining the root causes of military aviation mishaps. Some United States Air Force (USAF) aircraft record only limited information during flight (e.g., the T-1 Jayhawk), while others carry no FDR on board at all (e.g., the B-52 Stratofortress). This study explores the use of image-based flight data recording to overcome a lack of digitally recorded FDR data. In this work, images of simulated cockpit gauges were unwrapped vertically, and 2-D cross-correlation was performed between each unwrapped gauge image and a template of the unwrapped gauge needle. Points of high correlation between the two images were used to locate the gauge needle, and interpolation and extrapolation (based on known pixel locations of gauge tick marks) were performed to quantify the value to which the needle pointed. Results suggest that image-based flight data recording could provide key support to USAF mishap investigations when aircraft lack sufficient FDR data.
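The needle-reading pipeline summarized above can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the image dimensions, needle template shape, and tick-mark columns below are invented for the example.

```python
import numpy as np
from scipy.signal import correlate2d

def locate_needle(unwrapped, template):
    """Return the column of peak 2-D cross-correlation between the
    unwrapped gauge image and the needle template."""
    # Subtract the means so the correlation peaks at the best structural
    # match rather than simply at the brightest region.
    corr = correlate2d(unwrapped - unwrapped.mean(),
                       template - template.mean(), mode="same")
    _, col = np.unravel_index(np.argmax(corr), corr.shape)
    return col

def column_to_value(col, tick_cols, tick_vals):
    """Map a pixel column to a gauge reading via a linear fit through the
    known tick-mark columns; a fit (rather than np.interp) also supports
    extrapolation beyond the outermost ticks."""
    slope, intercept = np.polyfit(tick_cols, tick_vals, 1)
    return slope * col + intercept

# Synthetic unwrapped gauge: 20 x 100 pixels, a 3-px-wide needle at column 40.
img = np.zeros((20, 100))
img[:, 39:42] = 1.0
# Needle template: a bright 3-px stripe flanked by dark background.
tmpl = np.zeros((20, 5))
tmpl[:, 1:4] = 1.0

col = locate_needle(img, tmpl)
# Hypothetical tick marks: columns 10, 50, 90 read 0, 100, 200 on the gauge.
reading = column_to_value(col, [10, 50, 90], [0.0, 100.0, 200.0])
```

With the needle centered at column 40 and evenly spaced ticks, the fit returns a reading of 75.0, i.e., three-quarters of the way between the first two tick values.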
Because unmanned air vehicles (UAVs) are emerging as an indispensable image acquisition platform in precision agriculture, it is vitally important that researchers understand how to optimize UAV camera payloads for analysis of surveyed areas. In this study, imagery captured by a Nikon RGB camera mounted on a Precision Hawk Lancaster was used to survey an agricultural field from six altitudes ranging from 45.72 m (150 ft.) to 121.92 m (400 ft.). After the imagery was collected, two software packages (MeshLab and AgiSoft) were used to measure predetermined reference objects within six three-dimensional (3-D) point clouds (one per altitude scenario). In-silico measurements were then compared to actual reference object measurements recorded with a tape measure, and the deviations were recorded as Δx, Δy, and Δz. The average measurement deviation in each coordinate direction was then calculated for each of the six flight scenarios. Comparing results from MeshLab and AgiSoft offered insight into the effectiveness of GPS-defined point cloud scaling versus user-defined point cloud scaling. In three of the six flight scenarios flown, MeshLab's 3-D imaging software (user-defined scale) was able to measure object dimensions of 50.8 to 76.2 cm (20-30 in.) with greater than 93% accuracy. The largest average deviation from actual measurements in any flight scenario was 14.77 cm (5.82 in.). Analysis of the point clouds in AgiSoft (GPS-defined scale) yielded even smaller Δx, Δy, and Δz values than the MeshLab measurements in over 75% of the flight scenarios. This level of precision is satisfactory for a wide variety of precision agriculture applications focused on differentiating and identifying objects in remote imagery.
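The deviation bookkeeping described above (per-axis differences between in-silico and tape-measure values, averaged per flight scenario) can be sketched as follows. The object coordinates used here are illustrative values, not data from the study.

```python
def average_deviations(measured, actual):
    """Average absolute per-axis deviation between in-silico measurements
    and ground-truth tape-measure values.

    measured, actual: lists of (x, y, z) tuples in cm, one entry per
    reference object in a single flight scenario.
    Returns the scenario's average (|dx|, |dy|, |dz|).
    """
    n = len(measured)
    sums = [0.0, 0.0, 0.0]
    for m, a in zip(measured, actual):
        for i in range(3):
            sums[i] += abs(m[i] - a[i])
    return tuple(s / n for s in sums)

# Two hypothetical reference objects from one altitude scenario (cm).
measured = [(52.1, 75.0, 30.4), (50.2, 77.5, 29.8)]
actual   = [(50.8, 76.2, 30.5), (50.8, 76.2, 30.5)]
dx, dy, dz = average_deviations(measured, actual)
```

For these example values the averages come out to Δx = 0.95 cm, Δy = 1.25 cm, and Δz = 0.40 cm; repeating the calculation per altitude yields the per-scenario averages compared in the abstract.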