The NGA has initiated a program to evaluate high-resolution commercial sensors developed by Space Imaging, DigitalGlobe, Orbimage, and others. Recent evaluations have involved QuickBird panchromatic products, including the Basic 1B, Ortho-Ready Standard 2A, orthorectified, and Basic Stereo Pair products. This paper presents the results of an additional geo-positional accuracy evaluation of multispectral QuickBird Ortho-Ready Standard 2A images. These products were compared to a globally distributed set of Ground Control Points (GCPs), and the calculated geo-positional accuracy was compared both to the published QuickBird specifications and to that of the panchromatic Ortho-Ready Standard 2A product. The results for the panchromatic and multispectral bands are compared, and the band-to-band registration of the multispectral arrays is evaluated.
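The accuracy comparison described above reduces to summary statistics over per-GCP horizontal errors. The abstract does not state which metrics were used; RMSE and CE90 are the conventional choices, so a minimal sketch (all names illustrative) might look like:

```python
import numpy as np

def horizontal_accuracy(measured, reference):
    """Summarize horizontal geolocation error of image-derived points
    against surveyed GCP coordinates (both (N, 2) easting/northing, meters)."""
    err = np.asarray(measured, float) - np.asarray(reference, float)
    radial = np.hypot(err[:, 0], err[:, 1])    # per-GCP radial error
    rmse = float(np.sqrt(np.mean(radial**2)))  # root-mean-square radial error
    ce90 = float(np.percentile(radial, 90))    # 90th-percentile circular error
    return rmse, ce90
```

Computed separately for the panchromatic and each multispectral band, the same routine can also support a band-to-band registration check by differencing per-band image coordinates of common points instead of GCPs.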
It is extremely difficult and expensive to determine the flight attitude and aimpoint of small, maneuvering miniature air vehicles from ground-based fixed or tracking photography. Telemetry alone cannot provide sufficient information bandwidth on what the ground tracker is seeing and, consequently, why it did or did not function properly. Additionally, it is anticipated that 'smart' and 'brilliant' guided vehicles now in development will require a high-resolution imaging support system to determine which target, and which part of a ground feature, is being used for navigation or targeting. Other requirements include support of sub-component separation from developmental supersonic vehicles, where clean separation from the container is not determinable from ground-based film systems and film cameras do not survive vehicle breakup and impact. Hence, the requirement is to develop and demonstrate an imaging support system for development and testing that can provide the flight-vehicle developer/analyst with imagery (combined with miniature telemetry sources) sufficient to recreate the trajectory, terminal navigation, and flight-termination events. This project is a development and demonstration of a real-time, launch-rated, shuttered, electronic imager, transmitter, and analysis system. The effort demonstrated boresighted imagery from inside small flight vehicles for post-flight analysis of trajectory, and capture of ground imagery during randomly triggered vehicle functions. The initial studies for this capability were accomplished by the Experimental Dynamics Section of the Air Force Wright Laboratory, Armament Directorate, Eglin AFB, Florida, and the Telemetry Support Branch of the Army Materiel Research and Development Center at Picatinny Arsenal, New Jersey.
It has been determined that at a 1/10,000-second exposure time, new ultra-miniature CCD sensors have sufficient sensitivity to image key ground-target features without blur, thereby providing data for trajectory, timing, and advanced sensor development. This system will be used for ground-tracking data reduction in support of small air vehicle and munition testing. It will provide a means of integrating the imagery and telemetry data from the item with ground-based photographic support. The technique we have designed will exploit off-the-shelf software and analysis components. A differential GPS survey instrument will establish a photogrammetric calibration grid throughout the range and reference targets along the flight path. Images from the on-board sensor will be used to calibrate the orthorectification model in the analysis software. The projectile images will be transmitted and recorded on several tape recorders to ensure complete capture of each video field. The images will be combined with a non-linear video editor into a time-correlated record, and each correlated video field will be written to video disk. The files will be converted to a DMA-compatible format and then analyzed to determine the projectile's altitude, attitude, and position in space. The resulting data file will be used to create a photomosaic of the ground the projectile flew over and the targets it saw. The data will then be transformed to a trajectory file and used to generate a graphic overlay that merges digital photo data of the range with the actual images captured. The plan is to superimpose the flight path of the projectile, the path of the weapon's aimpoint, and annotation of each internal sequence event. With the tools used to produce state-of-the-art computer graphics, we now believe it will be possible to reconstruct the test event from the viewpoint of the warhead, the target, and a 'God's-Eye' view looking over the shoulder of the projectile.
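The merge of the redundant tape-recorder streams into a single time-correlated record could be sketched as follows. This is a hypothetical illustration, assuming each recorder's fields have already been digitized into a timestamp-keyed mapping; a field dropped by one recorder is filled from another:

```python
def correlate_fields(*recorders):
    """Merge time-stamped video fields from redundant recorders into one
    time-ordered record; the first good copy of each field wins."""
    merged = {}
    for rec in recorders:                  # recorders in priority order
        for ts, field in rec.items():
            merged.setdefault(ts, field)   # fill dropouts from later recorders
    return [merged[ts] for ts in sorted(merged)]
```

The correlated sequence is then what would be written field-by-field to video disk for the downstream trajectory analysis.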
The overall accuracy of digital landcover products can often be improved through the use of fused imagery products generated by effective cross-sensor resolution-enhancement algorithms. This paper describes a process for fusing medium-resolution multispectral data, such as Landsat and SPOT, with National Aerial Photography Program (NAPP) photographs. The NAPP has a goal of providing coverage of the 48 contiguous United States every 10 years at high spatial resolution [i.e., 2-meter ground resolving distance (GRD)]. NAPP and the National High Altitude Photography (NHAP) program provide a wealth of current and historic high-resolution data for environmental and natural resource studies. Despite their comprehensive coverage and high spatial resolution, these images are often overlooked for large-scale computerized classification problems because: (1) they are photographic 'analog' data stored on film, not digital data on magnetic media; (2) comprehensive support data (e.g., aircraft x, y, z, roll, pitch, and yaw) are lacking; (3) they are not geocoded or orthorectified, and random aircraft motion combined with the sensor projection makes them difficult to georegister; and (4) their radiometric quality varies both within and between images. This paper describes a technique for merging NAPP/NHAP data with lower-resolution satellite data such as Landsat and SPOT that results in a fused image product with the high spatial resolution of the NAPP/NHAP data and the spectral quality of the satellite data. The technique permits users to exploit this higher-resolution data to improve the quality and accuracy of their landcover, change detection, stress analysis, or other remote sensing products. Specific published results show an improvement in overall accuracy from 79.4% correct classification using Landsat TM (25-meter GSD) alone to over 94.2% correct classification using higher-resolution (5-meter GSD) data.
We also discuss our future plans related to these techniques and their applications.
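The fusion algorithm itself is not detailed in this abstract; a common stand-in for this kind of spatial/spectral merge is a Brovey-style ratio transform, sketched below under the assumption that the multispectral bands have already been resampled onto the high-resolution photo grid:

```python
import numpy as np

def brovey_fuse(ms, pan):
    """ms: (bands, H, W) multispectral array resampled to the pan grid;
    pan: (H, W) high-resolution panchromatic intensity.
    Each band is scaled by the ratio of pan to the mean MS intensity,
    injecting spatial detail while preserving the band ratios."""
    intensity = ms.mean(axis=0)
    ratio = pan / np.where(intensity == 0, 1.0, intensity)
    return ms * ratio
```

The fused product would then feed the same classifier as the unfused imagery; the cited accuracy gain (79.4% to 94.2%) comes from the paper's own fusion process, not from this sketch.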