Limiting resolution is a simple metric that describes the ability of an imaging system to distinguish small details of an object. It is normally estimated subjectively from the smallest resolvable group and element of a resolution target, such as the USAF 1951 target, or analytically from the modulation transfer function (MTF) of the system. Although limiting resolution has limitations, it provides a quick, low-complexity method of establishing the performance of an imaging system. Various factors affect limiting resolution, such as the optical performance of the system and both temporal and spatial sensor noise. Evaluating the resolution performance of full motion video (FMV) results in uncertainty in the limiting resolution due to the temporal variation of the system. In high-performance FMV systems, where the modulation associated with the limiting resolution is small, the limiting resolution can vary greatly from frame to frame. This paper explores how limiting resolution is measured, discusses the factors that affect its uncertainty in FMV systems, and provides real-world examples from airborne video.
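The analytical route described above, reading the limiting resolution off the MTF curve, can be sketched as follows. This is a minimal illustration, not the paper's procedure: the 3% contrast threshold and the Gaussian MTF are assumptions chosen for the example; in practice the threshold is set by the system's noise-equivalent modulation.

```python
import numpy as np

def limiting_resolution(freqs, mtf, threshold=0.03):
    """Estimate limiting resolution as the spatial frequency (same units
    as `freqs`) where the MTF first falls below a contrast threshold,
    using linear interpolation between the bracketing samples."""
    freqs = np.asarray(freqs, dtype=float)
    mtf = np.asarray(mtf, dtype=float)
    below = np.nonzero(mtf < threshold)[0]
    if below.size == 0:
        return freqs[-1]          # MTF never drops below the threshold
    i = below[0]
    if i == 0:
        return freqs[0]
    f0, f1 = freqs[i - 1], freqs[i]
    m0, m1 = mtf[i - 1], mtf[i]
    return f0 + (threshold - m0) * (f1 - f0) / (m1 - m0)

# toy Gaussian MTF; the 3% crossing lands near 22.5 cy/mrad
f = np.linspace(0, 40, 401)
m = np.exp(-(f / 12.0) ** 2)
res = limiting_resolution(f, m)
```

Frame-to-frame noise shifts the MTF curve near the threshold, which is exactly why the crossing point, and hence the limiting resolution, fluctuates in FMV.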
This paper examines the measurement of MTF from slant-edge targets in airborne imagery. The MTF is calculated by extracting the edge spread function from the slant edge, deriving the line spread function, and then performing an FFT to obtain the MTF. Because the characteristics of airborne imagery are not controlled, using edge targets to obtain the system-level MTF presents challenges. A method to calculate the MTF from edge targets in airborne imagery is proposed that normalizes the scan lines in the edge spread function and low-pass filters the result. An example using airborne imagery is shown and compared with analytical results and laboratory measurements. The paper also examines extracting the effects on the MTF of image blur from jitter, which is common in airborne imagery.
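The ESF → LSF → FFT chain described above can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's method: the Hanning window and the synthetic error-function edge are choices made for the example, and a real pipeline must first estimate the edge angle and register/normalize the scan lines onto a supersampled grid.

```python
import numpy as np
from math import erf

def mtf_from_esf(esf):
    """Slant-edge MTF sketch: differentiate the edge spread function (ESF)
    to get the line spread function (LSF), window the LSF to suppress
    noise at the tails, then take the FFT magnitude normalized to unity
    at zero frequency. Assumes a uniformly sampled ESF."""
    lsf = np.gradient(np.asarray(esf, dtype=float))
    lsf *= np.hanning(lsf.size)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# synthetic edge blurred by a Gaussian PSF (sigma = 2 pixels):
# the ideal ESF is then an error function
x = np.arange(-32, 32)
esf = np.array([0.5 * (1 + erf(v / (2 * np.sqrt(2)))) for v in x])
mtf = mtf_from_esf(esf)
```

With a 2-pixel Gaussian blur the recovered MTF is essentially zero at Nyquist, as expected for so wide a point spread function.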
Lenses for staring-array point-source detection sensors must maintain good signal-to-noise ratio (SNR) over fields of view often exceeding 100 degrees. Such lenses typically have f-θ distortion to provide constant solid angle sampling in object space. While the relative illumination calculation is often used to describe flux transfer from a Lambertian extended object for imaging applications, maximizing SNR for point-source detection depends primarily on maximizing collected irradiance at the entrance pupil, the shape of which can vary dramatically over field. We illustrate this field-dependent SNR calculation with an example lens and outline the calculations needed to derive a simple aberration-based expression for the field dependence of point-source SNR.
Long-range airborne full-motion-video systems require large apertures to maximize multiple aspects of system
performance, including spatial resolution and sensitivity. As systems push to larger apertures for increased resolution
and standoff range, both mounting constraints and atmospheric effects limit their effectiveness. This paper considers two
questions: first, under what atmospheric and spectral conditions does it make sense to have a larger aperture; second,
what types of optical systems can best exploit movement-constrained mounting? We briefly explore high-level
atmospheric considerations in determining sensor aperture size for various spectral bands, followed by a comparison
of the swept-volume-to-aperture ratio of Ritchey-Chrétien and three-mirror-anastigmat optical systems.
As airborne EO/IR imaging systems strive to achieve high-NIIRS full motion video (FMV) from ever longer standoff ranges, the challenges of conceptualizing, designing, and fielding such systems grow significantly. We present a heuristic framework for dissecting the "goodness" of an FMV multispectral sensor and examine the various components of a high-resolution sensor. Combining spatial, temporal, spectral, and "signal" resolution with system footprint size, weight, and power (SWaP) metrics allows deterministic tradeoffs between optical systems as well as between system architectures. We present example trade studies of optical architectures from disparate application fields in various SWaP-constrained environments for long-range imaging and evaluate how system parameters are intrinsically linked.
Performance models for infrared imaging systems require image quality parameters; optical design engineers need image quality design goals; and systems engineers develop image quality allocations against which to test imaging systems. It is a challenge to maintain consistency and traceability among these various expressions of image quality. We present a method and parametric tool for generating and managing expressions of image quality during the system modeling, requirements specification, design, and testing phases of an imaging system design and development project.
While the use of optics in the playback of music has been a tremendously successful technology and laser light shows are
a common occurrence, other intersections of optics and music tend to be less well known. Topics such as optics-based
instruments, performance tools and effects, instrument characterization and manufacturing, recording, playback, and
signal processing are explored.
Optical systems designed for some defense, environmental, and commercial remote-sensing applications must simultaneously have a high dynamic range, high sensitivity, and low noise-equivalent contrast. We have adapted James Janesick’s photon transfer technique to characterize the noise performance of an electron multiplication CCD (EMCCD), and we have developed methods for measuring its performance parameters in a lab environment. We have defined a new figure of merit that quantifies the usefulness of EMCCD imagers, complementing the traditionally used dynamic range. We use the results for EMCCDs to predict their performance in hyperspectral and multispectral imaging systems.
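The core of the photon transfer technique referenced above is a fit of signal variance against mean signal across flat-field exposure levels: in the shot-noise-limited region the slope gives the conversion gain and the intercept the read-noise floor. The sketch below assumes a simple shot-plus-read-noise model; the EM-register excess noise of an EMCCD is deliberately omitted, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def photon_transfer_gain(mean_dn, var_dn):
    """Photon-transfer sketch (after Janesick): with variance = mean / K
    in the shot-noise-limited region, a straight-line fit of variance
    vs. mean gives the conversion gain K (e-/DN) from the slope and the
    read-noise floor (DN, rms) from the intercept."""
    slope, intercept = np.polyfit(mean_dn, var_dn, 1)
    gain = 1.0 / slope
    read_noise_dn = np.sqrt(max(intercept, 0.0))
    return gain, read_noise_dn

# simulate flat fields at several exposure levels, true gain 2 e-/DN
true_gain, read_e = 2.0, 10.0
means, variances = [], []
for electrons in [500, 1000, 2000, 5000, 10000]:
    frame = rng.poisson(electrons, 100_000) + rng.normal(0, read_e, 100_000)
    dn = frame / true_gain
    means.append(dn.mean())
    variances.append(dn.var())
gain, rn = photon_transfer_gain(means, variances)
```

The fit recovers the simulated 2 e-/DN gain; on real EMCCD data the multiplication register's excess noise factor must be folded into the same analysis.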
Airborne surveillance presents challenging target-detection opportunities for optical remote sensors, especially under the size, weight, and power constraints imposed by small aircraft. We present a spatial-frequency-dependent figure of merit, the detective quantum efficiency (DQE), first tracing its origins in single-pixel photon-multiplication detectors, where it is shown to be the yield (quantum efficiency, or QE) divided by the noise factor. We then show the relationship of DQE to several well-known figures of merit. Finally, we broaden the definition of DQE to include spatial-frequency dependence through the MTF of the system and the noise power spectrum (NPS) of the detector. We then present the results of applying this DQE to a hyperspectral camera under development at BAE Systems Spectral Solutions LLC.
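One common way to combine QE, MTF, and NPS into a frequency-dependent DQE is sketched below. Normalization conventions vary, so this is an illustrative form, not necessarily the paper's exact expression; the toy MTF and the excess noise factor of 2 are assumptions made for the example.

```python
import numpy as np

def dqe(qe, mtf, nps, nps_shot):
    """Frequency-dependent DQE sketch: QE degraded by the squared MTF
    and by the ratio of the measured noise power spectrum to the ideal
    shot-noise power spectrum. With flat spectra this reduces at zero
    frequency to QE divided by the noise factor, matching the
    single-pixel definition quoted in the abstract."""
    return qe * np.asarray(mtf, float) ** 2 * nps_shot / np.asarray(nps, float)

f = np.linspace(0.0, 0.5, 6)          # spatial frequency, cy/pixel
mtf = np.exp(-(f / 0.3) ** 2)         # toy system MTF
excess = 2.0                          # assumed excess noise factor
dqe_f = dqe(0.9, mtf, excess * np.ones_like(f), 1.0)
```

At zero frequency this gives 0.9 / 2 = 0.45, and the MTF-squared term drives the DQE down toward Nyquist, which is the spatial-frequency broadening the abstract describes.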
Science and Technology International (STI) has developed a six-band multispectral imager optimized for surf-zone reconnaissance and mine countermeasures (MCM). Airborne surf-zone MCM requires both accurate spectral imaging and high spatial resolution. Vibration and aircraft motion degrade the image quality; however, weight, volume, and power constraints preclude stabilized operation of the cameras. Thus, the MTF needs to be measured in flight to ensure it meets the resolution requirements. We apply the slanted-edge MTF method to the in-flight characterization of airborne high-resolution cameras, analyzing images of orthogonal slanted edges to estimate the motion and vibration contributions to the MTF, and show that the system exceeds the resolution requirements for surf-zone MCM. We also develop a methodology for scaling to other altitudes and speeds, and show that the system will perform well throughout its operational envelope. The slanted-edge method is more accurate and reproducible than the alternative of placing MTF bar targets under the aircraft flight path. Further, the slanted-edge targets are easier to deploy and recover, and they relax the navigation tolerances.
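The orthogonal-edge idea above can be sketched in a few lines. Under the assumptions that the static (optics plus detector) MTF is isotropic and that linear motion smears only the along-track edge, the ratio of the two measured MTFs estimates the motion/vibration contribution; this is an illustrative reconstruction, not STI's exact analysis.

```python
import numpy as np

def motion_mtf(mtf_along_track, mtf_cross_track, floor=0.05):
    """Estimate the motion MTF as the ratio of the along-track to the
    cross-track measured MTF, evaluated only where the denominator is
    above a noise floor; elsewhere the estimate is undefined (NaN)."""
    along = np.asarray(mtf_along_track, float)
    cross = np.asarray(mtf_cross_track, float)
    out = np.full_like(along, np.nan)
    valid = cross > floor
    out[valid] = np.clip(along[valid] / cross[valid], 0.0, 1.0)
    return out

# toy example: Gaussian static MTF, sinc motion MTF from a 2-pixel smear
f = np.linspace(0, 0.5, 51)              # cy/pixel
static = np.exp(-(f / 0.35) ** 2)
motion = np.abs(np.sinc(2.0 * f))
est = motion_mtf(static * motion, static)
```

On this noise-free toy data the ratio recovers the sinc-shaped motion MTF exactly; with real edges, the noise floor guards against amplifying noise where the cross-track MTF is small.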
The design, operation, and performance of the fourth generation of Science and Technology International's Advanced Airborne Hyperspectral Imaging Sensors (AAHIS) are described. These imaging spectrometers cover 390–840 nm with a variable bandwidth. Three-axis image stabilization provides spatially and spectrally coherent imagery by damping most of the airborne platform's random motion. A wide 40-degree field of view coupled with sub-pixel detection allows a large area coverage rate. A software-controlled variable aperture, spectral shaping filters, and high-quantum-efficiency, back-illuminated CCDs contribute to the excellent sensitivity of the sensors. AAHIS sensors have been operated on a variety of fixed- and rotary-wing platforms, achieving ground-sampling distances ranging from 6.5 cm to 2 m. While these sensors have been designed primarily for use over littoral zones, they are able to operate over both land and water. AAHIS has been used for detecting and locating submarines, mines, tanks, divers, camouflage, and disturbed earth. Civilian applications include search and rescue on land and at sea, agricultural analysis, environmental time series, coral reef assessment, effluent plume detection, coastal mapping, damage assessment, and seasonal whale population monitoring.