The analysis tools of traditional optical systems, such as modulation transfer functions, point spread functions, and resolution test charts, are often insufficient when analyzing computational imaging systems. Computational imaging systems benefit from the combined use of optics and electronics to accomplish a given imaging or system task. In traditional optical systems the goal is essentially to form images that precisely depict a given object. Electronics are not required to form clear images, though they may be used to analyze them. In computational imaging systems, specialized images are formed by generalized aspheric optical elements that are jointly optimized with the electronic processing. The specialized images formed at a detector are not necessarily clear images; electronic processing removes the image blur or otherwise forms a final image. Computational imaging systems offer increased performance and decreased size, weight, and cost compared with traditional optical systems.
The Ambiguity Function (AF), traditionally used for the design of radar waveforms, plays an important role in computational imaging systems. The AF provides a concise analysis of the optical transfer functions of imaging systems over defocus. The Wigner Distribution (WD), traditionally used for the design of time-varying systems, is related to the AF and provides a concise analysis of the point spread functions (PSF) of imaging systems over defocus. We will describe the relationships and utility of these functions to computational imaging systems.
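As a rough illustration of the defocus analysis that the AF summarizes, the following sketch (a 1-D pupil in normalized coordinates, defocus expressed as a quadratic pupil phase in radians; all values illustrative, not taken from the paper) tabulates the family of OTFs over defocus:

```python
import numpy as np

# 1-D generalized pupil over normalized coordinate x in [-1, 1]
N = 512
x = np.linspace(-1, 1, N)

def defocused_otf(psi):
    """OTF as the normalized autocorrelation of the pupil carrying a
    quadratic defocus phase psi * x**2 (psi in radians of defocus)."""
    p = np.exp(1j * psi * x**2)
    acf = np.correlate(p, p, mode="full")  # conjugates the 2nd argument
    return acf / np.abs(acf).max()

# The family of OTFs over defocus -- in essence, the slices that the AF
# collects into a single two-dimensional map
otfs = [defocused_otf(psi) for psi in (0.0, 5.0, 10.0)]
```

Stacking these slices over many defocus values reproduces, in essence, the information the AF presents at a glance in one two-dimensional map.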
Proc. SPIE. 5784, Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XVI
KEYWORDS: Long wavelength infrared, Signal to noise ratio, Optical design, Imaging systems, Sensors, Interference (communication), Wavefronts, Signal processing, Modulation transfer functions, Systems modeling
In a long wave infrared (LWIR) system there is a need to capture the maximum amount of information about objects over a broad volume for identification and classification by a human or machine observer. In a traditional imaging system the optics limit the capture of this information to a narrow object volume. This limitation can hinder the observer's ability to navigate and/or identify friend or foe in combat or civilian operations; giving the observer a larger volume of clear imagery markedly improves their ability to perform these tasks. The system presented allows the efficient capture of object information over a broad volume and is enabled by a technology called Wavefront Coding. A Wavefront Coded system employs the joint optimization of the optics, detection, and signal processing. Through a specialized design of the system's optical phase, the system becomes invariant to the aberrations that traditionally limit the effective volume of clear imagery. In the process of becoming invariant, the specialized phase creates a uniform blur across the detected image. Signal processing is applied to remove the blur, resulting in a high quality image. A device-specific noise model is presented that was developed for the optimization and accurate simulation of the system. Additionally, still images taken from a video feed from the as-built system are shown, allowing the side-by-side comparison of a Wavefront Coded and a traditional imaging system.
Computational imaging systems are modern systems that consist of generalized aspheric optics and image processing capability. These systems can be optimized to greatly increase performance beyond that of systems consisting solely of traditional optics. Computational imaging technology can be used to advantage in iris recognition applications. A major difficulty in current iris recognition systems is a very shallow depth-of-field that limits system usability and increases system complexity. We first review some current iris recognition algorithms, and then describe computational imaging approaches to iris recognition using cubic phase wavefront encoding. These new approaches can greatly increase the depth-of-field over that possible with traditional optics, while keeping sufficient recognition accuracy. In these approaches the combination of optics, detectors, and image processing all contribute to the iris recognition accuracy and efficiency. We describe different optimization methods for designing the optics and the image processing algorithms, and provide laboratory and simulation results from applying these systems, including results on restoring the intermediate phase-encoded images using both direct Wiener filtering and iterative conjugate-gradient methods.
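As a hedged sketch of why cubic phase encoding relaxes the depth-of-field constraint, the following 1-D model (arbitrary phase units; the cubic strength alpha and defocus psi are illustrative choices, not design values from the paper) compares how much the MTF changes with defocus with and without the encoding phase:

```python
import numpy as np

N = 256
x = np.linspace(-1, 1, N)

def mtf(alpha, psi):
    """MTF = |autocorrelation| of a 1-D generalized pupil carrying the
    cubic encoding phase alpha * x**3 plus defocus psi * x**2."""
    p = np.exp(1j * (alpha * x**3 + psi * x**2))
    m = np.abs(np.correlate(p, p, mode="full"))
    return m / m.max()

def focus_sensitivity(alpha, psi=10.0):
    """RMS change of the MTF between best focus and defocus psi."""
    return np.sqrt(np.mean((mtf(alpha, 0.0) - mtf(alpha, psi))**2))

# The coded pupil (alpha = 90) should change far less with focus than
# the traditional one (alpha = 0), at the price of lower overall MTF.
sensitivity_plain = focus_sensitivity(0.0)
sensitivity_coded = focus_sensitivity(90.0)
```

The lowered but stable MTF of the coded system is what the single, defocus-independent restoration filter then boosts back up.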
Telescope performance is often limited by aberrations and/or fabrication and alignment errors. Additionally, image formation in large space-based systems is sensitive to changes in physical form parameters such as temperature-related deformations, mirror structure, piston position, and detector alignment. Changes in these parameters significantly degrade image quality and often limit the performance of the system. A fundamental new technology called Wavefront Coding has been successfully demonstrated via simulations for large space-based imaging systems, promising to surpass the performance attained by traditional optical designs. Wavefront Coding uses specialized aspheric optics and signal processing of the detected image to correct defocus-like aberrations, thereby enabling a new paradigm in aberration balancing for telescope systems. Wavefront Coding can provide dramatic new mission capabilities by allowing space-based imaging systems that are simpler, lighter, and cheaper, while also providing high quality imagery in dynamic environments that are difficult or impossible for traditional imaging systems. As an example, two systems are presented that allow the telescope to repoint the boresight through the actuation of the primary segments or through the use of a scan mirror. Traditional systems with the same goal of repointing the boresight have historically not been feasible due to either the increased error space or constraints on system cost and complexity.
A long wave infrared (LWIR) computational imaging system has been designed and fabricated that has a decreased hyperfocal distance compared to traditional optics. Through the combination of aspheric optics and signal processing, the near point with clear imagery has been reduced from 50 m to less than 10 m. Both systems deliver high quality imaging when the object is at infinity. The decrease in the hyperfocal distance was realized through the use of Wavefront Coding, a technology in which all system components are jointly optimized. The system components include the optics, detector, and signal processing. System optimization incorporates optical/digital constraints such as manufacturability, cost, signal processing architecture, FPA characteristics, etc. Through a special design of the system's optical phase, the system becomes invariant to the aberrations that traditionally limit the effective operational range. In the process of becoming invariant, the specialized phase creates a uniform blur across the detected image. Signal processing is applied to remove the blur, resulting in a high quality image. In this paper imagery from the Wavefront Coded system is described and compared to traditional imagery.
Proc. SPIE. 5407, Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XV
KEYWORDS: Long wavelength infrared, Signal to noise ratio, Point spread functions, Optical design, Imaging systems, Spatial frequencies, Sensors, Wavefronts, Signal processing, Modulation transfer functions
By jointly optimizing the design of optics, mechanics, and electronics, systems with reduced size, weight, and cost can be realized. This joint optimization increases the system trade space compared to systems in which each component is optimized separately. Increasing the size of the system trade space allows highly customized system design. An example of joint optimization is given for a LWIR imaging system with a conformal first surface. This example demonstrates an approximately 50% reduction in size, weight, and cost compared to acceptable traditional system solutions.
Understanding signal and noise quantities in any practical computational imaging system is critical. Knowledge of the imaging environment, optical parameters, and detector sensitivity determines the signal quantities, but noise quantities are often assumed to be independent of the signal and either uniform or additive Gaussian. These simplistic noise models do not accurately model actual detectors, and accurate noise models are needed in order to design optimal systems. We describe a noise model for a modern APS CMOS detector and a number of noise sources to be measured. A method for characterizing the noise sources given a set of dark images and a set of flat field images is outlined. The noise characterization data is then used to simulate dark images and flat field images. The simulated data is a very good match to the real data, validating the model and characterization procedure.
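A minimal sketch of the kind of signal-dependent model the paper argues for, assuming a simple Poisson-plus-read-noise detector with an illustrative conversion gain (the parameter values are not measurements of any particular APS CMOS device):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_frame(mean_electrons, gain=0.5, read_e=5.0, shape=(64, 64)):
    """Signal-dependent noise model: Poisson shot noise on the
    photo-electrons plus Gaussian read noise, scaled to digital
    numbers (DN) by the conversion gain. Values are illustrative."""
    e = rng.poisson(mean_electrons, size=shape).astype(float)
    e += rng.normal(0.0, read_e, size=shape)
    return gain * e  # DN

# Dark frames isolate the signal-independent read noise
dark = simulate_frame(0.0)

# Flat fields at several levels trace the photon-transfer curve:
# var(DN) ~= gain * mean(DN) + (gain * read_e)**2
means, variances = [], []
for level in (200, 500, 1000, 2000, 5000):
    flat = simulate_frame(level)
    means.append(flat.mean())
    variances.append(flat.var())

gain_est = np.polyfit(means, variances, 1)[0]  # slope estimates the gain
```

The same photon-transfer idea runs in reverse during characterization: fitting variance against mean over real flat fields yields the conversion gain, and the dark frames yield the read noise.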
A new methodology, called Wavefront Coding, allows the joint optimization of the optics, mechanics, detection, and signal processing of computational imaging systems. This methodology gives the system designer access to a large design trade space, which can be exploited to design imaging systems that image with high quality while using fewer physical components, less weight, and lower cost than traditional optics. This methodology is described through an example conformal single-lens IR imaging system. The example system demonstrates a 50% reduction in physical components and an approximately 45% reduction in weight compared to a traditional two-lens system.
Wavefront Coded imaging systems are jointly optimized optical and digital imaging systems that can increase the performance and/or reduce the cost of modern imaging systems by reducing the effects of aberrations. Aberrations that can be controlled through Wavefront Coding include misfocus, astigmatism, field curvature, chromatic aberration, temperature-related misfocus, and assembly-related misfocus. The design and simulation of these systems are based on a model that describes all of the important aspects of the optics, detector, and digital processing being used. These models allow theoretical calculation of ideal MTFs and signal-processing-related parameters for new systems. These parameters are explored for extended depth of field, field curvature, and temperature-related misfocus effects.
In this paper a fast, wide-angle, two-element Wavefront Coded zoom lens is discussed. A traditional optics-only fast wide-angle two-element zoom system cannot provide good imaging quality over a large field of view. Wavefront Coding techniques allow us to correct misfocus aberrations, simplifying the optical system design while also providing good imaging quality. Imaging with an optical/digital system requires digital image processing that removes the spatial effect of Wavefront Coding.
We describe a logarithmic phase filter to extend the depth of field of incoherent optical hybrid imaging systems with a rectangular aperture. By introducing this filter at the system's exit pupil and digitally processing the detector's output, we were able to extend the depth of field by an order of magnitude more than the Hopkins defocus criterion.
Proc. SPIE. 4041, Visual Information Processing IX
KEYWORDS: Point spread functions, Digital signal processing, Imaging systems, Digital filtering, Silicon, Wavefronts, Digital imaging, Modulation transfer functions, Optical mounts, Temperature metrology
Many of the limitations of traditional optics-only imaging systems can be eliminated with jointly optimized optical and digital imaging systems. Jointly optimized optical and digital imaging systems exploit the complementary aspects of optics and digital signal processing to form systems with characteristics not possible with traditional optics-only systems. For example, in traditional imaging systems light gathering and large depth of field are competing goals and are inversely related. In optimized optical/digital imaging systems, by contrast, light gathering and large depth of field can be independent parameters. Instead of requiring a small aperture to produce a large depth of field, a large aperture and a large depth of field are both possible and practical. We call jointly optimized optical and digital imaging systems Wavefront Coded imaging systems. Concepts of Wavefront Coding are illustrated below through an athermalized, refractive, silicon/germanium IR imaging system with aluminum optical mounts subject to an ambient temperature range of -20 °C to +70 °C.
Proc. SPIE. 3779, Current Developments in Optical Design and Optical Engineering VIII
KEYWORDS: Code division multiplexing, Optical components, Optical transfer functions, Point spread functions, Digital signal processing, Image compression, Digital image processing, Imaging systems, Wavefronts, Modulation transfer functions
This paper gives a brief introduction to the background, application, and design of Wavefront Coding imaging systems. Wavefront Coding is a general technique of using generalized aspheric optics and digital signal processing to greatly increase the performance and/or reduce the cost of imaging systems. The type of aspheric optics employed results in optical imaging characteristics that are very insensitive to misfocus-related aberrations. A sharp and clear image is not directly produced by the optics; however, digital signal processing applied to the sampled image produces a sharp and clear final image that is also insensitive to misfocus-related aberrations. This paper gives an overview of Wavefront Coding and example images related to the two applications of machine vision/label reading and biometric imaging. Design techniques for Wavefront Coding differ from those of traditional imaging system design, since both the optics and the digital processing characteristics of the system are jointly optimized for optimum system performance.
We have developed a completely new type of optical antialiasing filter that offers both high performance and low cost. This type of filter can sufficiently attenuate all spatial frequencies beyond the CCD detector bandlimit. Antialiasing filters currently in use typically just null out very narrow bands of spatial frequencies.
The traditional method of improving the depth-of-field of an imaging system is to stop down the aperture, at the cost of light-gathering power and resolution. We present a modified CCD camera system which achieves exceptionally high depth-of-field without stopping down the aperture. A special phase mask placed near the lens modifies the incoming wavefront, making it nearly invariant to the state of focus. The resulting image has reduced contrast but still contains complete object information. Straightforward post-processing is then used to regain image contrast. In effect, we trade image SNR rather than aperture size to obtain high depth-of-field.
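A minimal 1-D sketch of the encode/decode idea, assuming a cubic phase mask (one common choice for such a mask; the mask strength, defocus value, and regularization constant are illustrative, not design values from the paper):

```python
import numpy as np

N = 256
x = np.linspace(-1, 1, N)

def coded_psf(psi, alpha=90.0):
    """1-D PSF of a pupil carrying the cubic mask phase alpha * x**3
    plus a defocus phase psi * x**2 (arbitrary phase units)."""
    p = np.exp(1j * (alpha * x**3 + psi * x**2))
    h = np.abs(np.fft.fft(p, 4 * N))**2
    return h / h.sum()

# Sparse test scene imaged through the defocused, coded system: the
# intermediate image has low contrast but retains object information.
obj = np.zeros(4 * N)
obj[[200, 400, 650]] = 1.0
img = np.real(np.fft.ifft(np.fft.fft(obj) * np.fft.fft(coded_psf(psi=8.0))))

# Post-processing: a single Wiener-style filter built from the in-focus
# PSF regains contrast even though the actual defocus is unknown,
# because the coded PSF barely changes with focus.
H = np.fft.fft(coded_psf(psi=0.0))
W = np.conj(H) / (np.abs(H)**2 + 1e-3)
restored = np.real(np.fft.ifft(np.fft.fft(img) * W))
```

The regularization constant in the filter denominator is where the SNR-for-depth-of-field trade appears: it limits how far the attenuated spatial frequencies, and the noise riding on them, are amplified.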
Proc. SPIE. 2730, Second Iberoamerican Meeting on Optics
KEYWORDS: Monochromatic aberrations, Optical transfer functions, Digital signal processing, Imaging systems, Spatial frequencies, Image restoration, Control systems, Image quality, Digital imaging, Focus stacking
We modify the optical transfer function of an incoherent imaging system such that it is nearly invariant to misfocus and contains no zero values. Digital filtering of the resulting image restores it to near diffraction-limited spatial resolution. A system that is insensitive to misfocus is also insensitive to some lens aberrations. Consequently, combining optical and digital signal processing extends the normal aberration balancing of lens design to include those aberrations that can be reduced by digital signal processing, such as spherical aberration.
Proc. SPIE. 2537, Novel Optical Systems Design and Optimization
KEYWORDS: Optical components, Diffraction, Optical transfer functions, Point spread functions, Imaging systems, Spatial frequencies, Sensors, Digital filtering, Digital imaging, Modulation transfer functions
We provide experimental verification of the performance of an optical-digital imaging system that delivers near diffraction limited imaging over a wide depth of field. A custom aspheric optical element is used to modify an incoherent optical system so that the generated optical image is nearly independent of misfocus-induced blur. The resulting image, called an intermediate image, is not spatially diffraction limited. Digital processing of the intermediate image produces a final image that forms a close approximation to the diffraction limited image. The combined effect of the optical-digital system is to image objects independently of focus or range, that is, the system has an extended depth of field.
A system is described that will give the range to different portions of a scene. The optical system is modified such that the optical transfer function has periodic nulls, with the period of the nulls depending on the axial distance within the volume of the 3D scene. These same nulls appear in the spatial frequency spectrum of the images of objects that are at different distances within the volume of the scene. Estimation of the period of the nulls in the spatial frequency spectrum of the image then gives the ranges to different objects within the scene. A similar optical/digital processing technique is used to extend the depth of field of the composite system so that it will work over larger changes in range.
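A toy 1-D model of the ranging idea, assuming for simplicity a two-impulse PSF whose separation stands in for the defocus-dependent quantity that sets the null period (all values illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
M = 2048
scene = rng.normal(size=M)  # white-noise stand-in for scene texture

def capture(d):
    """Image through a pupil whose PSF is a two-impulse pair separated
    by d samples; the separation grows with the object's defocus, which
    puts periodic nulls of period M/d into the image spectrum."""
    psf = np.zeros(M)
    psf[0] = psf[d] = 0.5
    return np.real(np.fft.ifft(np.fft.fft(scene) * np.fft.fft(psf)))

def estimate_separation(img):
    """Periodic spectral nulls of period M/d are equivalent to an echo
    at lag d in the image autocorrelation; locate that peak (skipping
    the zero-lag peak) to recover d, and hence the range."""
    ac = np.real(np.fft.ifft(np.abs(np.fft.fft(img))**2))
    return 5 + int(np.argmax(ac[5 : M // 2]))

d_hat = estimate_separation(capture(37))
```

Estimating the null period per image region, rather than globally, is what turns this into a range map over the 3D scene.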