In this article we study the use of the cubic-phase pupil function to extend the depth of field of task-based imaging systems. In task-based design problems the resolution of interest varies with object distance because the magnification changes, which introduces a new challenge into the design process. We discuss how the optimal design criterion for task-based imaging systems differs fundamentally from that for visual imaging systems, and we formulate the corresponding optimization problem. We discuss how the cubic-phase pupil function shapes the spectral signal-to-noise ratio (SNR) and modulation transfer function (MTF) over the depth-of-field range so as to meet our design requirements. We introduce an approximation to the SNR-maximization problem and show that it is amenable to analytic treatment. We derive an explicit expression for the optimized cubic-phase pupil function parameters for a general problem of this class, thus establishing an upper bound on the extension of the depth of field achievable with cubic-phase Wavefront Coding.
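The central property exploited above can be illustrated numerically: a strong cubic term in the pupil phase makes the MTF nearly invariant to defocus. The following is a minimal 1-D sketch (the function name, the coefficient values, and the simple autocorrelation-based OTF model are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def cubic_phase_mtf(alpha, psi, n=256):
    """1-D MTF of a generalized pupil exp(i*(alpha*x^3 + psi*x^2)),
    where alpha is the cubic-phase strength and psi the defocus
    parameter. The OTF is modeled as the normalized autocorrelation
    of the pupil function (illustrative helper, not from the paper)."""
    x = np.linspace(-1.0, 1.0, n)
    pupil = np.exp(1j * (alpha * x**3 + psi * x**2))  # cubic mask + defocus
    otf = np.correlate(pupil, pupil, mode="full")     # autocorrelation over lag
    return np.abs(otf) / np.abs(otf).max()            # normalize to MTF

# With a large cubic coefficient the MTF barely changes under defocus:
m_focus = cubic_phase_mtf(alpha=30.0, psi=0.0)
m_defocus = cubic_phase_mtf(alpha=30.0, psi=10.0)
```

Comparing the same two defocus values with `alpha=0` (a clear aperture) shows a much larger MTF change at mid spatial frequencies, which is the behavior the optimization above trades against the in-focus MTF height.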
The human iris is an attractive biometric due to its high discrimination capability. However, capturing good-quality images of human irises is challenging and requires considerable user cooperation. Iris capture systems with a large depth of field, a large field of view, and excellent light-capturing capacity can help considerably in such scenarios. In this paper we apply Wavefront Coding to increase the depth of field of an iris recognition system, without increasing the optical F/#, when the subject is at least 2 meters away. This computational imaging system is designed and optimized using the spectral SNR as the fundamental metric. We present simulation and experimental results that show the benefits of this technology for biometric identification.
The analysis tools of traditional optical systems, such as modulation transfer functions, point spread functions, and resolution test charts, are often insufficient for analyzing computational imaging systems. Computational imaging systems benefit from the combined use of optics and electronics to accomplish a given imaging or system task. In traditional optical systems the goal is essentially to form images that precisely depict a given object; electronics are not required to form clear images, though they may be required to analyze them. In computational imaging systems, specialized images are formed by generalized aspheric optical elements that are jointly optimized with the electronic processing. The specialized images formed at a detector are not necessarily clear images; electronic processing is used to remove the image blur or otherwise form a final image. Computational imaging systems offer increased performance and decreased size, weight, and cost relative to traditional optical systems.
The Ambiguity Function (AF), traditionally used for the design of radar waveforms, plays an important role in computational imaging systems. The AF provides a concise analysis of the optical transfer functions of imaging systems over defocus. The Wigner Distribution (WD), traditionally used for the analysis of time-varying signals, is related to the AF and provides a concise analysis of the point spread functions (PSFs) of imaging systems over defocus. We describe the relationships of these functions to computational imaging systems and their utility therein.
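The reason the AF summarizes behavior over defocus is that the OTF at each defocus value appears as a radial slice of the AF through the origin. A minimal discrete sketch (the function name and the simple FFT-based discretization are illustrative assumptions):

```python
import numpy as np

def ambiguity_function(pupil):
    """Discrete ambiguity function of a 1-D pupil function.
    Row k corresponds to lag u = k - (n-1); columns are the Fourier
    variable. Slices through the origin at different slopes give the
    OTF at different amounts of defocus; the zero-slope slice
    (column 0) is the in-focus OTF (pupil autocorrelation)."""
    n = len(pupil)
    af = np.zeros((2 * n - 1, n), dtype=complex)
    for k in range(2 * n - 1):
        lag = k - (n - 1)
        prod = np.zeros(n, dtype=complex)
        lo, hi = max(0, lag), min(n, n + lag)
        # shifted product P(x) * conj(P(x - u)) over the valid overlap
        prod[lo:hi] = pupil[lo:hi] * np.conj(pupil[lo - lag:hi - lag])
        af[k] = np.fft.fft(prod)
    return af

# Example: AF of a cubic-phase pupil
af = ambiguity_function(np.exp(1j * 2.0 * np.linspace(-1, 1, 64) ** 3))
```

At zero lag the shifted product is |P(x)|^2, so the AF origin equals the pupil's total power, which is the DC value of every defocused OTF slice.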
Iris recognition imaging is attracting considerable interest as a viable alternative for personal identification and verification in many defense and security applications. However, current iris recognition systems suffer from a limited depth of field, which makes them more difficult for untrained users to operate. Traditionally, the depth of field is increased by reducing the imaging system aperture, which adversely impacts the light-capturing power and thus the system signal-to-noise ratio (SNR). In this paper we discuss a computational imaging system, referred to as Wavefront Coded(R) imaging, that increases the depth of field without sacrificing the SNR or the resolution of the imaging system. This system employs a specially designed Wavefront Coded lens customized for iris recognition. We present experimental results that show the benefits of this technology for biometric identification.
Computational imaging systems are modern systems that combine generalized aspheric optics with image processing capability. These systems can be optimized to achieve performance well beyond that of systems consisting solely of traditional optics. Computational imaging technology can be used to advantage in iris recognition applications. A major difficulty in current iris recognition systems is a very shallow depth of field, which limits system usability and increases system complexity. We first review some current iris recognition algorithms and then describe computational imaging approaches to iris recognition using cubic-phase wavefront encoding. These new approaches can greatly increase the depth of field over that possible with traditional optics while maintaining sufficient recognition accuracy. In these approaches the optics, detectors, and image processing all contribute to the iris recognition accuracy and efficiency. We describe different optimization methods for designing the optics and the image processing algorithms, and we provide laboratory and simulation results from applying these systems, including results on restoring the intermediate phase-encoded images using both a direct Wiener filter and an iterative conjugate-gradient method.
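The direct Wiener-filter restoration step mentioned above can be sketched as a standard frequency-domain deconvolution (a generic textbook form, with an assumed scalar noise-to-signal parameter; the paper's filter is jointly optimized with the optics, which this sketch does not attempt):

```python
import numpy as np

def wiener_restore(blurred, psf, nsr=1e-2):
    """Restore a phase-encoded intermediate image with a Wiener filter.
    blurred and psf are same-shape 2-D arrays; nsr is an assumed
    scalar noise-to-signal ratio regularizing the inversion."""
    H = np.fft.fft2(np.fft.ifftshift(psf))           # OTF of the encoding
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)          # Wiener filter
    return np.real(np.fft.ifft2(W * np.fft.fft2(blurred)))
```

As a usage example, blurring a test image with a known PSF and applying `wiener_restore` with a small `nsr` recovers the image up to the spatial frequencies where the OTF magnitude dominates the noise floor; the iterative conjugate-gradient alternative trades this single-pass inversion for better handling of constraints and noise.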
This paper demonstrates a space-integrating optical implementation of a single-layer FIR neural network (FIRNN). A scrolling spatial light modulator represents the spatio-temporal input plane, while the weights are implemented by adaptive grating formation in a photorefractive crystal. Differential heterodyning is used for low-noise bipolar output detection, and an active stabilization technique using a lock-in amplifier and a piezo-electric actuator provides long-term interferometric stability. Simulations and initial experimental results for adaptive sonar broadband beamforming are presented.
We present an adaptation of the BEAMTAP (Broadband and Efficient Adaptive Method for True-time-delay Array Processing) algorithm, previously developed for wideband phased-array radars, to lower-bandwidth applications such as sonar. The system utilizes emerging time- or wavelength-multiplexed optical hydrophone sensors and processes the cohered array of signals in the optical domain, without conversion to the electronic domain or digitization. Modulated signals from an optical hydrophone array are preprocessed and then imaged through a photorefractive crystal, where they interfere with a reference signal and its delayed replicas. Diffraction of the sonar signals off these adaptive weight gratings and detection on a linear time-delay-and-integrate charge-coupled device (TDI CCD) complete the true-time-delay (TTD) beamforming process. Optical signals focused on different regions of the TDI CCD accumulate the delays necessary to synchronize and coherently sum the acoustic signals arriving at various angles on the hydrophone array. In this paper we present an experimental demonstration of TTD processing of low-frequency signals (in the kHz sonar regime) using a TDI CCD tapped delay line. Simulations demonstrating the performance of the overall system are also presented.
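The delay-and-sum operation that the TDI CCD performs in charge can be sketched digitally (an illustrative sketch with integer-sample delays; the function name and signal model are assumptions, not the optical system's exact behavior):

```python
import numpy as np

def delay_and_sum(x, delays):
    """True-time-delay beamforming by delay-and-sum: each element's
    signal is shifted by its steering delay (in samples) and the
    shifted signals are summed, mimicking how charge packets on the
    TDI CCD accumulate contributions as they are clocked along.
    x: (N, T) array of element signals; delays: N integer delays."""
    N, T = x.shape
    out = np.zeros(T)
    for i, d in enumerate(delays):
        out[d:] += x[i, :T - d]   # integer-sample delay of element i
    return out
```

For a wavefront that reaches element i with a delay of i samples, steering delays of N-1-i align all element signals so they sum coherently, which is exactly the synchronization described above.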
We present a time- and space-integrating optical architecture for multi-layer finite impulse response neural networks (FIRNNs). The proposed architecture is capable of forward propagation and of on-line learning in the form of backward propagation. FIRNNs are first presented and analyzed; the analysis shows that implementing FIRNNs requires the calculation of temporal convolutions, which motivates the use of time-integrating and space-integrating optical architectures. A novel device based on a rotating volume hologram is proposed for the space-integrating architecture. Two single-layer architectures, one space integrating and one time integrating, are presented first; these lead to the multi-layer architecture, which folds the two together so that all operations of order O(N^3) are performed optically and only the less computationally intensive operations are performed electronically.
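The temporal convolutions at the heart of a FIRNN layer can be made concrete with a short sketch: each synapse is an FIR filter, so a layer output is a sum of convolutions passed through a nonlinearity (function name, tanh nonlinearity, and array shapes are illustrative assumptions):

```python
import numpy as np

def firnn_forward(x, w):
    """Forward pass of one FIRNN layer.
    x: (n_in, T) input signals over T time steps.
    w: (n_out, n_in, K) FIR taps, one K-tap filter per synapse.
    Each output neuron sums the temporal convolutions of every input
    with its synaptic filter, then applies a pointwise nonlinearity."""
    n_out, n_in, K = w.shape
    T = x.shape[1]
    y = np.zeros((n_out, T))
    for j in range(n_out):
        for i in range(n_in):
            # temporal convolution of input i with synaptic filter (j, i)
            y[j] += np.convolve(x[i], w[j, i], mode="full")[:T]
    return np.tanh(y)
```

The nested loops make the O(n_out * n_in * K * T) cost explicit; these are the O(N^3)-class operations that the architecture above performs optically, leaving only the cheap pointwise nonlinearity to electronics.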
We present an all-optical architecture for a fully adaptive antenna array processor capable of optimally processing the signals from very large arrays in the presence of high-frequency, wideband signals. A modified version of the least-mean-square algorithm is employed using the BEAMTAP (Broadband and Efficient Adaptive Method for True-time-delay Array Processing) architecture. A dynamic photorefractive volume hologram provides the adaptive weights, and two cohered fiber arrays serve as tapped delay lines in the output and feedback paths, allowing signals at bandwidths exceeding 10 GHz to be processed. The optical cohering of the fiber arrays is discussed, and simulations are shown that describe the performance of the proposed architecture in the presence of broadband signals and multiple broadband jammers.
We present an analytical description of a photorefractive phased-array beamforming system using the BEAMTAP (Broadband and Efficient Adaptive Method for True-Time-Delay Array Processing) algorithm for a large N-element array; the system requires only 2 tapped delay lines (TDLs) instead of the conventional N TDLs. Simulation results indicate that the processor is able to adapt to a broadband signal of interest at a specific angle of arrival. We show that the system produces a coherent sum of the desired signals from the phased array, with the corresponding time delays compensated for adaptively and without prior knowledge of the angle of arrival.
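The adaptive behavior described above can be illustrated with the conventional structure that BEAMTAP improves upon: an LMS beamformer with one K-tap tapped delay line per element (this is the N-TDL baseline, not BEAMTAP's 2-TDL photorefractive realization; the function name, step size, and signal model are illustrative assumptions):

```python
import numpy as np

def lms_tdl_beamformer(x, d, K=8, mu=0.01):
    """Broadband LMS beamformer with a K-tap TDL per element.
    x: (N, T) element signals; d: (T,) desired/reference signal.
    The weights adapt so the delayed-and-weighted element signals
    coherently sum to the reference, without knowing the
    angle of arrival in advance."""
    N, T = x.shape
    w = np.zeros((N, K))
    y = np.zeros(T)
    for t in range(K, T):
        snap = x[:, t - K + 1:t + 1][:, ::-1]  # current TDL contents
        y[t] = np.sum(w * snap)                # beamformer output
        e = d[t] - y[t]                        # error vs. reference
        w += mu * e * snap                     # LMS weight update
    return w, y
```

Feeding the beamformer element signals that are delayed copies of a signal of interest, with the reference set to that signal, drives the error toward zero: the weights learn the inter-element time delays adaptively, which is the essential behavior the photorefractive system realizes with only two TDLs.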