We present a deep learning approach for restoring images degraded by atmospheric optical turbulence. We consider the case of terrestrial imaging over long ranges with a wide field-of-view. This produces an anisoplanatic imaging scenario where turbulence warping and blurring vary spatially across the image. The proposed turbulence mitigation (TM) method assumes that a sequence of short-exposure images is acquired. A block matching (BM) registration algorithm is applied to the observed frames for dewarping, and the resulting images are averaged. A convolutional neural network (CNN) is then employed to perform spatially adaptive restoration. We refer to the proposed TM algorithm as the block matching and CNN (BM-CNN) method. Training the CNN is accomplished using simulated data from a fast turbulence simulation tool capable of rapidly producing a large amount of degraded imagery from declared truth images. Testing is done using independent data simulated with a different well-validated numerical wave-propagation simulator. Our proposed BM-CNN TM method is evaluated in a number of experiments using quantitative metrics. The quantitative analysis is made possible by virtue of having truth imagery from the simulations. A number of restored images are provided for subjective evaluation. We demonstrate that the BM-CNN TM method outperforms the benchmark methods in the scenarios tested.
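The dewarp-and-average front end described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function names (`block_match_shift`, `dewarp_and_average`), the SSD matching criterion, the integer-only search window, and the use of the temporal mean as the registration prototype are all simplifying assumptions for this sketch, and the CNN restoration stage is omitted.

```python
import numpy as np

def block_match_shift(ref_blk, frame, r0, c0, search=2):
    """Find the integer shift (within +/- search pixels) that best aligns
    the frame patch at (r0, c0) to the reference block (minimum SSD)."""
    bh, bw = ref_blk.shape
    best, best_shift = np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            if r < 0 or c < 0 or r + bh > frame.shape[0] or c + bw > frame.shape[1]:
                continue  # candidate window falls outside the frame
            ssd = np.sum((frame[r:r + bh, c:c + bw] - ref_blk) ** 2)
            if ssd < best:
                best, best_shift = ssd, (dr, dc)
    return best_shift

def dewarp_and_average(frames, block=16, search=2):
    """Register each frame to the temporal mean block-by-block, then average."""
    ref = frames.mean(axis=0)  # prototype approximating the true scene geometry
    H, W = ref.shape
    out = np.zeros_like(ref)
    for f in frames:
        reg = np.empty_like(ref)
        for r0 in range(0, H, block):
            for c0 in range(0, W, block):
                blk = ref[r0:r0 + block, c0:c0 + block]
                dr, dc = block_match_shift(blk, f, r0, c0, search)
                reg[r0:r0 + block, c0:c0 + block] = \
                    f[r0 + dr:r0 + dr + blk.shape[0], c0 + dc:c0 + dc + blk.shape[1]]
        out += reg
    return out / len(frames)
```

In the full BM-CNN pipeline, the averaged output would then be passed to the trained CNN for spatially adaptive deblurring.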
There are many approaches to incoherent imaging through the atmosphere that involve joint estimation of multiple turbulence-induced wavefront-aberration realizations and an object that is common across realizations. These approaches, all of which use short-exposure or “speckle” data, include Multi-Frame Blind Deconvolution (MFBD), Phase-Diverse Speckle (PDS), and Wavelength-Diverse Speckle (WDS). We enumerate fundamental estimation ambiguities that arise within each of these modalities and identify strategies to eliminate some of the ambiguities.
An extension of the fusion of interpolated frames superresolution (FIF SR) method to perform SR in the presence of atmospheric optical turbulence is presented. The goal of such processing is to improve the performance of imaging systems impacted by turbulence. We provide an optical transfer function analysis that illustrates regimes where significant degradation from both aliasing and turbulence may be present in imaging systems. This analysis demonstrates the potential need for simultaneous SR and turbulence mitigation (TM). While the FIF SR method was not originally proposed to address this joint restoration problem, we believe it is well suited for this task. We propose a variation of the FIF SR method that has a fusion parameter that allows it to transition from traditional diffraction-limited SR to pure TM with no SR as well as a continuum in between. This fusion parameter balances subpixel resolution, needed for SR, with the amount of temporal averaging, needed for TM and noise reduction. In addition, we develop a model of the interpolation blurring that results from the fusion process, as a function of this tuning parameter. The blurring model is then incorporated into the overall degradation model that is addressed in the restoration step of the FIF SR method. This innovation benefits the FIF SR method in all applications. We present a number of experimental results to demonstrate the efficacy of the FIF SR method in different levels of turbulence. Simulated imagery with known ground truth is used for a detailed quantitative analysis. Three real infrared image sequences are also used. Two of these include bar targets that allow for a quantitative resolution enhancement assessment.
The design of imaging systems involves navigating a complex trade space. As a result, many imaging systems employ focal plane arrays with a detector pitch that is insufficient to meet the Nyquist sampling criterion under diffraction-limited imaging conditions. This undersampling may result in aliasing artifacts and prevent the imaging system from achieving the full resolution afforded by the optics. Another potential source of image degradation, especially for long-range imaging, is atmospheric optical turbulence. Optical turbulence gives rise to spatially and temporally varying image blur and warping from fluctuations in the index of refraction along the optical path. Under heavy turbulence, the blurring from the turbulence acts as an anti-aliasing filter, and undersampling does not generally occur. However, under light to moderate turbulence, many imaging systems will exhibit both aliasing artifacts and turbulence degradation. Few papers in the literature have analyzed or addressed both of these degradations together. In this paper, we provide a novel analysis of undersampling in the presence of optical turbulence. Specifically, we provide an optical transfer function analysis that illustrates regimes where aliasing and turbulence are both present, and where they are not. We also propose and evaluate a super-resolution (SR) method for combating aliasing that offers robustness to optical turbulence. The method has a tuning parameter that allows it to transition from traditional diffraction-limited SR to pure turbulence mitigation with no SR. The proposed method is based on Fusion of Interpolated Frames (FIF) SR, recently proposed by two of the current authors. We quantitatively evaluate the SR method with varying levels of optical turbulence using simulated sequences. We also present results using real infrared imagery.
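The fusion-parameter idea behind the FIF-based methods above can be illustrated with a 1-D toy sketch. This is not the published algorithm: `fif_fuse_1d` is a hypothetical name, the motion is assumed known, and a Gaussian weighting bandwidth `sigma` stands in for the paper's tuning parameter, trading reliance on the nearest subpixel samples (SR behavior) against broad averaging over many samples (TM and noise-reduction behavior).

```python
import numpy as np

def fif_fuse_1d(frames, shifts, scale, sigma):
    """Fuse low-res 1-D frames (shifted samplings of one static signal)
    onto a high-res grid. Each observed sample lands at HR coordinate
    n*scale + shift; every HR pixel is a distance-weighted mean of all
    samples, with Gaussian bandwidth sigma (in HR pixels).
    Small sigma -> favor nearest samples (super-resolution regime);
    large sigma -> heavy averaging (turbulence-mitigation regime)."""
    n_lo = frames.shape[1]
    hr_len = n_lo * scale
    hr_grid = np.arange(hr_len, dtype=float)
    num = np.zeros(hr_len)
    den = np.zeros(hr_len)
    for frame, s in zip(frames, shifts):
        pos = np.arange(n_lo) * scale + s      # HR positions of this frame's samples
        d = hr_grid[:, None] - pos[None, :]    # HR pixel-to-sample distances
        w = np.exp(-0.5 * (d / sigma) ** 2)    # Gaussian fusion weights
        num += w @ frame
        den += w.sum(axis=1)
    return num / den
```

The paper additionally models the interpolation blurring induced by this weighted fusion as a function of the tuning parameter and folds it into the degradation model used in the restoration step; that stage is not shown here.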
We present a block-matching and Wiener filtering approach to atmospheric turbulence mitigation for long-range imaging of extended scenes. We evaluate the proposed method, along with some benchmark methods, using simulated and real-image sequences. The simulated data are generated with a simulation tool developed by one of the authors. These data provide objective truth and allow for quantitative error analysis. The proposed turbulence mitigation method takes a sequence of short-exposure frames of a static scene and outputs a single restored image. A block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged, and the average image is processed with a Wiener filter to provide deconvolution. An important aspect of the proposed method lies in how we model the degradation point spread function (PSF) for the purposes of Wiener filtering. We use a parametric model that takes into account the level of geometric correction achieved during image registration. This is unlike any method we are aware of in the literature. By matching the PSF to the level of registration in this way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. We also describe a method for estimating the atmospheric coherence diameter (or Fried parameter) from the estimated motion vectors. We provide a detailed performance analysis that illustrates how the key tuning parameters impact system performance. The proposed method is relatively simple computationally, yet it has excellent performance in comparison with state-of-the-art benchmark methods in our study.
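The average-then-Wiener-filter back end described above can be sketched in a few lines. This is only an illustration under stated assumptions: an isotropic Gaussian kernel stands in for the paper's parametric short-exposure PSF (whose width, in the actual method, is tied to the Fried parameter and the residual tilt left after registration), and the noise-to-signal ratio `nsr` is assumed constant across frequency. The function names are hypothetical.

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Isotropic Gaussian stand-in for the parametric degradation PSF,
    centered at pixel [0, 0] with circular (wrap-around) distances."""
    r = np.minimum(np.arange(shape[0]), shape[0] - np.arange(shape[0]))
    c = np.minimum(np.arange(shape[1]), shape[1] - np.arange(shape[1]))
    g = np.exp(-0.5 * (r[:, None] ** 2 + c[None, :] ** 2) / sigma ** 2)
    return g / g.sum()

def wiener_restore(avg_img, psf, nsr=1e-3):
    """Deconvolve the registered-and-averaged image with a Wiener filter.
    psf is the modeled kernel (same shape as the image); nsr is the
    assumed noise-to-signal power ratio."""
    H = np.fft.fft2(psf)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(avg_img) * W))
```

The key point of the paper is matching the PSF model to the level of geometric correction achieved during registration, so that the Wiener filter exploits the reduced blur of the registered average; here that matching is abstracted into the choice of `sigma`.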
Differential tilt variance is a useful metric for interpreting the distorting effects of turbulence in incoherent imaging systems. In this paper, we compare the theoretical model of differential tilt variance to simulations. The simulation is based on a Monte Carlo wave-optics approach with split-step propagation. Results show that the simulation closely matches theory. The results also show that care must be taken when selecting a method to estimate tilts.
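The sensitivity to tilt-estimation method noted above can be illustrated with two common estimators applied to a phase screen over the aperture: a least-squares plane fit (Zernike-style tilt) and the mean of the phase gradient (gradient tilt). The sketch below is illustrative only; the function names are hypothetical, and a square aperture is assumed for simplicity.

```python
import numpy as np

def zernike_tilt(phase):
    """Least-squares plane fit to a phase screen over a square aperture;
    returns (tilt_x, tilt_y) in radians of phase per sample."""
    n = phase.shape[0]
    y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(n * n)])
    coef, *_ = np.linalg.lstsq(A, phase.ravel(), rcond=None)
    return coef[0], coef[1]

def gradient_tilt(phase):
    """Mean finite-difference phase gradient. For a pure plane the two
    estimators agree exactly; on turbulent screens they generally differ,
    which is one reason estimator choice matters when comparing
    simulated tilt statistics to theory."""
    gy, gx = np.gradient(phase)  # np.gradient returns (d/d_axis0, d/d_axis1)
    return gx.mean(), gy.mean()
```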
The Civil Air Patrol (CAP) is procuring Airborne Real-time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) systems to increase their search-and-rescue mission capability. These systems are being installed on a fleet of Gippsland GA-8 aircraft, and will position CAP to gain real-world mission experience with the application of hyperspectral sensor and processing technology to search and rescue. The ARCHER system design, data processing, and operational concept leverage several years of investment in hyperspectral technology research and airborne system demonstration programs by the Naval Research Laboratory (NRL) and Air Force Research Laboratory (AFRL). Each ARCHER system consists of a NovaSol-designed, pushbroom, visible/near-infrared (VNIR) hyperspectral imaging (HSI) sensor, a co-boresighted visible panchromatic high-resolution imaging (HRI) sensor, and a CMIGITS-III GPS/INS unit in an integrated sensor assembly mounted inside the GA-8 cabin. ARCHER incorporates an on-board data processing system developed by Space Computer Corporation (SCC) to perform numerous real-time processing functions including data acquisition and recording, raw data correction, target detection, cueing and chipping, precision image geo-registration, and display and dissemination of image products and target cue information. A ground processing station is provided for post-flight data playback and analysis. This paper describes the requirements and architecture of the ARCHER system, including design, components, software, interfaces, and displays. Key sensor performance characteristics and real-time data processing features are discussed in detail. The use of the system for detecting and geo-locating ground targets in real-time is demonstrated using test data collected in Southern California in the fall of 2004.
IR sensing has been a key enabling technology in military systems, providing advantages in night vision, surveillance, and ever more accurate targeting. Passive hyperspectral imaging, the ability to gather and process IR spectral information from each pixel of an IR image, can ultimately provide 2D composition maps of a scene under study. It finds applications in areas such as atmospheric and geophysical remote sensing, camouflaged target recognition, and defence against chemical weapons.
Correlation performance of a computer-simulated binary joint transform correlator is investigated when the input image has been degraded by blurring. Both constant and adaptive thresholding are considered for comparison. In general, adaptive thresholding performs better in handling blurred input images, both in the absence and in the presence of noise.
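The binary joint transform correlator compared above can be sketched as a simple simulation. This is illustrative only, not the study's implementation: a global median stands in for the constant threshold, a per-tile median for the adaptive threshold, and the reference and scene are placed side by side in a single zero-padded input plane.

```python
import numpy as np

def binary_jtc(reference, scene, adaptive=False, block=8):
    """Binary joint transform correlator (illustrative). The reference and
    scene share one input plane; the joint power spectrum (JPS) is binarized
    to +/-1 and transformed again, yielding cross-correlation peaks whose
    offset encodes the displacement between the two images."""
    h, w = reference.shape
    plane = np.zeros((2 * h, 2 * w))
    plane[:h, :w] = reference              # reference half of the input plane
    plane[:h, w:] = scene                  # scene half of the input plane
    jps = np.abs(np.fft.fft2(plane)) ** 2  # joint power spectrum
    if adaptive:                           # per-tile median threshold
        thr = np.empty_like(jps)
        for r in range(0, jps.shape[0], block):
            for c in range(0, jps.shape[1], block):
                thr[r:r + block, c:c + block] = np.median(jps[r:r + block, c:c + block])
    else:                                  # single global median threshold
        thr = np.median(jps)
    binary = np.where(jps >= thr, 1.0, -1.0)
    return np.abs(np.fft.fft2(binary))     # correlation plane
```

With a blurred scene, the JPS fringes lose contrast; an adaptive (local) threshold can preserve fringe structure that a single global threshold would wash out, which is the effect the comparison examines.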