<p>Many optical systems are used for specific tasks such as classification. Of these systems, the majority are designed to maximize image quality for human observers. However, machine learning classification algorithms do not require the same data representation used by humans. We investigate compressive optical systems optimized for a specific machine sensing task. Two compressive optical architectures are examined: an array of prisms and neutral density filters, where each prism and neutral density filter pair realizes one datum from an optimized compressive sensing matrix, and another architecture using conventional optics to image the aperture onto the detector, a prism array to divide the aperture, and a pixelated attenuation mask in the intermediate image plane. We discuss the design, simulation, and trade-offs of these systems built for compressed classification of the Modified National Institute of Standards and Technology dataset. Both architectures achieve classification accuracies within 3% of the optimized sensing matrix for compression ranging from 98.85% to 99.87%. In the presence of noise, the performance of the systems with 98.85% compression fell between that of an <italic>F</italic> / 2 and an <italic>F</italic> / 4 imaging system.</p>
We investigate the feasibility of additively manufacturing optical components to accomplish task-specific classification in a computational imaging device. We report on the design, fabrication, and characterization of a non-traditional optical element that physically realizes an extremely compressed, optimized sensing matrix. The compression is achieved by designing an optical element that samples only the regions of object space most relevant to the classification task, as determined by machine learning algorithms. The design process for the proposed optical element converts the optimal sensing matrix to a refractive surface composed of a minimal set of unique, non-repeating prisms. The optical elements are 3D printed using a Nanoscribe, which uses two-photon polymerization for high-precision printing. We describe the design of several computational imaging prototype elements and characterize the as-fabricated components, including their surface topography, surface roughness, and prism facet angles.
The detection, location, and identification of suspected underground nuclear explosions (UNEs) are global security priorities that rely on integrated analysis of multiple data modalities for uncertainty reduction in event analysis. Vegetation disturbances may provide complementary signatures that can confirm or build on the observables produced by prompt sensing techniques such as seismic or radionuclide monitoring networks. For instance, the emergence of non-native species in an area may be indicative of anthropogenic activity, or changes in vegetation health may reflect changes in site conditions resulting from an underground explosion. Previously, we collected high-spatial-resolution (10 cm) hyperspectral data from an unmanned aerial system at a legacy underground nuclear explosion test site and its surroundings. These data consist of visible and near-infrared wavebands over 4.3 km<sup>2</sup> of high desert terrain, along with high-spatial-resolution (2.5 cm) RGB context imagery. In this work, we employ various spectral detection and classification algorithms to identify and map vegetation species in an area of interest containing the legacy test site. We employ a frequentist framework for fusing multiple spectral detections across various reference spectra captured at different times and sampled from multiple locations. The spatial distribution of vegetation species is compared to the location of the underground nuclear explosion. We find a difference in species abundance within a 130 m radius of the center of the test site.
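As an illustration of spectral detection against reference spectra, the following is a minimal Python sketch of one common matcher, the spectral angle mapper (SAM). The abstract does not name the specific algorithms used, so the function, the toy spectra, and the detection threshold here are all hypothetical.

```python
import numpy as np

def spectral_angle(pixels, reference):
    """Spectral Angle Mapper (SAM): angle between each pixel spectrum
    and a reference spectrum; smaller angles mean closer matches."""
    pixels = np.atleast_2d(pixels)
    cos = (pixels @ reference) / (
        np.linalg.norm(pixels, axis=1) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Toy 4-band spectra: a reference species signature and two test pixels.
ref = np.array([0.1, 0.3, 0.5, 0.2])
pixels = np.array([
    [0.2, 0.6, 1.0, 0.4],   # same spectral shape, brighter -> angle ~ 0
    [0.5, 0.2, 0.1, 0.4],   # different shape -> large angle
])
angles = spectral_angle(pixels, ref)
detections = angles < 0.1   # radians; threshold chosen per species
```

Because SAM compares spectral shape rather than magnitude, it is insensitive to overall brightness changes, which is one reason shape-based matchers are popular for mapping vegetation under varying illumination.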
Fog is a commonly occurring degraded visual environment which disrupts air traffic, ground traffic, and security imaging systems. For many applications of interest, sufficient spatial resolution is required to identify elements of the scene. However, studying the effects of fog on resolution degradation is difficult because the composition of naturally occurring fogs is variable, and data collection is reliant on changing weather conditions. For our study, we used the Sandia National Laboratories fog facility to generate repeatable, characterized fog conditions. Sandia's well-characterized fog generation allowed us to relate the resolution degradation of active and passive long-wave infrared (LWIR) imagers to the properties of fog. Additionally, the fogs we generated were denser than naturally occurring fogs, which allowed long-range imaging conditions to be emulated over the shorter optical path lengths obtainable in a laboratory environment.
In this presentation, we experimentally investigate the resolution degradation of LWIR wavelengths in realistic fog droplet sizes. Transmission of LWIR wavelengths has been studied extensively in the literature. To date, however, there are few experimental results quantifying the resolution degradation for LWIR imagery in fog. We present experimental results on resolution degradation for both passive and active LWIR systems. The degradation of passive imaging was measured using a 37 °C blackbody with slant-edge resolution targets. The active imaging resolution degradation was measured using a polarized CO2 laser reflecting off a set of bar targets. We found that the relationship between meteorological optical range and resolution degradation was more complicated than described purely by attenuation.
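The attenuation-only baseline referred to in the last sentence can be written down directly: the Koschmieder relation converts meteorological optical range (MOR, defined at 550 nm with a 5% contrast threshold) into an extinction coefficient, and Beer-Lambert then gives the path transmission. A minimal sketch follows; the finding above is precisely that this model alone does not capture the observed resolution loss, and LWIR extinction generally differs from the visible-band extinction that defines MOR.

```python
import numpy as np

def transmission(path_length_m, mor_m, contrast_threshold=0.05):
    """Beer-Lambert transmission over a path, with the extinction
    coefficient sigma inferred from meteorological optical range (MOR)
    via the Koschmieder relation: MOR = -ln(threshold) / sigma."""
    sigma = -np.log(contrast_threshold) / mor_m
    return np.exp(-sigma * path_length_m)

# By construction, transmission over a path of one MOR equals the
# contrast threshold (5%), and it decays exponentially beyond that.
```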
Heavy fogs and other highly scattering environments pose a challenge for many commercial and national security sensing systems. Current autonomous systems rely on a range of optical sensors for guidance and remote sensing that can be degraded by highly scattering environments. In our previous and ongoing simulation work, we have shown that polarized light can increase signal or range through a scattering environment such as fog. Specifically, we have shown that circularly polarized light maintains its polarization signal through a larger number of scattering events, and thus over a longer range, than linearly polarized light. In this work we present design and testing results of active polarization imagers at short-wave infrared and visible wavelengths. We explore multiple polarimetric configurations for the imager, focusing on linear and circular polarization states. Testing of the imager was performed in the Sandia Fog Facility, a 180 ft by 10 ft chamber that can create fog-like conditions for optical testing. This facility offers a repeatable fog scattering environment ideally suited to testing the imager's performance in fog conditions. We show that circularly polarized imagers can penetrate fog better than linearly polarized imagers.
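For reference, the linear and circular polarization content that these imager configurations compare are standard functions of the Stokes vector. A minimal sketch (the vectors shown are illustrative textbook states, not measured data from this work):

```python
import numpy as np

def dolp(s):
    """Degree of linear polarization from a Stokes vector [S0, S1, S2, S3]."""
    return np.hypot(s[1], s[2]) / s[0]

def docp(s):
    """Degree of circular polarization from a Stokes vector."""
    return abs(s[3]) / s[0]

horizontal = np.array([1.0, 1.0, 0.0, 0.0])   # fully linearly polarized
right_circ = np.array([1.0, 0.0, 0.0, 1.0])   # fully circularly polarized
```

Tracking how these degrees of polarization decay with the number of scattering events is the quantity of interest when comparing circular and linear states in fog.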
This paper attempts to quantify thermal infrared (both longwave and midwave), shortwave infrared, and visible-light sensor performance under different test-chamber fogs. We find that the performance of LWIR imaging is impacted significantly less by light-to-moderate fog than the other two IR sensors and the visible imager. The paper recommends additional fog chamber tests that will be useful for the development of imaging simulation capability that accurately models fog across these wavebands.
Many optical systems are used for specific tasks such as classification. Of these systems, the majority are designed to maximize image quality for human observers; however, machine learning classification algorithms do not require the same data representation used by humans. In this work we investigate compressive optical systems optimized for a specific machine sensing task. Two compressive optical architectures are examined: an array of prisms and neutral density filters, where each prism and neutral density filter pair realizes one datum from an optimized compressive sensing matrix, and another architecture using conventional optics to image the aperture onto the detector, a prism array to divide the aperture, and a pixelated attenuation mask in the intermediate image plane. We discuss the design, simulation, and tradeoffs of these compressive imaging systems built for compressed classification of the MNIST dataset. To evaluate the tradeoffs of the two architectures, we present radiometric and raytrace models for each system. Additionally, we investigate the impact of system aberrations on classification accuracy. We compare the performance of these systems over a range of compression. Classification performance, radiometric throughput, and optical design manufacturability are discussed.
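As a toy illustration of compressed classification, the sketch below projects synthetic two-class "images" (a stand-in for MNIST) through a sensing matrix and trains a linear classifier on the measurements alone. Keeping 9 of 784 values corresponds to the 98.85% compression quoted above; the matrix here is random, whereas the systems described optimize it for the task and realize it optically, so accuracy figures from this sketch are not comparable to the reported results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for MNIST: 28x28 "images" drawn from two synthetic classes.
n, d, m = 200, 28 * 28, 9   # 9 measurements of 784 pixels -> 98.85% compression
x = np.vstack([rng.normal(0.0, 1.0, (n // 2, d)),    # class 0
               rng.normal(0.5, 1.0, (n // 2, d))])   # class 1 (shifted mean)
y = np.repeat([0, 1], n // 2)

# A sensing matrix Phi maps each image to m measurements: z = Phi @ x.
# Phi is random here; optically, each row would be realized by one
# prism / neutral-density-filter pair.
phi = rng.normal(size=(m, d)) / np.sqrt(d)
z = x @ phi.T

# Least-squares linear classifier acting only on the compressed measurements.
design = np.hstack([z, np.ones((n, 1))])             # append a bias column
w, *_ = np.linalg.lstsq(design, 2.0 * y - 1.0, rcond=None)
accuracy = np.mean((design @ w > 0) == y)
```

The key point the sketch captures is that the classifier never sees the full image, only the m-dimensional measurement vector, so the optical system itself performs the dimensionality reduction.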
The scattering of light in fog is a complex problem that affects imaging in many ways. Typically, imaging device performance in fog is attributed solely to reduced visibility, measured as light extinction from scattering events. We present a quantitative analysis of resolution degradation in the long-wave infrared regime. Our analysis is based on the calculation of the modulation transfer function from the edge response of a slant-edge blackbody target in known fog conditions. We show that higher spatial frequencies are attenuated more than lower spatial frequencies with increasing fog thickness. These results demonstrate that image blurring, in addition to extinction, contributes to degraded performance of imaging devices in fog environments.
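The underlying calculation is the standard slant-edge recipe: differentiate the edge spread function (ESF) to obtain the line spread function (LSF), Fourier transform, and normalize at zero frequency. A minimal sketch, assuming a 1-D ESF has already been extracted from the slant-edge image (the Gaussian-blurred edge below is synthetic, not measured data):

```python
import numpy as np
from math import erf

def mtf_from_edge(esf, dx=1.0):
    """Estimate the MTF from a 1-D edge spread function (ESF)."""
    lsf = np.gradient(esf, dx)            # LSF = d(ESF)/dx
    lsf = lsf * np.hanning(lsf.size)      # window to suppress noise at the tails
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                         # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(esf.size, d=dx)   # cycles per unit length
    return freqs, mtf

# Synthetic example: an error-function edge, i.e. a step blurred by a
# Gaussian PSF with sigma = 0.5 (units arbitrary).
x = np.linspace(-5, 5, 512)
esf = np.array([0.5 * (1.0 + erf(xi / (np.sqrt(2) * 0.5))) for xi in x])
freqs, mtf = mtf_from_edge(esf, dx=x[1] - x[0])
# Higher spatial frequencies are attenuated more: mtf decreases with freqs.
```

Comparing MTF curves measured at increasing fog thickness is what separates blur (a frequency-dependent rolloff) from pure extinction (a frequency-independent loss).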
Compact snapshot imaging polarimeters have been demonstrated in the literature to provide Stokes parameter estimates for spatially varying scenes using polarization gratings. However, the demonstrated system does not employ aggressive modulation frequencies that take full advantage of the bandwidth available to the focal plane array. Here, a snapshot imaging Stokes polarimeter is described and its performance demonstrated through simulation. The simulation studies the challenges of using a maximum-bandwidth configuration for a snapshot polarization-grating-based polarimeter, such as the fringe-contrast attenuation that results from higher modulation frequencies. Similar simulation results are generated and compared for a microgrid polarimeter. Microgrid polarimeters, another type of spatially modulated polarimeter, superimpose pixelated polarizers onto a focal plane array; the most common design uses a 2×2 superpixel of polarizers, which maximally uses the available bandwidth of the focal plane array.
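For the microgrid case, demodulating the 2×2 superpixel is straightforward to write down. A minimal sketch, assuming the common 0°/45°/90°/135° polarizer layout (the layout and the uniform synthetic frame are illustrative):

```python
import numpy as np

def demodulate_microgrid(frame):
    """Recover Stokes images S0, S1, S2 from a microgrid polarimeter frame.

    Assumes the common 2x2 superpixel layout of linear polarizers:
        [[0deg,  45deg],
         [90deg, 135deg]]
    Each superpixel yields one Stokes estimate, so spatial resolution is
    halved in each dimension -- the maximum-bandwidth trade-off.
    """
    i0   = frame[0::2, 0::2]
    i45  = frame[0::2, 1::2]
    i90  = frame[1::2, 0::2]
    i135 = frame[1::2, 1::2]
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

# Synthetic uniform scene: horizontally polarized light, S = [1, 1, 0].
# A linear polarizer at angle t passes 0.5 * (S0 + S1*cos 2t + S2*sin 2t).
frame = np.zeros((4, 4))
frame[0::2, 0::2] = 1.0   # 0 deg pixels
frame[0::2, 1::2] = 0.5   # 45 deg pixels
frame[1::2, 0::2] = 0.0   # 90 deg pixels
frame[1::2, 1::2] = 0.5   # 135 deg pixels
s0, s1, s2 = demodulate_microgrid(frame)
```

In frequency-domain terms, this superpixel sampling places the polarization information at the Nyquist frequency of the array, which is what "maximally uses the available bandwidth" means for this design.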
Ground-based, low-cost, uncooled infrared imagers are specially calibrated and deployed for long-term measurements of spatial and temporal cloud statistics. Measurements of cloud optical depth are shown for thin clouds and validated with a dual-polarization cloud lidar. Good agreement is achieved for thin clouds having a 550 nm optical depth of 3 or less.