Autonomous vehicle sensor suites must perform in a variety of weather conditions to achieve acceptable levels of safety and reliability. Fog is one of the most challenging driving conditions. This paper presents qualitative performance data for thermal infrared (both longwave and midwave), shortwave infrared, and visible-light imaging sensors under different test-chamber fogs. We find that the performance of LWIR imaging is impacted significantly less by light-to-moderate fog than the other two IR sensors, the visible imager, and a low-resolution Velodyne LiDAR. The paper recommends additional fog-chamber testing to generate data that will be useful for the development of imaging simulation capability that accurately models fog across these wavebands, improving reliability and coverage in the development of ADAS and autonomous vehicle (AV) vision systems.
The SWIR waveband between 0.8 μm and 1.8 μm is increasingly being exploited by imaging systems in a variety of different applications, including persistent imaging for security and surveillance of high-value assets, handheld tactical imagers, range-gated imaging systems, and imaging LADAR for driverless vehicles. The vast majority of these applications utilize lattice-matched InGaAs detectors in their imaging sensors, and these sensors are rapidly falling in price, leading to their widening adoption. As these sensors are used in novel applications and locations, it is important that ambient SWIR backgrounds be understood and characterized for a variety of different field conditions, primarily for the purposes of system performance modeling of SNR and range metrics. SWIR irradiance backgrounds do not consistently track visible-light illumination. There is currently little of this type of information in the open literature, particularly measurements of SWIR backgrounds in urban areas, natural areas, or indoors. This paper presents field measurements made with an InGaAs detector calibrated in the swux unit of InGaAs-band-specific irradiance proposed by two of the authors in 2017. Simultaneous measurements of illuminance levels (in lux) at these sites are presented, as well as visible and InGaAs camera images of the scenery at some of these measurement sites. The swux and lux measurement hardware is described, along with the methods used to calibrate it. Finally, the swux levels during the partial and total phases of the total solar eclipse of 2017 are presented, along with curves fitted to the data from a theoretical model based on obscuration of the sun by the moon. The apparent differences between photometric and swux measurements are discussed.
We propose a new radiometric unit of measure we call the ‘swux’ to unambiguously characterize scene illumination in the SWIR spectral band between 0.8 μm and 1.8 μm, where most of the ever-increasing number of deployed SWIR cameras (based on standard InGaAs focal plane arrays) are sensitive. Both military and surveillance applications in the SWIR currently suffer from the lack of a standardized SWIR radiometric unit of measure that can be used to definitively compare or predict SWIR camera performance with respect to SNR and range metrics. We propose a unit comparable to the photometric illuminance unit, the lux (see Ref.). The lack of a SWIR radiometric unit becomes even more critical if one uses lux levels to describe SWIR sensor performance in twilight or low-light conditions, since in clear, no-moon conditions in rural areas, the naturally-occurring SWIR radiation from nightglow produces a much higher irradiance than visible starlight. Thus, even well-intentioned efforts to characterize a test site’s ambient illumination levels in the SWIR band may fail if they rely on photometric instruments that only measure visible light. A study by one of the authors (see Ref.) showed that the correspondence between lux values and total SWIR irradiance in typical illumination conditions can vary by more than two orders of magnitude, depending on the spectrum of the ambient background. In analogy to the photometric lux definition, we propose the SWIR irradiance-equivalent ‘swux’ level, derived by integration over the scene SWIR spectral irradiance weighted by a spectral sensitivity function S(λ), a SWIR analog of the V(λ) photopic response function.
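The weighted-integral definition described in this abstract can be sketched numerically. This is a minimal illustration only: the sensitivity curve `s_sensitivity` and the normalization constant `K_SWUX` below are placeholder assumptions, not the published swux definition.

```python
import numpy as np

K_SWUX = 1.0e3  # placeholder normalization constant (assumed, not the published value)

def s_sensitivity(wl_um):
    """Assumed SWIR sensitivity function S(lambda): unity across the
    0.8-1.8 um InGaAs band with soft linear band edges (illustrative)."""
    return np.clip(np.minimum((wl_um - 0.75) / 0.1,
                              (1.85 - wl_um) / 0.1), 0.0, 1.0)

def swux(wl_um, spectral_irradiance_w_m2_um):
    """Integrate spectral irradiance weighted by S(lambda), in analogy
    to the lux integral over the V(lambda) photopic response.
    Assumes a uniformly spaced wavelength grid in microns."""
    dx = wl_um[1] - wl_um[0]
    return K_SWUX * np.sum(spectral_irradiance_w_m2_um * s_sensitivity(wl_um)) * dx

# Example: a flat 1 mW m^-2 um^-1 background spanning 0.5-2.0 um
wl = np.linspace(0.5, 2.0, 1501)
e_spec = np.full_like(wl, 1.0e-3)
print(f"swux-weighted irradiance level: {swux(wl, e_spec):.3f}")
```

Because the weighting falls to zero outside the InGaAs band, two scenes with identical lux levels but different SWIR spectra can produce very different values here, which is the motivation for the unit.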
Type II Strained-Layer Superlattice detectors are currently being incorporated into hybridized infrared camera focal plane arrays for commercial applications. The detectors offer significant advantages over InSb and MCT detectors in certain application spaces, particularly high-speed imaging for industrial purposes and for military test ranges. The advantage over MWIR InSb sensors is driven by blackbody physics, which yields much higher emitted photon radiance values in the LWIR for target temperatures around ambient, as well as increased temperature dynamic range as a result of the lower fractional thermal derivative of radiance in the LWIR band relative to the MWIR band.
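The blackbody argument can be checked numerically with Planck's law for photon radiance. The 3-5 μm and 8-12 μm band edges below are nominal assumptions for the MWIR and LWIR bands, used purely to illustrate the two effects named above:

```python
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def photon_radiance(band_um, temp_k, n=2000):
    """In-band blackbody photon radiance (photons s^-1 m^-2 sr^-1),
    integrated numerically over the band with a simple Riemann sum."""
    wl = np.linspace(band_um[0] * 1e-6, band_um[1] * 1e-6, n)
    spectral = (2.0 * C / wl**4) / np.expm1(H * C / (wl * KB * temp_k))
    return np.sum(spectral) * (wl[1] - wl[0])

mwir, lwir = (3.0, 5.0), (8.0, 12.0)
for band, name in ((mwir, "MWIR"), (lwir, "LWIR")):
    l300, l310 = photon_radiance(band, 300.0), photon_radiance(band, 310.0)
    print(f"{name}: L(300 K) = {l300:.3e}, fractional change 300->310 K: "
          f"{l310 / l300 - 1.0:+.1%}")
```

At 300 K the LWIR in-band photon radiance comes out far higher than the MWIR value, while the fractional change per degree is smaller in the LWIR, consistent with the wider temperature dynamic range claimed above.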
Austin Richards takes readers on a visual tour of the electromagnetic spectrum beyond the range of human sight, using imaging technology as the means to "see" invisible light. Dozens of colorful images and clear, concise descriptions make this an intriguing, accessible technical book. Richards explains the light spectrum, including visible light, and describes the advanced imaging technologies that enable humans to synthesize our own version of "alien" vision at different wavelengths, with applications ranging from fire fighting and law enforcement to botany and medicine.
The second edition expands existing content, explores recent areas of research, and provides new illustrations that demonstrate the variability of vision throughout the spectrum.
We have constructed a novel filter-wheel camera that allows filters to be rapidly and sequentially introduced into the optical path of a high-performance NIR (near-infrared) camera based on a staring focal-plane array (FPA) made with indium gallium arsenide (InGaAs) detectors. The filter wheel is populated with neutral-density filters ranging from a transmission of 0.97 (essentially no attenuation) to ~10^-5 (an ND5 filter stack). The camera acquires images with increasing attenuation of signal in cycles of six images called subframes. Those images are collapsed into a single radiometrically-calibrated image (called a superframe) with a greatly extended dynamic range. In the current configuration of the system, the radiance dynamic range is about 4×10^6, which is equivalent to 22 bits, a significant enhancement over the nominal 14-bit dynamic range of the camera core. This extended range makes it possible to perform radiometric measurements on low-ambient-light scenes with tremendous variability of temperature or radiance, such as rocket launches, laser beams and intense flames. It is also possible to image scenes with high ambient near-infrared light levels, such as landscapes on bright, sunny days, without having to dynamically adjust exposure. Since the wheel rotates at high speed (15 Hz), the resulting dataset of six-frame cycles can be reduced to a superframe movie sequence with a 15 Hz frame rate, making it possible to image spatially-changing scenes such as rocket launches with good image registration between subframes in a given cycle.
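The subframe-to-superframe collapse can be sketched as follows. This is an illustrative reconstruction, not the authors' processing chain: the intermediate ND transmission values and the saturation threshold are assumed, with only the 0.97 and ~10^-5 endpoints taken from the abstract.

```python
import numpy as np

# Assumed six-step ND transmission ladder (endpoints from the abstract,
# intermediate values invented for illustration) and an assumed
# near-full-scale saturation threshold for a 14-bit camera core.
ND_TRANSMISSION = np.array([0.97, 1e-1, 1e-2, 1e-3, 1e-4, 1e-5])
SATURATION = 0.95 * (2**14 - 1)

def superframe(subframes):
    """subframes: (6, H, W) counts taken through increasing attenuation.
    For each pixel, keep the least-attenuated unsaturated sample and
    divide out the filter transmission to put all pixels on a common
    radiometric scale."""
    sub = np.asarray(subframes, dtype=float)
    out = np.full(sub.shape[1:], np.nan)
    for counts, trans in zip(sub, ND_TRANSMISSION):
        usable = np.isnan(out) & (counts < SATURATION)
        out[usable] = counts[usable] / trans   # radiometric rescale
    return out

# A scene spanning ~6 decades: a dim background pixel and a hot plume pixel
true_signal = np.array([[5.0e2, 5.0e8]])
subframes = np.clip(true_signal * ND_TRANSMISSION[:, None, None],
                    0, 2**14 - 1)
print(superframe(subframes))
```

The dim pixel is recovered from the nearly unattenuated subframe, while the hot pixel, saturated in the first five subframes, is recovered from the ND5 subframe, which is how a 14-bit core yields an extended-dynamic-range superframe.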
Radiometric infrared camera systems are often used at test ranges to characterize the IR signature of targets such as aircraft or rockets through significant air columns that reduce the received signal through a combination of absorption and scattering. The dominant effect in clear air is molecular resonant absorption, which is particularly strong in the midwave IR band (3-5 microns) for carbon dioxide and water vapor. Tactical targets can be imaged at standoff distances up to 1000 km or more, but there are many cases where these targets are within a 1 km range, as is the case with a close-in flyby at a test range. Therefore, it is useful to model the short-range atmospheric transmission to predict its effect on radiometric measurements. Many industrial processes that occur in large outdoor facilities also lend themselves to radiometric measurement at standoff ranges of tens or hundreds of meters. This paper compares experimental radiometric data taken at ranges under 1 km to a theoretical model of the atmosphere, and describes a simple method for correcting for air-column effects at these relatively short ranges. The data were collected in the 3-5 micron band using an indium antimonide staring-array camera and a long-focal-length lens combined with radiometric analysis software. The system was calibrated to measure target radiances, but can also be used to estimate target temperatures in cases where the in-band emissivity of the target is well understood. The radiometric data are compared to a model built on MODTRAN code, with conclusions about the attenuation introduced by the atmosphere for standard medium-range imaging systems in "typical" observing conditions. Effects caused by the MTF of the lens system are studied briefly and used to set limits on the minimum number of pixels the target can subtend and still have an accurately measurable radiance.
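The short-range air-column correction can be sketched as inverting a simple measurement equation for the target radiance. The extinction coefficient and path-radiance slope below are invented illustrative numbers; in practice both the transmittance and the path radiance would come from a MODTRAN run for the actual conditions.

```python
import math

SIGMA_PER_KM = 0.25    # assumed 3-5 um extinction coefficient, 1/km (illustrative)
L_PATH_PER_KM = 0.04   # assumed path radiance per km, W m^-2 sr^-1 (illustrative)

def correct_radiance(l_measured, range_km):
    """Invert L_meas = tau * L_target + L_path for the target radiance,
    with a Beer-Lambert transmittance and a linear path-radiance model."""
    tau = math.exp(-SIGMA_PER_KM * range_km)
    l_path = L_PATH_PER_KM * range_km
    return (l_measured - l_path) / tau

# Forward model then invert: a 10 W m^-2 sr^-1 target viewed at 0.8 km
tau = math.exp(-SIGMA_PER_KM * 0.8)
l_meas = tau * 10.0 + L_PATH_PER_KM * 0.8
print(f"recovered target radiance: {correct_radiance(l_meas, 0.8):.3f}")
```

The same inversion applies at longer ranges; only the fidelity of the transmittance and path-radiance estimates changes.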
Radiometric infrared camera systems are most often used to characterize the IR signature of targets (often an aircraft or rocket) through significant air paths that reduce the received signal. Tactical targets can be imaged at standoff distances up to 1000 km or more, but there are many cases where the target is within a 1 km range, as is the case with a close-in flyby at a test range. This paper compares experimental radiometric data to a theoretical model of the atmosphere. The radiometric data were collected in the 3-5 micron band using an indium antimonide staring-array camera and a long-focal-length lens combined with radiometric analysis software. The system was calibrated to measure target radiances, but can also be used to estimate target temperatures in cases where the in-band target emissivity is well understood. The radiometric data are compared to a model built on MODTRAN code, with conclusions about the attenuation introduced by the atmosphere for standard medium-range imaging systems in “typical” observing conditions.
Infrared cameras are often used to capture high-speed digital video of scenes with enormous ranges in in-band brightness. A simple example of this is a rocket launch, a scene which can consist of a cold rocket hardbody and an extremely hot exhaust plume. It can be next to impossible to fully span a scene like this with the brightness dynamic range of an infrared camera (typically ~12-14 bits) at a single exposure value. The brightest or hottest parts of the image will often be saturated, while at the same time the darkest or coldest parts of the scene may be buried in the noise floor of the camera and appear black in the image. Varying the exposure by adjusting the camera to an optimal shutter speed or integration time is necessary to maximize the useful information recorded by the camera. Sometimes, however, a single integration time is not enough to fully encompass a scene’s brightness (temperature) variations. The technique of superframing gets around this problem by exploiting the capabilities of high frame-rate IR cameras. The technique involves cycling a camera through a set of integration times on a frame-by-frame basis, and then combining the resulting “subframes” into single “superframes” with greatly extended dynamic ranges. If the frame rate is sufficiently high, the scene will not change appreciably from one subframe to the next. The technique and some sample data are described in this paper.
The phrase "high-speed imaging" is generally associated with short exposure times, fast frame rates, or both. Supersonic projectiles, for example, are often impossible to see with the unaided eye, and require strobe photography to stop their apparent motion. It is often necessary to image high-speed objects in the infrared region of the spectrum, either to detect them or to measure their surface temperature. Conventional infrared cameras have time constants similar to the human eye, so they, too, are often at a loss when it comes to photographing fast-moving hot targets. Other types of targets or scenes, such as explosions, change very rapidly with time. Visualizing those changes requires an extremely high frame rate combined with short exposure times in order to slow down a dynamic event so that it can be studied and quantified. Recent advances in infrared sensor technology and computing power have pushed the envelope of what is possible to achieve with commercial IR camera systems.
Infrared cameras are often used to capture high-speed digital video of scenes with enormous ranges in in-band brightness. A simple example of this would be a man standing next to a hot fire. Under normal operating conditions, it can be next to impossible to fully span a scene like this with the brightness dynamic range of an infrared camera. The brightest or hottest parts of the image will often be saturated, while at the same time the darkest or coldest parts of the scene may be buried in the noise floor of the camera and appear black in the image. Varying the exposure by changing the integration time is necessary to maximize the useful information recorded by the camera, but sometimes a single integration time is not enough to fully encompass a scene's variations. The technique of superframing consists of varying the integration time of the camera from frame to frame in a cyclic manner, then combining the resulting subframes into single superframes with greatly extended dynamic ranges. The technique and some sample data are described in this paper.
Spectral selection is a powerful technique for enhancing standard infrared imaging systems that have sensitivity over a broad range of the infrared spectrum. The uses of enhanced systems include imaging objects that typically appear transparent to a standard broadband IR imaging system, or imaging through materials that would typically appear opaque. Spectral selection can also be used to detect the presence of various chemical species and to measure their concentration in the atmosphere or in liquid and solid materials. Spectral selection can be achieved through the use of filters or through the use of a filtered illumination source. This paper briefly describes various applications for imaging cameras based on InGaAs, InSb, and QWIP focal plane arrays in conjunction with filters that are both fixed and tunable.
An ultra-low-noise readout IC originally designed for low-background imaging when hybridized with indium gallium arsenide (InGaAs) detectors has instead been combined with indium antimonide (InSb) detectors. This novel focal plane array operates in the 3-5 micron waveband and is capable of imaging the very low backgrounds encountered at extremely short exposure or integration times. Combining the FPA with specialized support electronics that enable precision triggering has resulted in a commercially-available camera system that can take stop-motion thermal images of explosions, supersonic bullets, and other fast projectiles without the need for rotating mirrors or other optomechanical assemblies that are required in a scanning or streak camera system. The camera system can be easily calibrated to measure the in-band radiance of these objects, and it also enables estimates of their surface temperature based on laboratory measurements of emissivity.
High-performance thermal imaging cameras based on indium antimonide (InSb) focal plane arrays (FPAs) offer excellent sensitivity in the midwave infrared (3-5 micron) band. Noise levels below 20 mK enable detection of surface temperature differences of 0.1 C, and the high-speed response of the InSb photodetectors enables the capture of events on time scales as short as 5 microseconds.
Recent advances in ultra-high-performance InGaAs focal plane array (FPA) technology have enabled many new imaging applications in diverse fields. This paper briefly describes the InGaAs FPA technology developed by Indigo Systems Corporation, outlines the performance specifications of a new camera based on this InGaAs FPA, and highlights some of the applications for this technology, including laser beam and material characterization and NIR imaging spectroscopy.
SC1000: Introduction to Infrared and Ultraviolet Imaging Technology
Information on applications for infrared and ultraviolet imaging systems tends to be scanty and widely dispersed.
This is because camera manufacturers tend to focus on the products themselves, not applications. It is also because most textbooks on IR and UV technology are outdated and tend to emphasize the basics of radiometry and detection by single detectors, not imaging applications. Finally, industrial users of these cameras are often close-mouthed about what they are doing with them.
This course gives a non-technical overview of commercial infrared and ultraviolet camera systems, the "taxonomy" of infrared and ultraviolet wavebands, and the wide variety of applications for these wavebands. The course relies heavily on interesting imagery captured by the presenter over the last ten years and uses a SPIE monograph written by the author as a supplementary textbook.
The course will cover a wide range of IR and UV technologies and applications, including:
- CCDs and CMOS detectors
- Infrared focal plane arrays
- Cooled and uncooled infrared sensors
- Infrared Radiometry
- Thermography
- Industrial inspection
- Research and Development applications
- Corona detection and shortwave UV imaging
SC950: Infrared Imaging Radiometry with Applications to Target Measurement
This course will enable the user to understand how an infrared camera system can be calibrated to measure radiance, radiant intensity and apparent temperatures of targets and scenes, and how the camera’s digital data is converted into radiometric data. The user will learn how to perform their own external, "by hand" calibrations on a science-grade infrared camera system using area or cavity blackbodies and an Excel spreadsheet provided by the instructor. The influences of lenses, ND and bandpass filters, windows, emissivity, reflections and atmospheric absorption on the system calibration will be covered. The instructor will use software to illustrate these concepts and will show how to measure emissivity using an infrared camera and how to predict system performance outside the calibration range.
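The external "by hand" calibration the course describes can be sketched as a two-point linear fit between known blackbody radiances and measured camera counts, then inverted to turn counts into radiance. The blackbody radiances and counts below are made-up illustrative values, not real calibration data.

```python
# Hedged sketch of a two-point radiometric calibration: view two
# blackbodies of known in-band radiance, fit the linear counts-vs-radiance
# response, then invert it.  All numbers are illustrative assumptions.

def fit_two_point(l_cold, counts_cold, l_hot, counts_hot):
    """Return (gain, offset) such that counts = gain * radiance + offset."""
    gain = (counts_hot - counts_cold) / (l_hot - l_cold)
    offset = counts_cold - gain * l_cold
    return gain, offset

def counts_to_radiance(counts, gain, offset):
    """Invert the linear response to recover in-band radiance."""
    return (counts - offset) / gain

# Example: two area blackbodies with assumed in-band radiances (arb. units)
gain, offset = fit_two_point(l_cold=1.2, counts_cold=3200.0,
                             l_hot=4.8, counts_hot=11800.0)
print(f"gain = {gain:.1f} counts per unit radiance, offset = {offset:.1f}")
print(f"9000 counts -> radiance {counts_to_radiance(9000.0, gain, offset):.3f}")
```

Filters, windows, emissivity, and atmospheric absorption enter as additional scale factors and offsets on this same linear model, which is why the course treats them as modifications to the baseline calibration.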
This course provides attendees with an overview of the diverse range of applications for NIR and SWIR imaging systems and how these systems are calibrated and characterized. The emphasis is on the capabilities of InGaAs and InSb sensors operating in the 0.7 to 3.0 micron NIR and SWIR bands with discussions of optics and tunable filter technology. Discussion will also include extended InGaAs and VisGaAs, a sensor material with both visible and NIR response.
From near-infrared security cameras above your front door to thermal infrared camera accessories that mount to smartphones, infrared imaging technology is everywhere in 2017. But there is still confusion and misinformation about what it is and what it can and cannot do. This 2-hour, high-level introduction to the topic, with minimal math or physics knowledge required, is for the growing number of non-specialists who need to understand infrared imaging technology and its many applications. The presentation materials consist of infrared images from the instructor’s extensive library, the stories these images tell us, how they are made, and how the technology and the phenomena it captures relate to the more familiar realm of visible-light cameras and human vision.
SC396: Imaging Throughout the Electromagnetic Spectrum
This short course is a presentation of hundreds of images taken with electronic sensors that cover the spectrum from radio to gamma rays, along with discussions of the sensor technology used to make the images and its applications. The course is non-technical in nature, but will be of interest to imaging professionals and laypersons alike.