The quality of an imaging system can be assessed through controlled, objective laboratory measurements. All imaging measurements currently require some form of digitization in order to evaluate a metric. Depending on the device, the number of bits available relative to a fixed dynamic range determines the severity of quantization artifacts. From a measurement standpoint, measurements should be performed at the highest available bit depth. In this correspondence, we describe the relationship between higher and lower bit-depth measurements. The limits to which quantization alters the observed measurements are presented. Specifically, we address dynamic range, MTF, SiTF, and noise. Our results provide guidelines for how systems of lower bit depth should be characterized and for the corresponding experimental methods.
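As a rough illustration of how bit depth interacts with a noise measurement, the following Python sketch (all values hypothetical; a uniform quantizer over a fixed [0, 1] dynamic range is assumed) quantizes a simulated flat-field capture at several bit depths and reports the measured temporal noise:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(signal, bits, full_scale=1.0):
    """Uniformly quantize a signal spanning [0, full_scale] to 2**bits levels."""
    step = full_scale / (2**bits - 1)
    return np.round(signal / step) * step

# Simulated flat-field capture: mean 0.5, true temporal noise sigma = 2e-3
truth = 0.5 + 2e-3 * rng.standard_normal(100_000)

for bits in (14, 10, 8, 6):
    measured = np.std(quantize(truth, bits))
    print(f"{bits:2d}-bit measured noise: {measured:.5f}")
```

At high bit depth the measured standard deviation matches the true value; once the quantizer step becomes large relative to the noise, the measurement is distorted. In this particular sketch the 6-bit result is inflated, because the mean lands near a quantizer decision boundary, which illustrates why measurements should be taken at the highest available bit depth.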
Typically, a system-level characterization of a thermal imaging device includes characterizing the objective optics, detector, and readout electronics. Ultimately, the thermal imagery is converted to an 8-bit signal and presented on a display for human visual consumption. In some situations, direct characterization of the pre-sample imaging system is not possible, and measurements must be performed by analyzing the output of its display. Additionally, the display and display optics are significant contributors to the performance of the imaging system, yet both are assumed to be ideal in many respects. In this paper, we describe how the underlying imaging system non-uniformity combines with additional display contributions to form the total system non-uniformity. This paper is divided into three parts: the technique and considerations needed to properly measure a system through its display, how this information can be used in the NV-IPM performance model, and a comparison of performance derived from measurements at the pre-sample readout versus measurements taken only at the display.
When the difficulties of new and unique tasks must be determined, it is important to use methodologies consistent with previous research. Unfortunately, some new tasks break the paradigm of past research and require new techniques to properly determine their difficulty. This paper describes the process of determining difficulty for tasks that are unique both because they have a null case (where no object or motion is present) and because they must be quantified in environments that potentially contain high amounts of atmospheric turbulence. Because each of the calculated V50s was based upon an assumption, a secondary field collection was necessary to validate which model assumptions correlated properly with field performance data.
Thermal systems with a narrow spectral bandpass and mid-wave thermal imagers are useful for a variety of imaging
applications. The sensitivity of these classes of systems is increasing, along with the performance requirements
they must meet when evaluated in the laboratory. Unfortunately, uncertainty in the blackbody temperature,
together with the temporal instability of the blackbody, can introduce uncontrolled laboratory environmental
effects that increase the measured noise. If the temporal instability and accuracy of a particular blackbody
are known, then confidence intervals can be adjusted for source accuracy and instability. Additionally, because
thermal currents may be a large source of temporal noise in narrow-band systems, a means to mitigate them is
presented and results are discussed.
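The adjustment for source instability can be sketched in a few lines of Python. Assuming the sensor noise and the blackbody instability are independent, they add in quadrature; the figures below are hypothetical placeholders, not measurements of any particular blackbody:

```python
import math

# Hypothetical values -- substitute a specific blackbody's datasheet numbers.
sensor_noise = 0.020        # true sensor temporal noise, in equivalent kelvin (1-sigma)
source_instability = 0.010  # blackbody short-term temperature instability (K, 1-sigma)

# Independent noise sources add in quadrature, so the raw measurement is inflated:
measured = math.sqrt(sensor_noise**2 + source_instability**2)

# Knowing the instability lets the sensor-only noise be recovered:
corrected = math.sqrt(measured**2 - source_instability**2)

print(f"measured {measured*1000:.1f} mK, corrected {corrected*1000:.1f} mK")
```

In this example the uncorrected measurement overstates the sensor noise by roughly 12%, which is the kind of bias the confidence-interval adjustment above is meant to remove.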
Laboratory measurements on thermal imaging systems are critical to understanding their performance in a field
environment. However, it is rarely a straightforward process to directly inject laboratory measurements into
thermal performance modeling software and obtain meaningful results. Sources of discrepancy between
laboratory and field measurements include sensor gain and level, dynamic range, the sensor display and its
brightness, and the environment in which the sensor operates. If the aforementioned parameters can be
measured, a more accurate description of sensor performance in a particular environment is possible. This
research also includes the procedure for turning both laboratory and field measurements into a system model.
When evaluated with a spatially uniform irradiance, an imaging sensor exhibits both spatial and temporal variations,
which can be described as a three-dimensional (3D) random process considered as noise. In the 1990s, NVESD
engineers developed an approximation to the 3D power spectral density (PSD) for noise in imaging systems, known as
3D noise. In this correspondence, we describe how the confidence intervals for the 3D noise measurement allow
the sampling necessary to reach a desired precision to be determined. We then apply that knowledge to create a smaller
cube that can be evaluated spatially across the 2D image, giving the noise as a function of position. The method
presented here allows for defective pixel identification and applies the finite-sampling correction matrix. In
support of the reproducible research effort, the Matlab functions associated with this work can be found on the
Mathworks file exchange.
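For orientation, a minimal Python sketch of the classic 3D noise directional-averaging decomposition is shown below. This is not the released Matlab code: it omits the finite-sampling correction matrix and the confidence intervals discussed above, and simply splits a (frames, rows, cols) cube into the seven standard components:

```python
import numpy as np

def three_d_noise(cube):
    """Estimate 3D noise components from a (frames, rows, cols) data cube.

    Simplified sketch of the NVESD directional-averaging decomposition;
    no finite-sampling correction is applied.
    """
    T, V, H = cube.shape
    S = cube.mean()                # global mean
    m_t  = cube.mean(axis=(1, 2))  # per-frame means
    m_v  = cube.mean(axis=(0, 2))  # per-row means
    m_h  = cube.mean(axis=(0, 1))  # per-column means
    m_tv = cube.mean(axis=2)       # frame-row means
    m_th = cube.mean(axis=1)       # frame-column means
    m_vh = cube.mean(axis=0)       # spatial fixed-pattern image

    n_t  = m_t - S
    n_v  = m_v - S
    n_h  = m_h - S
    n_tv = m_tv - m_t[:, None] - m_v[None, :] + S
    n_th = m_th - m_t[:, None] - m_h[None, :] + S
    n_vh = m_vh - m_v[:, None] - m_h[None, :] + S
    n_tvh = (cube - S
             - n_t[:, None, None] - n_v[None, :, None] - n_h[None, None, :]
             - n_tv[:, :, None] - n_th[:, None, :] - n_vh[None, :, :])
    comps = dict(t=n_t, v=n_v, h=n_h, tv=n_tv, th=n_th, vh=n_vh, tvh=n_tvh)
    return {k: float(np.std(v)) for k, v in comps.items()}

# Example: a cube containing only a vertical (row) fixed pattern
pattern = np.array([-1.0, 0.0, 1.0])
cube = np.tile(pattern[None, :, None], (4, 1, 5))
components = three_d_noise(cube)
```

Running the decomposition on a smaller sub-cube at each image position, as described above, is then just a matter of applying this function over a sliding window.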
Researchers at the US Army Night Vision and Electronic Sensors Directorate have added the functionality of Machine Vision MRT (MV-MRT) to the NVLabCap software package. While the original calculations of MV-MRT were compared to human observers' performance using digital imagery in a previous effort,<sup>1</sup> the technical approach was not tested on 8-bit imagery from a variety of sensors at a variety of gain and level settings. Now that it is simpler to determine the MV-MRT for a sensor at multiple gain settings, it is prudent to compare MV-MRT results at multiple gain settings to the performance of human observers for thermal imaging systems that are linear and shift-invariant. Here, a comparison of the results for a LWIR system to those of trained human observers is presented.
Engineers at the US Army Night Vision and Electronic Sensors Directorate have recently developed a software package called NVLabCap. This software not only captures sequential frames from thermal and visible sensors, but also can perform measurements of the signal intensity transfer function, 3-dimensional noise, field of view, super-resolved modulation transfer function, and image boresight. Additionally, this software package, along with a set of commonly known inputs for a given thermal imaging sensor, can be used to automatically create an NV-IPM element for that measured system. These model data can be used to determine whether a sensor under test is within certain tolerances, and the model can be used to objectively quantify measured versus specified system performance.
The GStreamer architecture allows for simple modularized processing. Individual GStreamer elements have been
developed that allow for control, measurement, and ramping of a blackbody, for capturing continuous imagery
from a sensor, for segmenting out an MRTD target, for applying a blur equivalent to that of a human eye and a
display, and for thresholding a processed target contrast for "calling" it. A discussion of each of the components
will be followed by an analysis of its performance relative to that of human observers.
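As a loose illustration of the final two stages of such a pipeline, the sketch below applies a blur and thresholds the target contrast to "call" the target. It is written outside of GStreamer, with entirely hypothetical parameters, and stands in a Gaussian blur for the eye/display response:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1D Gaussian kernel, normalized to unit sum."""
    radius = radius or int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur(image, sigma):
    """Separable Gaussian blur standing in for the eye + display response."""
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

def call_target(image, background_level, threshold):
    """'Call' the target when its blurred contrast exceeds a threshold."""
    contrast = (image.max() - background_level) / max(background_level, 1e-12)
    return bool(contrast >= threshold)

# Hypothetical scene: a bright square standing in for a segmented MRTD target
img = np.full((64, 64), 0.5)
img[28:36, 28:36] = 0.6
blurred = blur(img, sigma=2.0)
print(call_target(blurred, background_level=0.5, threshold=0.05))
```

In the actual GStreamer implementation each of these operations lives in its own element, so stages can be rearranged or swapped without touching the rest of the pipeline.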
Multiple-source-band image fusion can be a multi-step process consisting of several intermediate
image processing steps. Typically, each of these steps must be in a particular arrangement in order to
produce the desired output image. GStreamer is an open-source, cross-platform multimedia framework, and using
this framework, engineers at NVESD have produced a software package that allows real-time manipulation
of processing steps for rapid prototyping in image fusion.
Recent developments in image fusion give the user community many options for ways of presenting the imagery to
an end-user. Individuals at the US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate
have developed an electronic system that allows users to quickly and efficiently determine optimal image fusion
algorithms and color parameters based upon imagery and videos collected from environments typical of
military operations. After performing multiple multi-band data collections in a variety
of military-like scenarios, different waveband, fusion algorithm, image post-processing, and color choices are
presented to observers as an output of the fusion system. The observer preferences can give guidelines as to how
specific scenarios should affect the presentation of fused imagery.
The US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) sensor performance
models predict the ability of soldiers to perform a specified military discrimination task using an EO/IR sensor system.
Increasingly, EO/IR systems are being used on manned and unmanned aircraft for surveillance and target acquisition
tasks. In response to this emerging requirement, the NVESD Modeling and Simulation division has been tasked to
compare target identification performance between ground-to-ground and air-to-ground platforms for both IR and visible
spectra for a set of wheeled utility vehicles. To measure performance, several forced choice experiments were designed
and administered and the results analyzed. This paper describes these experiments and reports the results as well as the
NVTherm model calibration factors derived for the infrared imagery.
Real MWIR Persistent Surveillance (PS) data were taken with a single human walking from a known point to different tents in the PS sensor field of view. The spatial resolution (ground sample distance) and revisit rate were varied from 0.5 to 2 meters and from 1/8 to 4 Hz, respectively. A perception experiment was conducted in which the observer was tasked to track the human to the terminal (end-of-route) tent. The probability of track is provided as a function of ground sample distance and revisit rate. These results can help determine PS design requirements for tracking and back-tracking humans on the ground. This paper begins with a summary of two previous simulation experiments: one for human tracking and one for vehicle tracking.