An error analysis for IR imaging radiometers is summarized. A total of 14 different sources of error are considered in the determination of a total cumulative error. Since the accuracy of a radiometer is the best judge of its usefulness, such an error analysis can be used in the design of an imaging radiometric system. Trade-off studies can be performed so as to minimize the total cumulative error of the system. Examples of how minimizing the total error impacts the design of a number of radiometer sub-systems are presented.
Degradation of laser beam quality by special effects is considered to be of increasing importance for investigations concerning combat field communications. An 8-bit transient memory device has been developed which allows storage of a series of up to 15 laser beam intensity profiles using a CCD linear array of 1754 diodes with a spatial resolution of 10 micrometers. The shortest time interval between consecutive profiles amounts to 2 ms. Data reduction of the measured profiles can be achieved by a best fit of a Gaussian normal distribution with four parameters representing bias level, peak amplitude, width (FWHM), and peak position. This procedure was applied to helium-neon laser radiation after transmission through the gas blast expanding from a powder gun. Two different experimental arrangements have been realized so far. The first comprises a large vessel which limits the expansion of the combustion products from a 20 mm-bore gun, with the laser beam traversing the barrel axis in front of the muzzle. The second set-up allows free gas blast expansion from a 40 mm-bore gun, the laser beam being adjusted parallel to the barrel axis. For both cases, the time behavior of beam extinction, broadening, and wandering is reported. Absorption and scattering of radiation by shock waves, turbulent structures, and aerosols in the exhaust cause considerable temporary alterations: peak intensity attenuation down to 0.1, beam width reaching up to twice its initial value, and beam deflection up to 2 mrad.
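The four-parameter Gaussian data reduction described above can be sketched as a nonlinear least-squares fit. The model and the initial-guess heuristics below are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.optimize import curve_fit

def profile_model(x, bias, amp, fwhm, x0):
    """Gaussian with the four parameters named in the abstract:
    bias level, peak amplitude, width (FWHM), and peak position."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return bias + amp * np.exp(-0.5 * ((x - x0) / sigma) ** 2)

def fit_profile(x, intensity):
    """Best-fit the four-parameter Gaussian to one diode-array profile."""
    p0 = [intensity.min(),                    # bias level guess
          intensity.max() - intensity.min(),  # peak amplitude guess
          (x[-1] - x[0]) / 10.0,              # FWHM guess
          x[np.argmax(intensity)]]            # peak position guess
    popt, _ = curve_fit(profile_model, x, intensity, p0=p0)
    return popt  # [bias, amp, fwhm, x0]
```

A 1754-element profile at 10-micrometer pitch would be fitted per stored frame, yielding four numbers per profile for the time-series analysis.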
An experimental tool and set of experiments for estimating the short and long exposure atmospheric modulation transfer function (AMTF) in the infrared are described. Measurements are presented using a new technique of simultaneously comparing close-up infrared images with optically matched distant images to isolate distortions due to atmospheric turbulence. A unique large area (1.8 m × 1.8 m) uniform temperature blackbody with spatial bar patterns is used as a target for the experiments. The AMTF is measured as a function of changing atmospheric conditions, with thermally induced turbulence assessed in terms of changes in the measured AMTF. Additionally, imagery obtained during quiescent atmospheric conditions is used to characterize the system transfer function of the near-field imager and enhance the near-field image spatial resolution.
Observations in the infrared wavelength band between 8 and 12 micrometers of sea backgrounds have been recorded with a CCIR-compatible imager for a large number of sea states (0-6). Recordings took place in coastal areas as well as on open seas. The behavior of clutter in the infrared data was analyzed in space and in time. Clutter values are analyzed to give estimates of the physical appearance of the sea, such as wave structure. Elevation profiles are evaluated for sea state characteristics and show that the variation of the RMS with elevation (−dRMS/dε) decreases with increasing sea state number. Infrared sequences are used to derive periods in the RMS clutter values and to derive time constants of about 1 second for the images in a sequence to become uncorrelated. A constant azimuthal wave velocity is found from the radiance images. Sunglint images were recorded with FEL-TNO's Multi Path Transmissometer Radiometer (MPTR) simultaneously in six wavelength bands at 0.6, 0.8, 1.3, 2.1, 4.0, and 10.0 micrometers in coastal areas. The presented images are very similar in all six bands, with a pattern width of 8.8 degrees. The elevation-averaged profiles are well fitted by a Gaussian pattern. Characteristic hotspot duration times were estimated to be 0.15 to 0.20 seconds. The spectral correlation is investigated and found to be present only on a global scale.
In support of a continuing program of evaluation and experimental validation of FLIR Tactical Decision Aid performance codes, a series of measurements has been made of ship radiance temperature distributions together with sea and sky backgrounds. The measurements have been made at ranges from one quarter to one mile off the coastline in Monterey Bay, using a land-mounted Agema 780 dual band Thermovision radiometric sensor, with computer data acquisition and storage. The target ship was the research vessel Point Sur carrying a full suite of meteorological instruments and an array of thermal sensors for ship surface temperature distribution. Rawinsonde balloons were released to obtain vertical temperature and humidity profiles for path correction using LOWTRAN. The normal skin emissivity was measured in a separate experiment. The current data bank consists of 898 stored radiometric frames containing ship images including starboard, port, bow and stern aspects, together with sea and sky background frames with varied zenith angle. These files are available for false color display and analysis in a variety of formats.
This paper proposes a procedure for the numerical evaluation of the efficiency of countermeasures in the thermal infrared. This procedure consists of three phases. In the first phase, the characteristics of different thermal camouflage materials are tested on a lab scale. These tests comprise measurements of the attenuation of the incident infrared energy and/or of the thermal emissivity factor. With respect to the attenuation measurement, a calibrated infrared sensor is used to determine the radiation patterns of an object. The comparison of these patterns before and after the application of a camouflage system gives an absolute measure of its attenuation. The result of this measurement is important since the attenuation is closely related to the contrast between the camouflaged object and its background and thus to the probability of detection. Contrast, however, is not the only important feature for the detection of an object in a thermal image. That is why, in a second phase, the countermeasures under evaluation are tested in a real environment. During this phase, a numerical value is given to the efficiency of the considered camouflage in the thermal infrared, using features selected from those which are known to be important for human vision. These include, besides contrast, other features such as correlation and texture. The third and final phase aims at a verification and a validation of the test results. Indeed, it is of crucial importance to find a link between the performances obtained in the field and the characteristics measured on a lab scale. It is also necessary to verify that a good correlation exists between the efficiency as determined by human observers and by numerical evaluation. This is now taking place using a database of thermal images taken with a GEC Avionics TICM II (8 to 12 micrometers). These images are then presented to human observers and to the machine in a project called Psychotest.
Target acquisition performance of human observers is quantified in experiments with real target imagery. It is possible to describe the results with two basic laws of human visual perception. It is shown that recognition of military targets is equivalent to the detection of a circular disc of a certain size and contrast. The size of this disc is characteristic of the difficulty of the observation task. Once the critical disc parameters for a given task are known, the target recognition distance or probability can be calculated. This approach has been termed the 'equivalent disc approach' and has resulted in a target acquisition model for the visible spectrum called PHIND (Photons In Noise Detectability). It will be shown that this approach can in principle be extended to the thermal infrared spectrum. The 'disc detection' concept incorporates the effects of resolution, low contrast, low luminance, and noise.
Existing software packages for target/background signature analysis generate radiometrically accurate results, but may be difficult to use because of the complexity involved in coordinating model inputs and in 'mating' intermediate model outputs into a final result. In addition, tools used to display and analyze model results are not always available on the model's host computer. Horizons Technology, Inc. (HTI), has undertaken the development of integrated workstations for the analysis of earth background and target signatures in the visible and infrared wavelengths. The first of these products, available now, is the Infrared Atmospherics and Signatures Prediction Model (IASPM). IASPM is hosted on 80386 and 80486 PC/AT desktop computers, and incorporates three widely used electro-optical simulation codes: LOWTRAN (atmospheric propagation), SPIRITS (target signature), and EOSSIM (sensor performance). Also included are image display and analysis utilities. IASPM is controlled by an interactive menu-driven graphical user interface which assists the analyst in selecting, comparing, and then analyzing meaningful target/background image scenarios. IASPM can also be used to display, analyze, and extend measured imagery into new and unmeasured cases.
This paper proposes a new method for evaluating the efficiency of countermeasures in the thermal infrared. Thermal camouflage materials are developed to attenuate the incident infrared energy and to change the emissivity factor. These effects have a great influence on the detectability of the object in its background. To give a numerical value to the efficiency of the considered camouflage, a software environment has been developed which delineates the target from its background and makes a comparison using features selected from those which are known to be important in human vision. To find the edge of the target, a semi-automatic approach is introduced based on a breadth-first search in a directed graph. The edge finding problem is translated into a search for a cyclic path through the graph, using an initial estimate. This method gives a very reliable contour and makes it possible to give a realistic evaluation of the blending of the object into its background.
This paper illustrates the use of fractal geometry and fractal metrics for analysis and characterization of natural textures and clutter in IR images in the wavelength band of 2-5 micrometers. In addition to the local fractal dimension, the lacunarity of textures is also briefly investigated. The addition of lacunarity significantly improves the pattern classification performance and is an important part of a complete fractal description of natural textures. A new measurement technique, based on the statistics of a space-filling curve, is presented. Specifically, a space-filling scan of an image texture is used to estimate the fractal dimension of the corresponding intensity surface. This unique one-dimensional representation is also used for measuring local texture features such as granularity and lacunarity.
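The space-filling-scan idea can be sketched as follows: traverse the image along a Hilbert curve to obtain a 1-D trace, estimate the Hurst-type scaling exponent of its increments, and convert that to a fractal dimension. This is a minimal illustration of the approach, not the authors' exact estimator, and the D = 2 − H conversion is an assumption:

```python
import numpy as np

def d2xy(order, d):
    """Distance d along a Hilbert curve of side 2**order -> (x, y)
    (standard bit-manipulation construction)."""
    x = y = 0
    s, t = 1, d
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x  # rotate quadrant
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_fractal_dimension(img):
    """Scan a square 2**k x 2**k image along a Hilbert curve, fit the
    scaling exponent H of the mean absolute increments
    E|z(i+k) - z(i)| ~ k**H, and report D = 2 - H for the trace."""
    order = int(round(np.log2(img.shape[0])))
    n = 1 << (2 * order)
    trace = np.array([img[d2xy(order, d)] for d in range(n)], dtype=float)
    lags = np.array([1, 2, 4, 8, 16])
    incr = np.array([np.abs(trace[k:] - trace[:-k]).mean() for k in lags])
    hurst = np.polyfit(np.log(lags), np.log(incr), 1)[0]
    return 2.0 - hurst
```

Lacunarity could be measured on the same 1-D trace with a gliding-box count, which is one reason the space-filling representation is convenient.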
Numerical estimates of correlation lengths occurring in natural and synthetically generated IR two-dimensional digital scenes, both below (BTH) and above (ATH) the horizon, are presented. Diagnostics and generation of synthetic scenes for arbitrary correlation length to pixel ratio are analyzed for their dependence on scene dimension. New equations, useful for estimating scene correlation lengths from the large and small correlation length components of a single cloud top altitude multiscale BTH broken cloud scene, are given. These equations are valid for a minus-two-exponent power-law Power Spectral Density (PSD) corresponding to a one-dimensional cut in the scene data. The physical components are defined by the intrinsic cloud top statistics, the edge statistics due to the jump in mean radiance from cloud top to ground, and the intrinsic ground statistics. Other power laws as well as integral- and micro-scales are considered. Footprint effects are included. The scene correlation length estimate is expected to be useful as a prediction tool and as a diagnostic where component radiance standard deviations and correlation lengths are known. The methodology is based on PSD and Auto Correlation Function (ACF) measurements. Implementation of scene analysis by statistical methods at Lockheed Palo Alto Research Laboratory (LPARL) is highly interactive and graphically oriented. A complete statistical analysis set includes hard copy scene imagery; radiance profiles; histogram; orthogonal, ensemble-averaged, one-dimensional PSDs and ACFs; and the estimation of correlation lengths. A typical set run time for a 512 by 512 pixel scene is ten minutes on a VAXstation 3200. The statistics of any rectangular local scene of arbitrary 2^n dimension can also be obtained.
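A correlation length estimate of the PSD/ACF kind described above can be sketched for a 1-D radiance cut: compute the ACF from the spectrum (Wiener-Khinchin) and take the lag at which it first falls below 1/e. The 1/e criterion is a common convention, assumed here rather than taken from the paper:

```python
import numpy as np

def correlation_length(line, pixel_size=1.0):
    """Estimate the correlation length of a 1-D radiance profile as
    the first lag where the normalized ACF drops below 1/e.  The ACF
    is obtained from the power spectrum via the inverse FFT."""
    x = np.asarray(line, dtype=float)
    x = x - x.mean()
    n = len(x)
    # zero-pad to 2n to avoid circular wrap-around in the ACF
    f = np.fft.rfft(x, 2 * n)
    acf = np.fft.irfft(f * np.conj(f))[:n]
    acf /= acf[0]
    below = np.nonzero(acf < 1.0 / np.e)[0]
    lag = below[0] if below.size else n
    return lag * pixel_size
```

For a 2-D scene, the same estimator applied to ensemble-averaged row and column cuts gives the horizontal and vertical correlation lengths separately.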
The power or variance spectral properties of the ocean surface in the infrared have been investigated by a number of workers in the past. This paper summarizes the results of an analysis which extends previous results, obtained at spatial resolutions measured in tens of meters, to centimeter-scale spatial wavelengths. The data were obtained with a common module HgCdTe FLIR radiometer with high spatial resolution (< 0.25 milliradians) and moderately high thermal sensitivity (<< 0.1 degrees C). The spatial resolution of the FLIR and the large number of available scan lines per scene, for ensemble spectral averaging, permit computation of power spectra at centimeter scales with extremely low estimation variance. Prior to computation of the power spectra, the data were radiometrically corrected for instrument effects, atmospheric path radiance and transmission losses, and the OTF of the sensor. Both one- and two-dimensional power spectra were computed and parametrically evaluated by spectral factorization and characterized via linear predictive coding (LPC) as multipole autoregressive (AR) linear system models. This type of spectral characterization facilitates (1) correlation of dominant spectral components (i.e., system poles) with specific wavenumber regimes and ocean surface models; (2) simple characterization of large numbers of ocean surface spectra with a few parameters; and (3) generation of two-dimensional radiance map simulations by inversion of the linear system model. The results of this analysis are presented along with conclusions regarding agreement with previous studies, utility of the technique, and model validity.
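The LPC/AR characterization amounts to fitting autoregressive coefficients whose poles summarize the spectrum with a few parameters. A minimal Yule-Walker sketch (not the authors' implementation) for a 1-D scan line:

```python
import numpy as np

def ar_fit(signal, order):
    """Fit an order-p autoregressive (AR) model via the Yule-Walker /
    LPC normal equations.  The coefficients define system poles that
    compactly characterize the measured power spectrum."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    n = len(x)
    # biased autocorrelation estimates r[0..order]
    r = np.correlate(x, x, mode='full')[n - 1:n + order] / n
    R = np.array([[r[abs(i - j)] for j in range(order)]
                  for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    return a  # x[t] ~ sum_k a[k] * x[t-1-k] + white noise
```

Driving the fitted AR filter with white noise inverts the model, which is essentially item (3) above: generating radiance map simulations from the linear system.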
UFLR is one of an evolving set of FLIR performance prediction programs used at sea to predict the ranges for detection, classification, and identification of target ships. One aid in the validation of such a program is a sensitivity analysis of the program parameters. Sensitivity analyses indicate that the ranges for detection, classification, and identification are strongly sensitive to target area, target-to-background temperature difference, and atmospheric conditions such as windspeed, visibility, humidity, and the vertical temperature, humidity, and pressure profiles. One uncontrollable parameter is the noncontiguity in space and time of the radiosonde and FLIR measurements. This problem was investigated by dithering the radiosonde data input to UFLR with a random number generator to generate variations in the pressure, temperature, and relative humidity in the atmospheric profile. Results indicate that noncontiguity of measurements can lead to errors of up to 50% in range predictions.
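The dithering step can be sketched as adding zero-mean Gaussian perturbations to each level of the sounding before it is fed to the prediction code. The sigma values below are illustrative assumptions, not the magnitudes used in the study:

```python
import numpy as np

def dither_profile(pressure, temperature, rel_humidity,
                   sig_p=1.0, sig_t=0.5, sig_rh=3.0, seed=None):
    """Perturb a radiosonde profile with zero-mean Gaussian noise to
    emulate space/time noncontiguity between the sounding and the
    FLIR measurement.  Units: hPa, K, percent (assumed)."""
    rng = np.random.default_rng(seed)
    p = pressure + rng.normal(0.0, sig_p, np.shape(pressure))
    t = temperature + rng.normal(0.0, sig_t, np.shape(temperature))
    rh = rel_humidity + rng.normal(0.0, sig_rh, np.shape(rel_humidity))
    return p, t, np.clip(rh, 0.0, 100.0)  # keep RH physical
```

Running the range prediction over an ensemble of dithered profiles then gives the spread in predicted range attributable to the noncontiguity.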
Time and spatial variations in atmospheric aerosol plumes can be factors in the effectiveness of electro-optical (EO) systems. Of particular interest are obscuration statistics including the frequency and duration of optically thick and optically thin cloud regions that intermittently affect sensor operation. Related to this problem is an increasing use of modern visualization techniques in simulations and computer graphics to portray realistic looking clouds. It is obviously important that such visualizations accurately model obscuration statistics if they are to be used in EO design and evaluation. Most plume models in use are either based on well-established, physical descriptions of mean aerosol transport and diffusion, buoyancy and radiative transfer, or they are based on modern artistic computer image texturing to provide variations in cloud appearance. The former, traditional methods visualize aerosol plumes as smooth distributions resulting from time-averaged contributions of turbulence to the mean flow. The latter computer graphics techniques visualize instantaneous spatial and time variations. However, the graphics techniques are usually devoid of much physics, using only the subjective criterion of 'looking realistic' as a validation basis. This paper begins a process of reconciling the two approaches. A method for texturing the mean plume concentrations is examined that has much in common with fractal-based texturing in computer graphics. The method also includes time-dependent evolution of eddy phases within the plume. Results of actual transmittance measurements of smoke and dust plumes are presented that show an apparent measured change in fractal dimension or Hurst parameter with increasing distance or time separations. This behavior is discussed in terms of contributions from different eddy sizes to the path integrated concentration.
It is then shown that the fixed fractal dimension of the concentration texturing scheme can be reconciled with the changing dimension in the transmittance data if one is careful to modify the texture model to account for the fact that we see real plumes by the transmittance of the background radiance through the cloud and from the path radiance produced by scattering or emission from the cloud itself. Additional efforts to analyze images to include the scattering and emission contributions to cloud statistics are briefly discussed.
This paper describes the development of a 3-D statistically non-stationary earthlimb stochastic structure representation and its application to the implementation of earthlimb infrared (IR) structured scenes. The stochastic structure overlay is constructed from a 2-D diagonal cut through a 3-D matrix of correlated Gaussian deviates: each plane of the matrix represents a statistically constant representation of the two-dimensional correlation lengths at a given altitude above the earth. Each matrix plane is generated by using successive 2-D FFT routines that have empirical values of vertical and horizontal correlation lengths as input. The total deviate variance versus altitude is then scaled from empirical measurements of fluctuations in atmospheric density, temperature, and/or emissivity. Scene correlation lengths are validated by performing diagnostics on the overlays. These structure generators have been used both as perturbations on input atmospheric data to earthlimb IR radiance codes and as high resolution overlays to earthlimb IR mean radiance 2-D scenes. Examples of modeling results are presented and discussed.
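One plane of such a matrix can be sketched by filtering white Gaussian noise in the Fourier domain with an envelope whose widths set the horizontal and vertical correlation lengths. The Gaussian spectral envelope below is an illustrative choice, not necessarily the filter used in the paper:

```python
import numpy as np

def correlated_deviates(n, lx, ly, seed=None):
    """Generate an n x n plane of correlated Gaussian deviates by 2-D
    FFT filtering of white noise.  lx and ly (in pixels) control the
    horizontal and vertical correlation lengths via the widths of a
    Gaussian spectral envelope."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)[None, :]   # cycles/pixel, horizontal
    ky = np.fft.fftfreq(n)[:, None]   # cycles/pixel, vertical
    envelope = np.exp(-(np.pi ** 2) * ((kx * lx) ** 2 + (ky * ly) ** 2))
    field = np.fft.ifft2(np.fft.fft2(white) * envelope).real
    return field / field.std()  # unit variance; rescale per altitude
```

Stacking planes generated with altitude-dependent (lx, ly) and scaling each by the empirical variance profile gives the 3-D matrix from which the diagonal-cut overlay is taken.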
The problem of calculating an effective contrast for diffused targets in IR scenery is addressed. The effect of local variations of intensity in the vicinity of the target is addressed by considering the distribution of the intensity of the edges. The effect of global clutter is discussed from the vantage point of selecting the best model for background IR scenery. Generalized contrast measures are described and applied to real scenes.
A model for generating synthetic images of clouds, in the IR and the visible bands, has been developed in order to obtain sky background images for various atmospheric conditions. The model is based on fractals, radiometry, ray-tracing techniques, and atmospheric propagation. The cloud reliefs are generated (using a fractal-based technique) in multiple window planes defined by the cloud statistics (ceiling, thickness, density, etc.). After that, for each cloud, each radiance component (self-emitted radiance, reflected sun and ground or sea radiation, transmitted radiance, etc.) is calculated separately and summed with the other radiance components. The final image is calculated by propagating each cloud plane through the atmosphere (using interpolation of LOWTRAN data) and through the successive cloud planes up to the observer. Not only is an image of the sky generated, but the known 3-D scene makes it possible to introduce objects at any location in the scene. The model has been programmed and tested and the results show the validity of this approach.
A method for rapidly generating target signatures with limited user interaction and limited computing resources is presented. Simple parameterized generic models are used to represent classes of targets. Within the limits of the allowable parametric variation, as many steps as possible in determining the signature are precomputed. Examples are given of generic models of bridges and dams.
The U.S. Army Cold Regions Research and Engineering Laboratory (CRREL) is administering a multi-year, multi-agency infrared background data and modeling program entitled Smart Weapons Operability Enhancement (SWOE). This paper describes the progress made to date in model development and integration, under the direction of the Geophysics Directorate of the Air Force Phillips Laboratory. Other aspects of the program, not described here, include measurements and database development. A 1-D thermal model for natural backgrounds has been developed that covers a full range of background types. The range of types includes vegetated and nonvegetated surfaces, winter conditions, porous materials, and the presence of soil moisture. The model is also being used as a test bed for the development of a full 3-D model for natural backgrounds. The temperatures computed with this model are designed to be input to the radiometric models. Two models have been developed to compute the infrared radiances of natural background scenes, with both spectral and spatial capability. The first model computes radiances from terrain or water, while the second is used for modeling of discrete 3-D objects, such as trees, buildings, and vehicles. Both radiometric models include thermal emission and reflections of the sun and sky. Atmospheric transmission and radiance are included. The terrain and objects may have spatially varying temperatures and surface coatings. Emittances and reflectances are spectral and directional/bidirectional. Radiances computed with these models are translated to a Computer Image Generator (CIG) run by the U.S. Army Engineering Topographic Laboratory (ETL) for image rendering.
Real battlefields are very messy places which may contain burning vehicles, vegetation, or buildings and with atmospheres containing dust and smoke clouds. Realistic scene simulation for system evaluation must therefore also contain these same elements. A model has been developed to generate realistic images of fire plumes from burning military vehicles. Model output includes transmittance through the plume and radiance from the fire and hot effluents for a selected image matrix. The simulated image varies with time based on simulated wind fluctuations. The plume centerline fluctuates about a curve in the wind-aligned plane. The effluent gas and aerosol concentrations and the plume temperature vary throughout the plume. The variation is a fractional change from a Gaussian distribution and is based on measurements of temperature and short-path transmission in fire plumes. The model can be used to generate both visible and infrared fire plume images. The images can be combined with background images by multiplying elements of the background image by the transmittance matrix elements and then adding the elements of the radiance matrix. The order in which these operations are performed is critical. The image matrix elements can be tailored to correspond to pixels from an imager by choosing appropriate values for the waveband, the distance from the imager to the fire, and the angular resolution per pixel. Spatial averaging is used to obtain realistic results when the fire is small compared with pixel size.
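The compositing rule stated above (multiply by transmittance first, then add plume radiance) is a one-liner per pixel; a minimal sketch:

```python
import numpy as np

def composite(background, transmittance, plume_radiance):
    """Combine a fire-plume image with a background image as described:
    attenuate the background by the plume transmittance element-by-
    element, then add the plume's own radiance.  Adding before
    multiplying would wrongly attenuate the plume emission itself."""
    background = np.asarray(background, dtype=float)
    return background * np.asarray(transmittance) + np.asarray(plume_radiance)
```

All three arrays must share the same image-matrix dimensions (waveband, range, and per-pixel angular resolution having been chosen consistently).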
Both the theoretical and the experimental problems of backgrounds are examined. The authors show why the current definitions of correlation length should be used with care, with attention paid to the intensity histogram of a scene. Different effects of the sub-pixel features in a measured scene on the clutter for imaging and scanning systems are also explained. The two-dimensional polarization of a scene is measured and found to compare favorably with the theoretical predictions. Finally, the authors show how to simulate backgrounds whose power spectrum is given, together with constraints on the image proper. This is achieved by iteratively transforming between the image plane and its Fourier conjugate, while imposing the appropriate constraints in both planes.
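The iterative dual-plane constraint scheme described in the last two sentences can be sketched in a Gerchberg-Saxton style: impose the prescribed spectrum magnitude in the Fourier plane and a caller-supplied constraint (here, for illustration, non-negativity) in the image plane:

```python
import numpy as np

def synthesize_background(target_psd, image_constraint, n_iter=50, seed=None):
    """Simulate a background whose power spectrum matches target_psd
    while satisfying an image-plane constraint, by iterating between
    the image plane and its Fourier conjugate and imposing the
    appropriate constraint in each plane."""
    rng = np.random.default_rng(seed)
    img = rng.standard_normal(target_psd.shape)
    mag = np.sqrt(target_psd)
    for _ in range(n_iter):
        spec = np.fft.fft2(img)
        phase = np.exp(1j * np.angle(spec))
        img = np.fft.ifft2(mag * phase).real  # impose spectral magnitude
        img = image_constraint(img)           # impose image-plane constraint
    return img
```

Example: `synthesize_background(psd, lambda z: np.clip(z, 0.0, None))` yields a non-negative radiance field whose spectrum approximates `psd`; convergence is approximate, as is typical of such alternating-projection schemes.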
The standard assumption for false-target treatment in several smart munitions performance and effectiveness models is that false alarms result from randomly spaced, fixed-position, discrete physical objects within the background. This premise is tested in a simulation study which identifies specific terrain features causing a hypothetical thermal infrared smart munitions sensor to false alarm. The sensor configuration and the target detection algorithms are input to the Waterways Experiment Station (WES) smart munitions sensor model, which is 'flown' over high resolution calibrated thermal imagery of several test sites for which there is ground truth. Target detection decisions in these target-free backgrounds are mapped into large-scale color aerial photographs taken simultaneously with the thermal imagery. False alarm-causing terrain features are identified from the aerial photographs and are characterized as a function of test site, time of day, and target acquisition algorithm used. Several important characteristics of thermal false alarms are formulated.
The extended radiosity method or zonal method is used to construct realistic synthetic images (in the visible and IR) containing radiatively participating media such as smoke, fog, and clouds. Computational methods are discussed as well as the rendering of various scenes using computer graphics methods. The extended radiosity equations and an efficient algorithm to compute the radiosities and radiances in a homogeneous participating medium over an inhomogeneous flat surface are presented.
Technology Service Corporation (TSC) improved the image generator portion of an IR scene simulator that is used for dynamic, real-time, hardware-in-the-loop testing of an IR sensor system. This paper describes the IR simulator prior to its improvement, the goals that were set for the image generator upgrade, and the resultant improved IR simulator, including its new features (e.g., antialiasing, smooth shading, and texture). The steps involved in computing an IR scene are also discussed, as are the software models used to generate the IR scene and an alternative test equipment configuration that was devised for the simulator.
A Thermal Picture Synthesis (TPS) system for displaying real-time IR scenes, for application in hardware-in-the-loop seeker head evaluation, is described. This device uses improved fabrication and addressing technologies. In operation, both the TPS and missile seeker head are mounted to a flight table and synthetically generated IR imagery is presented to the missile from the TPS. The missile 'flight' and the IR scene are controlled and updated in real time by computer, allowing closed-loop evaluation of seekers under laboratory conditions.
The sea surface relief can be generated by filtering white noise using a filter derived from the known power spectrum of the sea waves. The image of the sea can then be calculated using ray-tracing techniques. In order to take into account the dynamic aspect of the sea, we have developed simple phase filters which are applied on the sea surface spectrum. This allows the simulation of a dynamic sea. The time-dependent parameters that are taken into account are the dynamic sea surface (moving waves), the moving observer, and the moving camera.
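The spectral-filtering and phase-filter scheme described above can be sketched directly: shape white noise with the wave-spectrum filter, then rotate each Fourier component by exp(iωt) to advance the waves in time. The dispersion relation ω(k) used in the test is a deep-water assumption, not taken from the paper:

```python
import numpy as np

def sea_surface(n, spectrum_filter, omega, t, seed=0):
    """Sea-surface relief from filtered white noise, animated by a
    simple phase filter.  spectrum_filter and omega are n x n arrays
    over the 2-D wavenumber grid; the same seed at different times t
    yields consistent, moving waves."""
    rng = np.random.default_rng(seed)
    noise = np.fft.fft2(rng.standard_normal((n, n)))
    spec = noise * spectrum_filter        # impose the sea-wave spectrum
    spec = spec * np.exp(1j * omega * t)  # phase filter: moving waves
    return np.fft.ifft2(spec).real
```

Ray-tracing through successive reliefs generated at increasing t (optionally with a moving observer and camera) produces the dynamic image sequence.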
Commercial frame grabber technology in IBM PC compatible computers has been adapted to allow direct digital input of infrared search and track data at rates of up to 10 Mbytes per second, permitting real-time processing and display of false color thermal images. Examples of single frame displays and background suppression by frame subtraction are shown. On-board processed Fourier spectra and Fourier power spectra of selected frame lines are shown. Curve-fitted representations are compared for clear air, cloud and land clutter, and a commercial aircraft at close range.
Forced heat convection in a wind field is considered, and the infrared radiance statistics of natural terrain are analyzed by a heat balance equation. A first-order approximation is used for the terrain radiance term in the equation. The results show that forced heat convection can change the statistical distribution of the radiance in a thermal image of natural terrain. With increasing wind speed, the most probable and average temperatures tend toward the steady air temperature over the terrain, and the thermal image contrast decreases. The saturation phenomena of the forced convection effect are also discussed.
Smart electro-optical systems of the future will need to be adaptive and robust to function in different environments. In 1989 the authors reported how atmospheric losses in contrast, resolution, edge detail, and signal to noise adversely affect image-based classification using linear matched filters and how the atmosphere alters features such as gray-level moments. They also showed that the performance changes with atmospheric path radiance and transmittance are predictable, however, and that some effects can be mitigated automatically by including the atmosphere as a separate training class. This paper extends that analysis to atmospheric effects on pattern recognition by neural network classifiers. The neural net pattern recognition methods considered here are single- and multi-layer perceptron networks trained with back-propagation. Image classifier performance under different atmospheric propagation conditions is shown to be easily predicted for simple single-layer neural nets. This leads to a specific training strategy to minimize the impact of propagation losses by including the atmosphere as a separate training class. This same strategy also improves the performance of multi-layer neural networks. Examples are given of classification of a vehicle partly obscured by highly scattering white smoke and highly absorptive black smoke. Other methods are being investigated that affect the performance and training convergence properties of neural net pattern recognition in atmospheres.
Synthetic scene generation models depend on the repeated evaluation or iteration of often computationally expensive functions to create 'texture' or structure in a modeled natural scene. The computational burden of 'texture' function evaluation is dependent upon the spatial frequency response characteristics of the optical system through which the modeled scene is to be 'propagated.' All real 'cameras' utilize detector elements of finite size to transduce the resulting M×N pixels of the image. Two general situations arise in modeling the effects of cameras (sensors) on the imaged scene. The first is the 'oversampled' case, where the Nyquist frequency of the detector spacing exceeds the cutoff of the optics transfer (aperture) function. The 'undersampled' case is when such Nyquist frequency is below the aperture cutoff spatial frequency, which results in aliasing. The application of brute force in the oversampled case results in M×N scene 'texture' function evaluations, while for the undersampled case, (mM)×(nN) (m, n > 1) points are required. Several computationally efficient methods significantly reduce the stated brute force computational burden for these two cases. This paper discusses three such methods. For the oversampled case, a correct and efficient method is 2-D sample 'interpolation,' in the multirate digital signal processing sense. This method expressly avoids signal aliasing caused by simpler but inappropriate bilinear interpolation of the sparser set of (mM)×(nN) (m, n fractions < 1) scene samples onto the M×N imaged scene. The second and third techniques discussed are applicable to the undersampled case. Each relies upon M×N scene 'texture' function evaluations. The second technique extrapolates the frequency spectrum of the M×N grid with a synthetic spectrum beyond Nyquist which follows the 1/f^β decrease in power typical of natural (fractal) textures.
The third technique, 'fractal interpolation,' operates in the spatial domain where spatial detail, generated from the computed ('texture' function) MXN grid onto a larger (nM)X(nN) grid, is synthesized at the same local fractal dimension as that of the MXN 'undersampled' data. In both cases, the synthesized frequencies above the camera Nyquist are 'folded back' in the spectral domain to approximate the aliasing of spatial frequencies into the transduced image.
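The 1/f^beta power-law spectrum that the second technique assumes for natural textures is easy to realize directly. This is a minimal NumPy sketch, not the paper's extrapolation method: it synthesizes a random texture with the stated spectral fall-off by shaping white-noise phases with a power-law amplitude, with grid size and beta chosen arbitrarily for illustration.

```python
import numpy as np

def powerlaw_texture(n, beta, seed=0):
    """Synthesize an n x n random texture whose power spectrum falls
    as 1/f**beta, the fractal-like behavior assumed for natural scenes."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)
    fy = np.fft.fftfreq(n)
    f = np.hypot(*np.meshgrid(fx, fy, indexing="ij"))
    f[0, 0] = f[0, 1]                    # avoid division by zero at DC
    amplitude = f ** (-beta / 2.0)       # power ~ 1/f^beta => amplitude ~ f^(-beta/2)
    phase = rng.uniform(0.0, 2.0 * np.pi, (n, n))
    tex = np.real(np.fft.ifft2(amplitude * np.exp(1j * phase)))
    return (tex - tex.mean()) / tex.std()  # zero mean, unit variance

img = powerlaw_texture(64, beta=2.0)
```

Larger beta concentrates power at low spatial frequencies, giving smoother, more cloud-like structure; beta near zero approaches white noise.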
This paper provides an overview of the critical IR projector requirements and specifications for hardware-in-the-loop (HWIL) simulations of imaging IR systems and reviews the most prominent technologies associated with IR scene projection. Each method is briefly discussed in terms of physical operation, with all relevant advantages and disadvantages highlighted. A comparison of the different categories of projection devices for applicability to HWIL simulation is provided.
A technique is presented for measuring atmospheric effects on image metrics by comparing infrared images simultaneously collected from positions close to and far away from a target. An image acquired close to the target provides a measure of the radiance inherent to the target and its background, while an image acquired far from the target provides a measure of the radiance after propagation through the atmosphere. Therefore, changes in radiance can be separated into those due to the change in the inherent radiance and those due to the propagation of the inherent radiance through the atmosphere. A 'complexity metric' is used to quantify the effects of environmental and atmospheric changes on target-to-background contrast. Examples of the effects of cloud cover, wind speed, dust clouds and optical turbulence on the complexity metric are presented.
In the last year, several moderately priced real-time image generators offering textured rendering have appeared on the market. These image generators provide an opportunity for more realistic IR image simulation by significantly increasing the level of detail of the simulated image with no degradation in real-time performance. As part of the development of a new imaging IR simulator, the authors have incorporated the texturing feature of their image generator with their thermal model to generate thermally accurate textured IR backgrounds. The textured thermal model uses a two-dimensional Markov process to generate a background temperature fluctuation map; this model has been proposed by several studies of thermal IR terrain measurements. The textured surfaces are processed through the radiance and sensor models to generate the simulated images.
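A two-dimensional Markov process of the kind used for the background temperature fluctuation map can be realized as a separable first-order autoregressive field. This is one common construction, sketched here under assumed parameters (correlation coefficient, noise level, and grid size are hypothetical), not the authors' calibrated thermal model.

```python
import numpy as np

def markov_temperature_map(rows, cols, rho=0.9, sigma=1.0, seed=0):
    """Separable first-order 2-D Markov (AR(1)) field: each pixel depends on
    its upper, left, and upper-left neighbors plus white driving noise."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, sigma, (rows, cols))
    t = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            up   = t[i - 1, j]     if i > 0 else 0.0
            left = t[i, j - 1]     if j > 0 else 0.0
            diag = t[i - 1, j - 1] if i > 0 and j > 0 else 0.0
            # Separable recursion; rho controls spatial correlation length.
            t[i, j] = rho * up + rho * left - rho * rho * diag + w[i, j]
    return t

tmap = markov_temperature_map(32, 32, rho=0.9)
```

The fluctuation map would then be added to a mean background temperature and passed through radiance and sensor models, as the abstract describes.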
Analysis and simulation of smart munitions requires imagery for the munition's sensor to view. The imagery is usually infrared and depicts a target embedded in a background. Mathematical models of such imagery are useful to munitions researchers. A mathematical model can synthesize a test scenario at a cost much less than that of actual data acquisition. To date, most research has focused on the modeling of targets. It is essential, however, to test a munition's target acquisition algorithms on images containing targets superimposed on a wide variety of backgrounds. Consequently, there is a need for accurate models of infrared backgrounds. Useful models are difficult to create because of the complexity and diversity of imagery viewed by smart munition sensors. A model of IR backgrounds is presented that will, given a textured image, generate another image. The synthetic image, although distinctly different from the original, has the general visual characteristics and the first- and second-order statistics of the original image. In effect, the synthetic image looks like a continuation of the original scene, as if another picture of the scene were taken adjacent to the original. The model is an FIR kernel convolved with an excitation function, noise added to the result, and followed by histogram modification. The paper describes a procedure for deriving the correct FIR kernel using a signal enhancement algorithm, and reports a demonstration of the model in which it is used to mimic several diverse textured images.
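The synthesis pipeline described, FIR kernel convolved with an excitation, noise added, then histogram modification, can be sketched as follows. The kernel derivation via the paper's signal enhancement algorithm is omitted; here the kernel is taken as given, and histogram modification is implemented as simple rank-based matching to a reference image, which is one standard choice rather than necessarily the authors'.

```python
import numpy as np

def synthesize_background(kernel, excitation, noise_sigma, reference, seed=0):
    """Sketch of the abstract's model: FFT-based (circular) convolution of an
    excitation with an FIR kernel, additive noise, then histogram matching."""
    rng = np.random.default_rng(seed)
    # Convolve excitation with the FIR kernel (wrap-around boundary).
    out = np.real(np.fft.ifft2(np.fft.fft2(excitation) *
                               np.fft.fft2(kernel, excitation.shape)))
    out += rng.normal(0.0, noise_sigma, out.shape)
    # Histogram modification: map the rank of each output pixel onto the
    # sorted gray levels of the reference image.
    ranks = np.argsort(np.argsort(out.ravel()))
    return np.sort(reference.ravel())[ranks].reshape(out.shape)
```

After the rank mapping, the output's gray-level histogram matches the reference exactly, so first-order statistics are preserved while the convolution shapes the second-order (correlation) structure.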
The Battlefield Environment Weapon System Simulation (BEWSS) includes battlefield obscurant transport and diffusion models and adverse weather models which can predict transmission over a specified range in the presence of specified meteorological (MET) conditions, obscurant conditions, and scenario geometry. These models are used to generate transmission maps applicable for a particular sensor boresight, resolution, field-of-view (FOV), and spectral bandwidth. In addition, a three-dimensional IR target/background scene is modeled, and perspective views of the target are generated from any viewing geometry. By combining the BEWSS obscurant models with the target/background model, realistic synthetic imagery of an obscured tactical target is generated for use in analyzing night sights and IR seeker/trackers. The unique feature of these images is that they are correlatable to a particular set of assumptions about the target/background temperature, the MET conditions, the obscurant conditions and the IR sensor characteristics. The imagery is displayed on an IRIS 4D-GT Silicon Graphics workstation, and the high-resolution thermal maps are stored for future analysis. The imagery has been animated and stored in video format for use in evaluating seeker/tracker terminal homing performance. This paper presents the BEWSS models employed in developing the computational algorithm, and displays examples of the generated imagery.
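The transmission maps described are, at their core, cumulative extinction integrals along each sight line. The BEWSS obscurant models are far more elaborate, but the underlying relation is Beer-Lambert attenuation, sketched here for a single sampled sight line (the extinction values and sample spacing are arbitrary illustrations).

```python
import numpy as np

def transmittance_map(extinction, dl):
    """Beer-Lambert transmittance T(s) = exp(-integral of extinction ds),
    accumulated along the sight line (axis 0) at sample spacing dl."""
    return np.exp(-np.cumsum(extinction, axis=0) * dl)

# Example: uniform extinction of 0.1 per meter over a 10 m path.
T = transmittance_map(np.full(10, 0.1), dl=1.0)
```

In a scene model, a per-pixel version of this quantity (one sight line per pixel, for the sensor's spectral band) becomes the transmission map that multiplies the target/background radiance image.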
A current effort at the U.S. Army Missile Command (MICOM) is the development of an infrared (IR) scene projector based on spatial light modulator (SLM) technology. The three-phase effort is centered around the Texas Instruments deformable mirror device (DMD), which is a two-dimensional array of mirrors with independently controlled deflections.
The infrared exitance of steel plates with several emissivities is modeled using PRISM 3.0 and LOWTRAN7 under sky backgrounds representative of Middle East desert conditions in the summer. LOWTRAN7 is used to calculate the downward thermal radiance of a desert haze atmosphere with multiple scattering. PRISM 3.0 incorporates the results from LOWTRAN7 into annular rings that represent the temperature gradient of the sky dome and predicts the apparent temperature of the plates in the 8 to 14 micron band. This study is part of a preliminary look at the issue of passive low observable technology for application to ground vehicles and an illustration of state-of-the-art computer-based background modeling and thermal simulation.
An investigation has been undertaken which utilizes nonintrusive optical interferometric techniques to visualize the turbulent structure found in a high-velocity flow field and thereby characterize the resulting optical distortion. Experiments were conducted on a 7.68 mm by 7.68 mm cross section of a high-velocity, dual-gas, mixing/shear layer, and the preliminary results are presented. The experimental apparatus consisted of a dual-beam Mach-Zehnder interferometer with a customized high-speed CCD camera data acquisition system. A series of time-varying images of the gas flow was captured and digitized with the interferometer configured in both a finite and an infinite fringe mode. By correlating the initial tare run wavefront (gas-off condition) to any subsequent distorted wavefront (gas-on condition), the turbulent flow field structure and the relative phase shift across the test region were analyzed. Both classical and nonclassical approaches were taken in analyzing the interferometric data to obtain an understanding of the high-velocity flow field. In addition, the experimental results were compared to theoretical predictions for RMS wavefront distortion.
A unique experimental apparatus has been designed and constructed to characterize aero-optical distortions related to the turbulent flow conditions experienced by a windowed hypersonic vehicle. Using this apparatus, a series of imaging tests was conducted with a classical mixing/shear layer traveling at approximately 600 m/s. The experimental setup consisted of a collimated 0.84-micrometer laser diode point source that was passed through the flow field and imaged onto a CCD array. During a one-second stable flow period, 925 image frames were collected. Several runs were made with the laser diode operating in both continuous and pulsed (40-microsecond duration) modes. These images were used to investigate several effects such as image blur, jitter, and Strehl loss. For long integration periods, the image experienced an average blur circle size increase of approximately 24 times from the "wind-off" case. The pulsed runs showed an increase in jitter of approximately 36.4 microradians. In addition, during continuous runs, a Strehl ratio of approximately 0.0026 was observed. These and other preliminary results correlated well with theoretical predictions.
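The Strehl ratio reported above can be related to RMS wavefront error through the Maréchal approximation, S ≈ exp(-(2πσ/λ)²). Note the caveat: the approximation is strictly valid only for small aberrations, so at a Strehl ratio of ~0.0026 it gives at most an order-of-magnitude estimate; the sketch below is illustrative, not the authors' analysis.

```python
import math

def marechal_strehl(sigma_wavefront, wavelength):
    """Marechal approximation: Strehl ratio S = exp(-(2*pi*sigma/lambda)**2),
    where sigma is the RMS wavefront error (same units as wavelength)."""
    phi = 2.0 * math.pi * sigma_wavefront / wavelength
    return math.exp(-phi * phi)

# Inverting the relation: RMS wavefront error (in waves) implied by the
# measured Strehl ratio of ~0.0026, keeping the small-aberration caveat in mind.
sigma_over_lambda = math.sqrt(-math.log(0.0026)) / (2.0 * math.pi)
```

This inversion gives roughly 0.39 waves of RMS wavefront error, well outside the Maréchal regime, which is consistent with the severe blur and jitter growth the measurements report.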
This paper documents mid-infrared high spatial resolution terrain background measurements taken at the White Sands Missile Range, New Mexico. The results include the mean, minimum, and maximum apparent radiance signatures as a function of the time of day from three different locations. Source radiance signatures that have been corrected for path radiance and atmospheric attenuation, and contrast radiance within the analyzed fields of view, are also presented.