Empirical data are presented from a larger-scale, narrow field-of-regard visual search experiment. The target scene characteristics of images taken during the 1995 DISSTAF field observer trials were digitally manipulated to mimic field conditions such as haze and darkness, and vehicle treatments such as camouflage nets and target contrast suppression. The experimental factors induced by the image manipulations all had a significant effect on search performance (hit rate), and the data are consistent with psychophysical accounts of visual search. The data were also consistent with earlier data from a cued detection task using the same stimulus material: the hit rate in the cued detection experiment explained 85% of the variance in the hit rate from the search experiment. A computational model of human visual discrimination, built to explain cued detection performance, was shown to fit the search data better than would have been expected based on the product of independent errors.
The purpose of this work is to provide a model for the average time to detection for observers searching for targets in photo-realistic images of cluttered scenes. The current work extends previous results on modeling time to detection that used a simple decaying fixation memory. While those results were encouraging in showing a strong effect of fixation memory, there were also discrepancies, the main one being a tendency toward immediate refixation, which was not accounted for at all by the original model. The present paper describes how the original fixation memory model is extended using a shunting neural network. Shunting neural networks are neurally plausible mechanisms for modeling various brain functions. Furthermore, this shunting neural network can be extended in a simple manner to incorporate effects of spatial relationships, which were completely ignored in the original model. The model described is testable on experimental data and is being calibrated using both analytical and experimental methods.
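The shunting dynamics the abstract refers to can be illustrated with a minimal single-unit sketch. The parameter values, the Euler integration, and the pulse input below are arbitrary assumptions for illustration, not the calibrated model from the paper:

```python
import numpy as np

def shunting_trace(excitation, steps=200, dt=0.01, A=1.0, B=1.0):
    """Euler-integrate a single shunting (Grossberg-type) unit,
    dx/dt = -A*x + (B - x)*E(t): activity decays at rate A and
    saturates below B, giving a bounded, decaying trace that can
    serve as a 'fixation memory' for a scene location."""
    x, trace = 0.0, []
    for t in range(steps):
        E = excitation(t)
        x += dt * (-A * x + (B - x) * E)
        trace.append(x)
    return np.array(trace)

# A brief fixation (an excitation pulse) followed by free decay:
trace = shunting_trace(lambda t: 5.0 if t < 50 else 0.0)
```

The shunting form keeps the memory trace bounded regardless of input strength, which a plain decaying memory does not guarantee.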
Detecting targets with an infrared (IR) sensor is a common field of investigation. The main difficulty is that the target appears in front of a background that causes false alarms. Increasing the detection threshold decreases the false alarms but also decreases the probability of detection. The relation between the background's spatial correlation distance and the target's displacement between sequential samplings is a priori information that may be used to improve detection, and it can be estimated from the scene. In this paper we derive a model relating the movement of the target to the statistics of the background, so that a lower probability of false alarm may be obtained for the same probability of detection or, conversely, a higher probability of detection for the same probability of false alarm. The improvement arises because, instead of placing a global threshold chosen according to the total spatial and temporal variance of the background, one may use a threshold that is adapted to the relation between the spatial statistics of the background and the target's motion characteristics. The paper presents a complete mathematical derivation of the model as well as computer simulations that clearly demonstrate the hypothesis of the paper.
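The underlying intuition, that a target moving farther than the background's correlation distance between samplings can be separated from clutter by differencing rather than by a global threshold, can be sketched in one dimension. The box-smoothed clutter model, amplitudes, and pixel indices below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_clutter(n, corr_len, rng):
    """Spatially correlated background: white noise smoothed by a
    box kernel of width corr_len (a stand-in clutter model)."""
    w = rng.standard_normal(n + corr_len)
    k = np.ones(corr_len) / np.sqrt(corr_len)
    return np.convolve(w, k, mode="valid")[:n]

n, corr_len = 512, 16
frame1 = correlated_clutter(n, corr_len, rng)
frame2 = np.roll(frame1, 1)   # background decorrelates slowly (1-pixel drift)
frame2[300] += 4.0            # target has moved into pixel 300

# Differencing the samplings cancels most of the correlated background,
# so a much lower threshold suffices for the same false-alarm rate.
diff = frame2 - frame1
detected_pixel = int(np.argmax(np.abs(diff)))
```

The residual clutter in `diff` has a much smaller variance than the background itself, which is the mechanism behind the adapted threshold.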
In our work, we focus on the atmospheric effects on the acquisition of a target and on image restoration by filtering out those effects: blur caused by absorption and scattering from aerosols, and distortion caused by turbulence, which changes the wavefront angle, moves the image on the image plane, and blurs it. The restoration method, an atmospheric Wiener filter, is intended to correct the atmospheric effects. This filter is based on the fact that the modulation transfer function (MTF) of the turbulence is composed of a mean value and a random component, while the aerosol MTF changes slowly. The project goal is the realization of the atmospheric Wiener filter and finding its limitations (in terms of blur-to-noise ratio) through psychophysical experiments and statistical analysis of the results. We found that the atmospheric Wiener filter can improve target acquisition probability at low noise levels. As the noise increases the improvement becomes more limited, because the restoration amplifies the noise in the image.
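The frequency-domain form of such a restoration filter can be sketched as follows. The Gaussian OTF and the noise-to-signal ratio below are placeholders; the paper's filter is built from the atmospheric turbulence and aerosol MTFs:

```python
import numpy as np

def wiener_restore(blurred, otf, nsr):
    """Wiener deconvolution in the frequency domain:
    F_hat = G * conj(H) / (|H|^2 + NSR), where NSR is the
    noise-to-signal power ratio that bounds noise amplification."""
    G = np.fft.fft2(blurred)
    W = np.conj(otf) / (np.abs(otf) ** 2 + nsr)
    return np.real(np.fft.ifft2(G * W))

# Toy demonstration: blur a point source with a Gaussian OTF, then restore.
n = 64
y, x = np.mgrid[:n, :n]
psf = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
otf = np.fft.fft2(np.fft.ifftshift(psf))

img = np.zeros((n, n))
img[32, 32] = 1.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * otf))
restored = wiener_restore(blurred, otf, nsr=1e-3)
```

As the noise grows, the NSR term must be raised, which caps the restoration gain; this is consistent with the reported loss of benefit at high noise levels.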
It is well known that background characteristics have an impact on target signature characteristics. There are many types of backgrounds that are relevant for military application purposes, e.g. wood, grass, urban, or water areas. Current algorithms for automatic target detection and recognition (ATR) usually do not distinguish between these types of background; at most they have some sort of adaptive behavior. An important first step for our approaches is the automatic geo-coding of the images. An accurate geo-reference is necessary for using a GIS to define Regions of Expectation (ROEs, i.e. image background regions with geographical semantics and known signature characteristics) in the image and for fusing the (multiple) sensor data. These ROEs could be road surfaces, forest areas or forest edge areas, water areas, and others. The knowledge of the background characteristics allows the development of a method base of dedicated algorithms. According to the sensor and the defined ROEs, the most suitable algorithms can be selected from the method base and applied during operation. The detection and recognition results of the various algorithms can be fused because the sensor data are registered.
Target detection can be carried out with a statistical matched filter. The construction of the matched filter needs information on the background clutter statistics as well as on the shape of the target. For computational simplicity, a filter bank consisting of pre-designed matched filters can be used for adaptive filtering. Composing such a filter bank requires prior classification of the background clutter and pre-collected samples of each clutter type. In land-based IRST, there are too many different types of background clutter to hold a filter bank tuned to all of them. To overcome this difficulty, we propose a new classification method that uses GIS (Geographic Information System)-assisted background registration. We discern different clutter regions in the initial image frame using a feature vector composed of the vertical and horizontal autocorrelation, and build filters tuned to each class. In the successive frames, we classify each region of different clutter from a contour image obtained by projecting the GIS data and registering it to the previous image. Each classified region of the image is then filtered using the matched filter pre-designed for that class in the previous image frame; we only have to construct a filter for newly appearing regions. The proposed algorithm has been tested with synthetic image frames, and we observe that our method has the advantages of reducing computational load and false detections at edges.
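A minimal one-dimensional sketch of the two ingredients, an autocorrelation feature for assigning a region to a clutter class and a clutter-whitened matched filter tuned to that class, is given below. The AR(1) clutter model, the target shape, and all parameter values are illustrative assumptions, not the paper's design:

```python
import numpy as np

def lag1_autocorr(x):
    """Feature for clutter classification: normalized lag-1 autocorrelation."""
    x = x - x.mean()
    return float((x[:-1] * x[1:]).sum() / (x * x).sum())

def matched_filter_stat(x, s, C):
    """Clutter-whitened matched filter statistic
    t = s^T C^{-1} x / sqrt(s^T C^{-1} s); unit variance under clutter alone."""
    Ci_s = np.linalg.solve(C, s)
    return float(Ci_s @ x / np.sqrt(s @ Ci_s))

rng = np.random.default_rng(0)
n, rho = 64, 0.9
# AR(1) clutter covariance, the 'pre-designed' model for this clutter class:
C = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
clutter = np.linalg.cholesky(C) @ rng.standard_normal(n)

s = np.zeros(n)
s[30:34] = 1.0                 # simple extended-target shape

t_clutter = matched_filter_stat(clutter, s, C)
t_target = matched_filter_stat(clutter + s, s, C)
```

Because the statistic is normalized by the clutter covariance, a single threshold on `t` gives a comparable false-alarm rate across clutter classes once each region is matched to its filter.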
A robust computer-based camouflage assessment approach was presented at the AeroSense 2000 conference. Based on experiments with human observers, a separability measure was developed. The method was classifier based, and the best results were obtained using the C4.5 classifier as a separability measure. This method makes camouflage assessment transparent and deterministic, presuming correctly specified regions of interest. This paper describes our effort to overcome the drawback of needing user input at such a critical step within the method. We used unsupervised learning along with an optimizing method to derive information about the number of clusters and other performance measurements. All the measurements coming from the optimization step were then applied to camouflage assessment.
Developments in the area of signature suppression make it progressively more difficult to recognize targets. The emphasis has been on the reduction of distinct features, like hot spots in the infrared band. Thus, in order to obtain a low false alarm rate, threat sensors have to utilize more information, i.e. spatial and spectral properties. The purpose of our work is to develop a general tool for camouflage assessment. The approach proposed in this paper is to apply texture descriptors to quantify the similarity between different parts of an image. In addition, other descriptors are used to distinguish man-made object characteristics. The selection of an appropriate set of features is discussed. The assumption is that an area containing observable targets has different statistics than other areas. Statistical properties along with detected target specific features have to be combined with methods used in data fusion. An experiment with a data set of 44 reference images has been carried out, using a recently developed computer program called Terrtex. High correlation with perception experiments was achieved using only one or two texture features. The importance of a careful selection of background area size is finally discussed.
The Virtual Targets Center (VTC) is a strategic alliance between the Targets Management Office within the US Army Simulation, Training, and Instrumentation Command (STRICOM) and the Systems Simulation and Development Directorate within the US Army Aviation and Missile Command (AMCOM). This center reduces duplication of effort by making DoD owned geometry models available for reutilization and supports the modeling and simulation community by redistributing or creating geometry models in formats applicable to a wide range of simulation activities. In addition to these activities, the VTC is developing methods and tools to enhance existing target models. A new software simulation exists at the VTC to automatically create facet models of camouflage netting by considering the netting as a 2D membrane that balances internal tensional stresses and the external force of gravity by assuming a minimum energy configuration - accurately replicating the draping of real netting. The geometric information of this virtual camouflage netting is exported to a file in a format commonly used for three-dimensional modeling, thereby making it available to workers in signature prediction and visualization.
This paper describes the initial phase of an evaluation study on the performance of PMO, the Paint Map Optimizer, for long wave infrared (LWIR) modeling. In this phase, we evaluate using PRISM, the Physically Reasonable Infrared Signature Modeler, to predict the thermal signature of a simplified tank geometry, and then PMO to predict the optimal thermal camouflage pattern from a range of emissivities in a given scenario. PRISM is a thermal modeling code that has been used extensively to model thermal signatures of military ground vehicles. PMO was developed by Aerodyne Research to provide a computer-aided design tool for camouflage pattern design and optimization in a given scenario and a given band for the US Army Aviation Applied Technology Directorate, AATD. At the end of this phase, we hope to determine the basic effectiveness of the process and identify areas of improvement if necessary. The geometry was modeled in PRISM, which output the thermal signature for input into PMO. The optimizer was used to predict the thermal camouflage pattern in the 8-12 micrometer IR band for a range of emissivities, with the geometry in three different locations in the background image.
The potential application of multispectral polarization imaging for detection and recognition requires a good knowledge of the depolarizing behavior of targets. We measured the degree of polarization associated with several targets in a monostatic configuration as a function of wavelength and of the angle of incidence. Depolarization effects depending on the absorption of the targets were observed, and a phenomenological model based on Kubelka-Munk theory is proposed. It describes the behavior of paints and diffuse materials, taking into account the contributions of both surface scattering and volume scattering. Target parameters such as roughness, refractive index, and scattering coefficient are taken into account and enable a predictive model of the depolarizing behavior of targets to be drawn up. We found good agreement between measurements and our predictions.
All military objects must have basic camouflage, which is usually achieved by painting. Patterned camouflage painting hides the object and blends its shape and characteristic features in with its surroundings. Basic camouflage can be complemented by temporary camouflage such as removable camouflage paints. These paints can be used in seasons and environments where the basic pattern is not appropriate. A research project was begun at the Defence Forces Technical Research Centre (DFTRC) in 1994 in order to formulate an environmentally friendly, removable camouflage paint for military use. The paints should be easily removable when applied to previously painted military equipment; however, they should also be resistant to drizzle. The paints should have optical properties similar to those of their surroundings, and the surface of the coatings should be matt to avoid any conspicuous reflection. Finally, it should be possible to apply removable camouflage paints in the field using any painting method. During the project, environmentally friendly and non-toxic removable paints were successfully formulated. The colors of the removable paints are compromises for average operating environments. The project included numerous laboratory tests in addition to natural and accelerated weathering tests, and several field tests have been carried out. According to the tests, the removable paints are highly resistant to drizzle and sufficiently resistant to abrasion, and they can be washed off with water.
The image metrics used in this work were designed to imitate some of the most important human perceptual cues, such as global image complexity (clutter), local target-to-background distinctness (contrast), and texture. In this paper, we examine different possible clutter metrics and their application to infrared and visible images.
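One widely used candidate clutter metric is the statistical-variance (root-sum-of-squares) metric of Schmieder and Weathersby, sketched below as an example of the class of metrics being compared; the block size (conventionally about twice the target dimension) is the key free parameter, and the test images are synthetic:

```python
import numpy as np

def rss_clutter(image, block):
    """Statistical-variance clutter metric: tile the image into
    block x block cells and root-mean-square the cell variances."""
    h, w = image.shape
    cell_vars = [
        image[i:i + block, j:j + block].var()
        for i in range(0, h - block + 1, block)
        for j in range(0, w - block + 1, block)
    ]
    return float(np.sqrt(np.mean(cell_vars)))

rng = np.random.default_rng(0)
flat = np.full((64, 64), 10.0)                    # uniform scene: zero clutter
busy = 10.0 + 3.0 * rng.standard_normal((64, 64)) # high-variance scene
```

Metrics of this family feed directly into signal-to-clutter ratios, where the target-to-background contrast is divided by the clutter value.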
Measurements of IR background variation or clutter are important for determining target detectability. Image sequences of widely varied ground-clutter types were recorded with the Airborne Infrared Imager (AIRI), housed in a wing pod of the Airborne Seeker Test Bed (ASTB) aircraft. Target detection statistics were derived for various backgrounds (ocean, ocean glints, desert, forest, shoreline, and urban). MWIR and LWIR images were processed to determine the minimum point-target contrast temperature detectable in various clutter types. This clutter metric was found to be relatively insensitive to changes in wavelength, season, or spatial scale, but to vary strongly with clutter type. These statistics are used to predict clutter-limited detection ranges for generalized targets in appropriate scenarios. Reduction in detection range from most benign clutter (ocean) to most severe clutter (urban) was found to be 7-9 dB, depending on waveband.
Much work has been performed on defining metrics for evaluating clutter in infrared imagery. The metrics computed on specific images with targets present have been fairly successful at predicting target acquisition performance for human observers. To be most useful, it would be advisable to have generalized metrics that do not depend on having the infrared images available or on knowing the exact position of the target. Ideally, one would like to designate the geographical area and the type of target and still be able to estimate the effect of the clutter. In this work, we consider such a possibility. Based on satellite imagery, we calculate the physical characteristics, e.g. the plant coverage, for certain geographical areas. We then relate these characteristics to the infrared metrics that have been calculated for certain selected spots. From these results, we can generalize the relationship between the satellite data and the infrared clutter and determine the clutter content for areas for which full infrared imagery does not exist. Examples are drawn from several areas in Israel with contrasting climate and foliage.
The VV-polarized W-band backscattering behavior of homogeneous ground clutter has been investigated by measuring the radar cross section per unit area of 1/16th-scale rough surface terrain in a 1.56 THz compact radar range. An array of scale model ground planes was fabricated with the appropriate roughness to model smooth to rough soil terrain. In addition to studying the backscattering behavior as a function of surface roughness, the dependence on soil moisture content was also characterized by tailoring the dielectric constant of the scale models. Radar imagery of the rough surfaces was acquired by collecting single-frequency backscatter data over a solid angle in both azimuth and elevation. The data were Fourier transformed in both the azimuth and elevation directions to produce two-dimensional imagery. The backscattering coefficient per unit illuminated area (σ0) was calculated as a function of elevation angle between 5 and 85 degrees. The results of this work have been used in the fabrication of scale model ground planes for collection of W-band radar imagery from scaled threat targets in realistic environments. Backscattering data, including clutter statistics, are compared to W-band clutter data found in the literature.
Natural bodies of water have several advantages as IR calibration targets in remote sensing. Among these are availability, homogeneity, and accurate knowledge of emissivity. A portable, low-cost, floating apparatus is described for calibration of remote IR sensors to within 0.15 °C. The apparatus measures the surface and bulk water temperature as well as the wind speed, direction, temperature, and relative humidity. The apparatus collects data automatically and can be deployed for up to 24 hours. The sources of uncertainty, including the effects of skin temperature and waves, are discussed. Data from several field campaigns to calibrate IR bands of DOE's Multi-Spectral Thermal Imager are described, along with estimates of error.
Scene simulation has proved to be a valuable tool for analysing the images perceived by visible and infrared imaging systems. Accurate scene simulation requires accurate incorporation of the optical properties of all the materials within a scene, with reflectance incorporated through the bidirectional reflectance distribution function (BRDF) and emission incorporated through the directional emissivity or hemispherical directional reflectance (HDR). This paper compares the fit of various parameterised models to experimental BRDF data from a variety of surfaces representing the extremes of material properties found in the environment. One of the main aims is to assess the accuracy and validity of an in-house BRDF model called Mopaf using data representative of different sorts of isotropically reflecting materials. Where appropriate, physical and semiempirical models and a novel parameter-based BRDF model were compared with Mopaf and with BRDF data from a Surface Optics Corporation SOC-200 instrument. It was concluded that Mopaf might not be reliable for all the angular BRDF data, especially for specularly reflecting surfaces or grazing-incidence data. Likewise, the other BRDF models investigated tended to be limited to a range of physical conditions, such as only diffuse reflection, or to a range of surface roughness. It was shown that the proposed new BRDF model was more generally applicable from visible to infrared wavelengths, over a wide range of reflection angles, and for different sorts of surface material.
The infrared (IR) radiation emitted or reflected in an off-normal direction from a smooth surface is partially polarized. This principle can be used for enhanced discrimination of targets from backgrounds in a marine environment. It has been shown that (man-made) targets do not demonstrate a pronounced polarization effect when observed from a near-normal direction, whereas the sea background radiation has a certain degree of polarization along a slant observation path. A measurement setup has been constructed for collecting polarized IR imagery. This setup contains a polarization filter that rotates synchronously with the frame sync of the camera. Either a long wave IR (LWIR) or a mid wave IR (MWIR) camera can be mounted behind the rotating polarization filter. The synchronization allows a sequence of images to be taken with a predefined constant angle of rotation between the images. From this image sequence, three independent Stokes images are constructed, containing the normal intensity part, the vertical/horizontal polarization, and the diagonal polarization. Up to 20 fully linearly polarized images can be acquired per second. Measurements were taken at the North Sea coast with this setup. The recorded images are analyzed to determine the influence of polarization on the detection of small targets in such an environment. Furthermore, differences between polarization contrasts in MWIR are analyzed.
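The construction of Stokes images from a rotating-polarizer sequence can be sketched as a per-pixel least-squares fit of the intensity model I(θ) = ½(S0 + S1·cos 2θ + S2·sin 2θ). The four angles and the synthetic test values below are illustrative choices, not necessarily the setup's actual rotation step:

```python
import numpy as np

def stokes_from_sequence(frames, angles_deg):
    """Per-pixel least-squares fit of
    I(theta) = 0.5*(S0 + S1*cos(2*theta) + S2*sin(2*theta))
    to frames taken behind a linear polarizer at known angles."""
    th = np.deg2rad(np.asarray(angles_deg, dtype=float))
    A = 0.5 * np.stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)], axis=1)
    flat = np.stack([f.ravel() for f in frames])       # (n_angles, n_pixels)
    S, *_ = np.linalg.lstsq(A, flat, rcond=None)       # (3, n_pixels)
    S0, S1, S2 = (s.reshape(frames[0].shape) for s in S)
    # Degree of linear polarization per pixel:
    dolp = np.sqrt(S1 ** 2 + S2 ** 2) / np.maximum(S0, 1e-12)
    return S0, S1, S2, dolp

# Synthetic check: frames generated from known Stokes values.
angles = [0.0, 45.0, 90.0, 135.0]
frames = [
    np.full((4, 4), 0.5 * (2.0
                           + 0.6 * np.cos(2 * np.deg2rad(a))
                           + 0.3 * np.sin(2 * np.deg2rad(a))))
    for a in angles
]
S0, S1, S2, dolp = stokes_from_sequence(frames, angles)
```

With more than three angles the fit is overdetermined, which helps suppress per-frame noise in the recovered Stokes images.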
We report examples of the use of a scanning tunable CO2 laser lidar system in the 9-11 micrometers region to construct images of vegetation and rocks at ranges of up to 5 km from the instrument. Range information is combined with horizontal and vertical distances to yield an image with three spatial dimensions simultaneous with the classification of target type. Object classification is made possible by the distinct spectral signatures of both natural and man-made objects. Several multivariate statistical methods are used to illustrate the degree of discrimination possible among the natural variability of objects in both spectral shape and amplitude.
A theoretical model for the edge image waviness effect is developed for the ground-to-ground scenario and validated using IR imagery data collected at the White Sands Missile Range. It is shown that angle-of-arrival (AA) angular anisoplanatism causes the phenomenon of edge image waviness and that the AA correlation scale, not the isoplanatic angle, characterizes the edge image waviness scale. The latter scale is determined by the angular size of the imager and a normalized turbulence outer scale, and it does not depend on the strength of turbulence along the path. Spherical divergence of the light waves increases AA correlation. A procedure for estimating the atmospheric and camera-noise components of the edge image motion is developed and implemented. A technique for mitigating the edge image waviness that relies on averaging out the effects of AA anisoplanatism on the image is experimentally validated; the edge waviness is reduced by a factor of 2-3. The time history and temporal power spectrum of the edge motion are obtained. These data confirm that the observed edge motion is caused by turbulence.
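The mitigation-by-averaging idea can be illustrated with a toy one-dimensional edge whose apparent position is perturbed by Gaussian angle-of-arrival jitter. The sigmoid edge model, the jitter magnitude, and the frame count are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

def edge_position(profile):
    """Sub-pixel edge location as the centroid of the gradient magnitude."""
    g = np.abs(np.diff(profile))
    return float((np.arange(g.size) * g).sum() / g.sum())

def jittered_edge(n, true_pos, jitter_std, rng):
    """Smooth step edge displaced by turbulence-like AA jitter."""
    pos = true_pos + rng.normal(0.0, jitter_std)
    x = np.arange(n, dtype=float)
    return 1.0 / (1.0 + np.exp(-(x - pos)))

frames = [jittered_edge(64, 32.0, 1.5, rng) for _ in range(16)]

raw_positions = [edge_position(f) for f in frames]
raw_motion = float(np.std(raw_positions))              # frame-to-frame waviness
averaged_pos = edge_position(np.mean(frames, axis=0))  # temporal averaging
```

Averaging N frames before edge extraction suppresses the zero-mean jitter roughly as 1/sqrt(N), at the cost of a slightly broadened mean edge.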
A large number of military aircraft are equipped with the AN/AAR-47 missile approach warning system (MAWS). The AN/AAR-47 comprises four ultraviolet (UV) sensors, a computer processor (CP) and a control indicator (CI). In-band photon irradiance from incoming missiles produces sensor outputs that are analyzed by the CP to produce threat declaration. The output signal level produced by each sensor for a given photon irradiance level depends on its sensitivity, expressed in terms of photon irradiance response (PIR). All new sensors are tested in manufacture for a minimum PIR before delivery. However, years of operation in harsh military environment may cause sensitivity degradation that makes a sensor unserviceable. A new instrument, the PIR Meter, developed by the Defence Research Establishment Valcartier (DREV), allows for accurately measuring the PIR of AAR-47 sensors, making it possible to identify those sensors whose sensitivity has dropped below the serviceable level. This paper describes the PIR Meter, its components, its interconnections with installed or uninstalled AAR-47 sensors and the PIR measurement procedure. It also covers the signal processing algorithms that were developed to obtain accurate PIR measurements in the operational environment.
The presence of an intervening atmosphere can affect the underlying background signature in at least two important ways: 1) by modification of the thermal energy reaching the surface, and thus a change in the energy balance driving the dynamics of the underlying thermal scene; and 2) by modification of the propagated signal reaching some specified sensor. Both are described to some extent in the SPIE handbooks. The most obvious effect is direct-beam extinction which, in all cases, leads to a reduction of the transmitted thermal energy. This component of the total signal is readily calculated using the well-known Beer's Law, provided that the infrared optical thickness of the intervening atmospheric path is known. In the real atmosphere, however, this effect is indirectly counterbalanced by the effects of multiple scattering and thermal emission, which generally give rise to an enhancement of the thermal energy and are usually more difficult to calculate, especially for the case of an aerosol-laden or dirty atmosphere. In this paper we build on our previous work by incorporating the enhanced radiation as it affects the sensed background scene, using a recently developed aerosol emissivity model, PILOT81, integrated with the U.S. Army COMBIC model using the discrete Gaussian cloud formation.
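The two competing effects, direct-beam extinction and the counterbalancing path emission and scattering, combine in the simplest single-layer approximation as sketched below. This isothermal path-radiance form is a textbook simplification, not the PILOT81/COMBIC treatment:

```python
import math

def apparent_radiance(L_surface, tau, L_path):
    """Single-layer radiative transfer along a path of infrared
    optical thickness tau:
      L = L_surface * exp(-tau)       (Beer's-law attenuation)
        + L_path * (1 - exp(-tau))    (emission/scatter enhancement)
    where L_path is an effective source radiance for the path."""
    t = math.exp(-tau)
    return L_surface * t + L_path * (1.0 - t)
```

A clear path (tau near 0) returns the surface radiance unchanged, while an optically thick, aerosol-laden path returns essentially the path's own emission, which is why the enhancement term matters most for dirty atmospheres.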
The Swedish Defence Research Agency (FOI) has recently performed systematic measurements in order to establish an IR-background database. It will be used for a wide range of applications and provide a basis for the modeling of IR-background properties of Swedish terrain. Experimental data like these are also necessary for the validation of methods and programs for synthetic IR-scene simulation. The data were collected from a varied background scene at the FOI site in Linköping. Several sensor systems were employed; the most important was a Thermovision 900 that measured for a 24-hour period once every month for a year. Simultaneously, registrations were made with one visual and one near-IR camera. For one day every season, hyperspectral images were collected with ScanSpec, an imaging spectrometer designed at FOI. A weather station collected data during all the IR measurements. The paper describes the data acquisition process, the instrumentation, and the contents of the database. Some preliminary statistical analysis of the data is shown, as well as an initial validation of a SensorVision IR-scene simulation.
Wide-baseline, three-dimensional, stereoscopic imaging is being investigated at ARL as an aid to the separation of targets from clutter during operator-assisted target acquisition. Preliminary experimental results at ARL indicate that false alarms are decreased and the probability of detection is increased. In this paper we present a program to produce both infrared and visible-band synthetic stereoscopic imagery, as well as a methodology to evaluate its usefulness for target detection and clutter rejection. Preliminary images are presented and qualitatively examined and evaluated.
In order to provide multiple radiometrically characterizable targets for testing in Arnold Engineering Development Center's (AEDC's) 10V chamber, an extension of current scene generation methods is required. New concepts are also being investigated that will allow more flexibility in reaching the desired simulation parameters. Alternate sources, filtering techniques, beam combining methods, and optical power delivery systems may prove useful in meeting the ultimate objectives of the testing program. This paper presents the results of this effort.
We discuss the methodology and techniques employed in transforming our 3D characterization databases and 3D target models from our internal 3D format to a more universal 3D format. Currently our 3D characterization databases and target models are encoded in an internal custom file format that targets specific simulators set up to receive our data. In order to make our databases available to a wider audience within the modeling and simulation community, we have developed techniques to transform our databases into the more common OpenFlight file format. We outline the steps taken to accomplish this. We discuss the methodology and show examples of backgrounds, object discretes, and target models. The developed characterization databases are used in digital simulations by various customers within the US Army Aviation and Missile Command (AMCOM). These databases are used in closed-loop dynamic simulations to evaluate the performance of various missile systems.
This paper addresses physical perturbation and modeling error issues as applied to W-band signature prediction applications. Specifically, geometry perturbations due to macro-roughness effects are considered at W-band. Both predictive and measured results are presented in order to gain an appreciation of how target geometry deviations manifest themselves in final signature results.
This paper presents some of the image processing techniques that were applied to answer the question of whether agents of the Federal Bureau of Investigation (FBI) directed gunfire against the Branch Davidian complex in the tragic event that took place in Waco, Texas, U.S., in 1993. The task for this investigation was to provide a scientific opinion clarifying the cause of the questioned events, or flashes, that can be seen on one of the surveillance videotapes. Several experts concluded that these flashes were evidence of gunfire. However, there were many reasons to question the correctness of that conclusion, such as the fact that some of the flashes appeared on a regular basis. The main hypothesis for this work was that the flashes were instead caused by specular solar reflections. The technical approach was to analyze and compare the flashes' appearance. By reconstructing the spatial and temporal positions of the sensor, the complex, and the sun, the geometrical properties were compared to the theoretical appearance of specular solar reflections. The result showed that the flashes seen on the FLIR videotape were caused by solar or heat reflections from single or multiple objects. Consequently, they could not form evidence of gunfire. Further, the result highlights the importance of considering the characteristics of the imaging system in investigations that utilize images as an information source, because real data must be correctly separated from other phenomena (such as solar reflections), distortions, and artifacts.
The FLIR video recorded by the FBI on 19 April 1993 records the final assault on the Branch Davidian compound in Waco, Texas, and the fire in which some 80 members of the sect died. Attention has focused on a number of flashes recorded on the videotape. The author has examined the 1993 videotape and the recorded videotapes of the re-enactment conducted at Fort Hood, Texas on 19 March 2000. The following conclusions have been reached: 1) The flashes seen on the tape cannot be weapons muzzle flash. Their duration is far too long and their spatial extent is far too great. They are almost certainly the result of solar energy or heat energy from nearby vehicles reflected toward the FLIR by debris or puddles. 2) The FLIR video technology has a very low probability of detecting small-arms muzzle flash. 3) As a consequence of 2), the absence of muzzle flash detection on the FLIR tape does not prove that no weapons were actually fired during the final assault. Indeed, there is ample evidence (not presented here) that the Davidians fired at the federal agents, but none of their muzzle flashes are detectable on the videotape.
This paper presents the results of two special experiments that were conducted to verify the F.B.I. tapes of April 19th, 1993 at Waco, Texas, and a follow-on reenactment trial held at Fort Hood, Texas on March 20th, 2000. The authors consider the results of the Fort Hood trials flawed because the rifle type, barrel length, ammunition type, and special events were not considered. In order to rectify this, two special trials were conducted at Fort Collins, Colorado on Sept. 9, 2000 and at Tucson, Arizona on Nov. 30th, 2000. These trials were conducted at special small arms ranges that could be utilized for these signature experiments. A small FLIR Inc. HgCdTe scanning system of 1990s vintage was used as the primary system. This system had the same scanning characteristics as the original F.B.I. FLIR "Nightstalker" used at Waco. It is a smaller 2x4 T.D.I. parallel scanning array, feeding analog video into a recorder. In order to compensate for the differences in I.F.O.V. of the two systems (our system had 2 mr, with probably 1/2 mr in the narrow field-of-view, vs. probably 0.1 mr for "Nightstalker" in the same mode), our system was placed in a Man-High hydraulic lift situated about 250 ft from the line of fire, whereas at Waco the "Nightstalker" FLIR had been aircraft mounted and was circling over the target at a height of about 4000 ft, giving a line-of-sight of about 5000 ft. This system, like the original system, used an 8-bit videotape as the recording medium. There is no doubt that the new digital focal plane array systems can capture these transient events, but the question is whether these old systems, with their two-fields-per-frame video technology, can capture them. This paper demonstrates that an old system can accomplish the task, and do it rather well. By data reduction we have concluded that about 90% of the flashes were recorded and about 60% of these were long enough to be captured in both fields. One can detect a single-field flash without a problem, but capturing it takes more sophisticated equipment than was at our command.
The controversy surrounding the origin of flashes on the Mt. Carmel FLIR videotape acquired on April 19, 1993, is introduced. The characteristics of muzzle flash are reviewed. A comparative weapons description is offered. The temporal, spatial, and radiance characteristics of thermal infrared muzzle flash are addressed. Data acquired from a field experiment are presented. The authors conclude that the spatial characteristics of muzzle flash enable its detection by equipment such as the FLIR in use at Mt. Carmel on April 19, 1993; that while flashes obtained in the field appear highly radiant, measurements are necessary to quantify their values; and that the temporal behavior of muzzle flash deserves further study.