It is typically assumed in calibrating emitter array projection systems that the radiated spectrum is Planckian and that
intervening optics attenuate the signal but do not significantly change its spectral shape. Calibrating such a system is
relatively straightforward because blackbody reference sources are available to calibrate the unit under test (UUT), or another sensor
with similar spectral responsivity, which can then serve as a transfer standard for array calibration. In this way the
projector command value required to produce the same response in the UUT as the modeled object is readily obtained.
With a visible projector, this is not the case. The modeled object spectrum is often solar reflective. To calibrate using
the same approach as infrared systems would require a 5800 K blackbody. Furthermore, the spectrum of the visible
output in a multispectral, common-boresight projection system can differ markedly from that of the visible projector
subsystem alone because of the dichroic beam combiner's spectral characteristics. This paper describes a process developed to
calibrate the visible projector in such a system without having the UUT, or a spectrally equivalent surrogate, available as
a transfer standard.
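The spectral-mismatch problem can be made concrete with a short numerical sketch. Assuming a hypothetical UUT responsivity curve and a hypothetical dichroic notch (neither is taken from the paper), it shows that two projector spectra with identical band-integrated radiance can still produce different UUT responses, which is why a single scalar attenuation factor cannot calibrate the system:

```python
import numpy as np

# Wavelength grid over a nominal visible band (0.4-0.7 um); illustrative only.
wl = np.linspace(0.4, 0.7, 301)
dw = wl[1] - wl[0]

# Hypothetical UUT spectral responsivity: rising linearly toward the red.
responsivity = (wl - 0.4) / 0.3

# Spectrum A: flat projector output across the band.
spec_a = np.ones_like(wl)

# Spectrum B: the same output after a hypothetical dichroic beam combiner
# that suppresses part of the band (a Gaussian notch near 0.62 um).
notch = 1.0 - 0.6 * np.exp(-(((wl - 0.62) / 0.03) ** 2))
spec_b = spec_a * notch

# Renormalize B to the identical band-integrated radiance as A.
spec_b *= (np.sum(spec_a) * dw) / (np.sum(spec_b) * dw)

# Band-integrated UUT responses differ even though total radiance is equal.
resp_a = np.sum(responsivity * spec_a) * dw
resp_b = np.sum(responsivity * spec_b) * dw
print(resp_a, resp_b)
```

Because the notch removes energy where the assumed responsivity is high, equalizing total radiance does not equalize the sensor response; calibration must therefore account for the spectral shape, not just the level.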
This paper continues the merging of two dynamic infrared scene projector technologies into a
unique and innovative solution for simulating high-dynamic-range apparent temperatures when testing infrared imaging
sensors. It presents some of the challenges and performance issues encountered in integrating this unique
projector system into a Hardware-in-the-Loop (HWIL) simulation facility.
The projection system combines the technologies of a Honeywell BRITE II extended voltage range emissive
resistor array device and an optically scanned laser diode array projector (LDAP). The high apparent temperature
simulations are produced from the luminescent infrared radiation emitted by the high power laser diodes. The hybrid
infrared projector system is being integrated into an existing HWIL simulation facility and is used to provide real-world
high-radiance imagery to an imaging infrared unit under test. The performance and operation of the projector are
presented, demonstrating the merit and success of the hybrid approach. The high-dynamic-range capability spans apparent
temperature signatures from a 250 Kelvin background to a maximum of 850 Kelvin, a large
increase in projected radiance over current infrared scene projection capabilities.
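The scale of that dynamic range can be illustrated with Planck's law. The sketch below integrates blackbody spectral radiance at the two apparent temperatures over an assumed 3-5 um MWIR band (the paper does not state the band):

```python
import numpy as np

# Physical constants (SI units).
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = np.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

# Assumed 3-5 um band; a different band changes the numbers, not the point.
wl = np.linspace(3e-6, 5e-6, 2001)
dw = wl[1] - wl[0]

l_background = np.sum(planck_radiance(wl, 250.0)) * dw  # W / (m^2 * sr)
l_maximum = np.sum(planck_radiance(wl, 850.0)) * dw

print(l_background, l_maximum, l_maximum / l_background)
```

In this band the in-band radiance at 850 K exceeds that at 250 K by several orders of magnitude, which is what makes a single-technology projector unable to cover the full range.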
AMRDEC has successfully tested hardware and software for Real-Time Scene Generation for IR and SAL Sensors on COTS PC-based hardware and video cards. AMRDEC personnel worked with nVidia and Concurrent Computer Corporation to develop a scene generation system capable of frame rates of at least 120 Hz while frame-locked to an external source (such as a missile seeker) with no dropped frames. Latency measurements and image validation were performed using COTS and in-house developed hardware and software. Software for the scene generation system was developed using OpenSceneGraph.
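The no-dropped-frames requirement amounts to every frame finishing within the external sync period. A minimal bookkeeping sketch (the 120 Hz rate comes from the text; the render times below are made-up numbers):

```python
# Frame-lock bookkeeping sketch: a frame is "dropped" if rendering
# overruns the external sync period.
FRAME_RATE_HZ = 120.0
PERIOD_S = 1.0 / FRAME_RATE_HZ  # ~8.33 ms budget per frame

def count_dropped(render_times_s, period_s=PERIOD_S):
    """Count frames whose render time exceeds the sync period."""
    return sum(1 for t in render_times_s if t > period_s)

# Hypothetical measured render times in seconds (illustrative only).
times = [0.006, 0.007, 0.0081, 0.009, 0.0079]
print(count_dropped(times))  # the 9 ms frame misses the 8.33 ms deadline
```

This is why latency measurement matters: a mean render time under budget is not sufficient, since a single worst-case overrun still costs a frame.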
Modern military scene generators increasingly utilize advanced features of consumer graphics hardware to produce waveband-specific sensor scenes. Unfortunately, the advances available in the consumer graphics accelerator market do not translate immediately into applications for military scene generators required to test next-generation sensors. Testing infrared (IR) sensors used in terminal homing missiles and missile warning systems (MWS) requires frame rates of 200 Hz or more. Modern IR emitter arrays can now project dynamic scenes at this higher rate; however, personal computer (PC) based scene rendering systems cannot generate high-resolution, real-time frames fast enough. InterSpace has leveraged its high-speed pixel processor technology to produce high-speed rendering based on PC devices. The Hardware-in-the-Loop Functional Area of the US Army Aviation and Missile Research Development and Engineering Center has developed a suite of modular software to perform deterministic, real-time, waveband-specific rendering of sensor scenes, leveraging the features of commodity graphics hardware and open source software. Together, these technologies provide the performance and accuracy to drive high-rate, high-dynamic-range scene projectors.
This paper describes the current research and development of advanced scene generation technology for integration into the Advanced Multispectral Simulation Test and Acceptance Resource (AMSTAR) Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC and US Army Redstone Technical Test Center at Redstone Arsenal, AL. A real-time multi-mode (infrared (IR) and semi-active laser (SAL)) scene generator for a tactical sensor system has been developed leveraging COTS hardware and open source software (OSS). A modular, plug-in architecture has been developed that supports rapid reconfiguration to permit the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform-independent software yields a cost-effective upgrade path to integrate best-of-breed personal computer (PC) graphics processing unit (GPU) technology.
This paper describes the current research and development of advanced scene generation technology for integration into the I2RSS Hardware-in-the-Loop (HWIL) facilities at the US Army AMRDEC at Redstone Arsenal, AL. A real-time dynamic infrared (IR) scene generator has been developed in support of a high-altitude scenario, leveraging COTS hardware and open source software. The Multi-Spectral Mode Scene Generator (MMSG) is an extensible software architecture that is powerful yet flexible. The I2RSS scene generator implements dynamic signatures by integrating signature prediction codes with open source software, COTS hardware, and custom-built interfaces. A modular, plug-in framework has been developed that supports rapid reconfiguration to permit the use of a variety of state data input sources, geometric model formats, and signature and material databases. The platform-independent software yields a cost-effective upgrade path to integrate best-of-breed graphics and system architectures.
This paper describes the current research in integrating personal computer technology into the U.S. Army Aviation and Missile Command (AMCOM) Hardware-in-the-Loop (HWIL) facilities. Using COTS hardware along with custom-built interfaces, the system under development will replace the high-end graphics workstations that provide infrared image generation. Infrared scene generation is an integral component of the HWIL testing of missile seeker units. This functionality must become more accessible, portable, and affordable as HWIL testing becomes more integral and more widely distributed in the development life cycle of missile systems. The graphics system under development is designed as a feasible plug-in replacement for existing infrared scene generation systems. Real-time performance and support of existing interfaces to simulation computers, projectors, and missile components are the primary considerations in its design.
This paper describes infrared (IR) scene generation and validation activities at the U.S. Army Aviation and Missile Command's (AMCOM) Dual-Mode Hardware-in-the-Loop (HWIL) Simulation. The HWIL simulation validation results are based on comparison of infrared seeker data collected in the HWIL simulation to infrared seeker data collected during captive flight tests (CFTs). Use of CFT data allows a simulation developer to quantify not only the radiometric fidelity of the simulation inputs, but also the effects that any limitations of the inputs may have on simulation validity with respect to a particular seeker and its algorithms. Validation of this type of simulation is a complex process and all aspects of the validation are covered. Topics include real-time IR signature modeling and validation, simulation output verification, projected energy verification, and total end-to-end simulation validation. Also included are descriptions of the different types of CFT scenarios necessary for simulation validation and the comparison methodologies used for each case.
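The abstract does not state which comparison metrics were used; as one plausible illustration of comparing a HWIL seeker frame against a captive-flight-test frame, a normalized root-mean-square difference could be computed as below (the function name, normalization choice, and array contents are all assumptions, not the paper's method):

```python
import numpy as np

def nrmse(hwil_frame, cft_frame):
    """RMS difference normalized by the CFT frame's dynamic range."""
    hwil = np.asarray(hwil_frame, dtype=float)
    cft = np.asarray(cft_frame, dtype=float)
    rmse = np.sqrt(np.mean((hwil - cft) ** 2))
    span = cft.max() - cft.min()
    return rmse / span if span > 0 else rmse

# Illustrative 2x2 "frames" of in-band seeker counts (made-up numbers).
cft = np.array([[10.0, 20.0], [30.0, 40.0]])
hwil = cft + 1.0  # uniform one-count bias between simulation and flight data
print(nrmse(hwil, cft))
```

A pixelwise metric like this only quantifies radiometric agreement; as the abstract notes, end-to-end validation must also confirm that any input limitations do not change the behavior of the particular seeker and its algorithms.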