It is typically assumed in calibrating emitter array projection systems that the radiated spectrum is Planckian and that
intervening optics attenuate the signal but do not change the spectral shape significantly. Calibrating such a system is
relatively easy in that blackbody reference sources are available to calibrate the unit under test (UUT), or another sensor with similar spectral responsivity, which can then be used as a transfer standard for array calibration. In this way the
projector command value required to produce the same response in the UUT as the modeled object is readily obtained.
With a visible projector, this is not the case. The modeled object spectrum is often solar reflective. To calibrate using
the same approach as infrared systems would require a 5800 K blackbody. Furthermore, the spectrum of the visible
output in a multispectral, common boresight projection system can differ pathologically from the visible projector
subsystem alone because of dichroic beam combiner characteristics. This paper describes a process developed to
calibrate a visible projector in such a system without having the UUT or a spectrally equivalent surrogate available as
a transfer standard.
A sensor system for the characterization of infrared laser radar scene projectors has been developed. Available sensor
systems do not provide sufficient range resolution to evaluate the high precision LADAR projector systems developed
by the U.S. Army Research, Development and Engineering Command (RDECOM) Aviation and Missile Research,
Development and Engineering Center (AMRDEC). With timing precision to a fraction of a nanosecond, the system can confirm the accuracy of simulated return pulses over a nominal range of up to 6.5 km with a resolution of 4 cm.
Increased range can be achieved through firmware reconfiguration. Two independent amplitude triggers measure both rise and fall times, providing a judgment of pulse shape and allowing estimation of the contained energy. Each return channel can measure up to 32 returns per trigger, characterizing each return pulse independently. Current efforts include extending the capability to 8 channels. This paper outlines the development, testing, capabilities, and limitations
of this new sensor system.
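The timing figures above can be sanity-checked with a little light-travel arithmetic. The sketch below is illustrative only (the function names and the counter width are not from the paper); it converts the stated 4 cm range resolution into the round-trip timing precision it implies:

```python
# Illustrative arithmetic, not the sensor system's actual firmware logic.
C = 299_792_458.0  # speed of light, m/s

def timing_precision_for_range_resolution(dr_m: float) -> float:
    """Round-trip time step corresponding to a one-way range step dr_m."""
    return 2.0 * dr_m / C

def max_range_for_counter(dt_s: float, counter_bits: int) -> float:
    """One-way range covered by a counter of the given width and tick, in m."""
    return (2**counter_bits) * dt_s * C / 2.0

dt = timing_precision_for_range_resolution(0.04)    # 4 cm resolution
print(f"required timing precision: {dt * 1e12:.0f} ps")
print(f"ticks to span 6.5 km: {2 * 6500.0 / C / dt:.0f}")
print(f"hypothetical 18-bit counter span: {max_range_for_counter(dt, 18) / 1000:.1f} km")
```

The result, roughly 267 ps per tick, is consistent with the abstract's claim of sub-nanosecond timing precision, and 6.5 km at 4 cm steps is 162,500 ticks; widening the counter or coarsening the tick extends the range, which matches the note about firmware reconfiguration.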
This paper continues the merging of two dynamic infrared scene projector technologies to provide a unique and innovative solution for simulating high dynamic temperature ranges when testing infrared imaging
sensors. This paper will present some of the challenges and performance issues encountered in implementing this unique
projector system into a Hardware-in-the-Loop (HWIL) simulation facility.
The projection system combines the technologies of a Honeywell BRITE II extended voltage range emissive
resistor array device and an optically scanned laser diode array projector (LDAP). The high apparent temperature
simulations are produced from the luminescent infrared radiation emitted by the high power laser diodes. The hybrid
infrared projector system is being integrated into an existing HWIL simulation facility and is used to provide real-world
high-radiance imagery to an imaging infrared unit under test. The performance and operation of the projector are presented, demonstrating the merit and success of the hybrid approach. The high dynamic range capability simulates signatures from a 250 K apparent background temperature up to an 850 K maximum apparent temperature, a large increase in projected radiance over current infrared scene projection capabilities.
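To put the quoted apparent-temperature span in radiometric terms, Planck's law gives the single-wavelength radiance ratio between the 850 K maximum and the 250 K background. The snippet below is an illustration only; the 4 µm operating wavelength is an assumption for a mid-wave sensor, not a figure from the paper:

```python
import math

# Sketch with an assumed mid-wave wavelength; not from the paper.
H = 6.62607015e-34   # Planck constant, J*s
C = 299_792_458.0    # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Planck spectral radiance, W / (m^2 * sr * m)."""
    x = H * C / (wavelength_m * K * temp_k)
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

lam = 4.0e-6  # assumed 4 um mid-wave operating point
ratio = planck_radiance(lam, 850.0) / planck_radiance(lam, 250.0)
print(f"radiance ratio at {lam * 1e6:.0f} um: {ratio:.0f}")
```

At the assumed 4 µm the ratio is on the order of 10^4, which conveys why "high dynamic range" is the operative phrase; an integral over the actual sensor's in-band response would refine the number.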
We present a technique for the correction of spatial non-uniformity in an infrared emitter array projection system for flood scenes. The technique is a sparse grid approach, but, instead of turning on a sparse grid of emitters to estimate the radiance of each one, the array is commanded uniformly to a constant level, and a sparse grid of emitters is turned off. The resultant loss of radiance in the neighborhood of each emitter is used to estimate its response. Typically, less than one percent of the emitters are turned off in a grid, so flood scene effects, such as substrate heating, are accounted for without the complexity of coupled outputs of a full flood process.
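The inverted sparse-grid measurement can be sketched in a few lines. The code below is a hypothetical illustration of the bookkeeping, not the authors' implementation: the response of each switched-off emitter is taken as the summed radiance deficit in a small neighborhood around it.

```python
# Hypothetical sketch of the inverted sparse-grid measurement: drive the
# array flat, switch OFF a sparse grid of emitters, and estimate each one's
# response from the radiance lost in its neighborhood.

def off_grid_responses(flood_frame, off_frame, off_coords, halo=1):
    """Estimate each off emitter's response as the summed radiance loss
    in a (2*halo+1)^2 neighborhood around it."""
    responses = {}
    for (r, c) in off_coords:
        loss = 0.0
        for dr in range(-halo, halo + 1):
            for dc in range(-halo, halo + 1):
                loss += flood_frame[r + dr][c + dc] - off_frame[r + dr][c + dc]
        responses[(r, c)] = loss
    return responses

# Toy example: a 5x5 patch at a flat commanded level; turning off the
# center emitter removes its (optically blurred) contribution.
flood = [[10.0] * 5 for _ in range(5)]
off = [row[:] for row in flood]
off[2][2] -= 6.0          # most of the lost energy is at the emitter...
for (r, c) in [(1, 2), (3, 2), (2, 1), (2, 3)]:
    off[r][c] -= 0.5      # ...and a little spills onto its neighbors
print(off_grid_responses(flood, off, [(2, 2)]))  # {(2, 2): 8.0}
```

Because the rest of the array stays at the flood level, substrate-heating and other flood effects are baked into both frames and cancel in the difference, which is the point of the technique.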
Recent advances in real-time synthetic scene generation for Hardware-in-the-loop (HWIL) testing at the U.S. Army Aviation and Missile Command (AMCOM) Aviation and Missile Research, Development, and Engineering Center (AMRDEC) improve both performance and fidelity. Modeling ground target scenarios requires tradeoffs because of limited texture memory for imagery and limited main memory for elevation data. High-resolution insets have been used in the past to provide better fidelity in specific areas, such as in the neighborhood of a target. Improvements for ground scenarios include smooth transitions for high-resolution insets to reduce high spatial frequency artifacts at the borders of the inset regions and dynamic terrain paging to support large area databases. Transport lag through the scene generation system, including sensor emulation and interface components, has been dealt with in the past through the use of sub-window extraction from oversize scenes. This compensates for spatial effects of transport lag but not temporal effects. A new system has been developed and used successfully to compensate for temporal effects, such as a flashing coded beacon in the scene. Other techniques have been developed to synchronize the scene generator with the seeker under test (SUT) and to model atmospheric effects, sensor optics and electronics, and angular emissivity attenuation.
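The sub-window extraction mentioned above can be illustrated with a toy crop routine (hypothetical interfaces, not the facility's code): an oversize frame is rendered from a slightly stale viewpoint, and the output window is shifted by the line-of-sight error accrued during the transport lag.

```python
# Hypothetical sketch of sub-window extraction for transport-lag
# compensation: render oversize, then crop at an offset predicted from
# the newest attitude data. This fixes the spatial effect of the lag,
# not the temporal one.

def extract_subwindow(oversize, out_w, out_h, los_err_px_x, los_err_px_y):
    """Crop an out_w x out_h window, shifted by the line-of-sight error
    (in pixels) accumulated during scene-generation transport lag."""
    big_h, big_w = len(oversize), len(oversize[0])
    x0 = (big_w - out_w) // 2 + los_err_px_x
    y0 = (big_h - out_h) // 2 + los_err_px_y
    # Clamp so the window stays inside the rendered frame.
    x0 = max(0, min(big_w - out_w, x0))
    y0 = max(0, min(big_h - out_h, y0))
    return [row[x0:x0 + out_w] for row in oversize[y0:y0 + out_h]]

# Toy 8x8 frame whose pixel values encode their own coordinates.
frame = [[10 * r + c for c in range(8)] for r in range(8)]
win = extract_subwindow(frame, 4, 4, 1, -1)  # drifted +1 px in x, -1 in y
print(win[0][0])  # top-left pixel comes from row 1, col 3 -> 13
```

A time-varying source such as a flashing beacon defeats this scheme, since cropping a stale frame cannot change when the beacon flashed, which motivates the temporal compensation described in the abstract.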
The U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC) Advanced Simulation Center has recognized the need for re-configurable visualization in support of hardware-in-the-loop (HWIL) simulations and has made the development of re-configurable visualization tools a priority. SimSight, developed at AMCOM MRDEC, is designed to provide 3D visualization for HWIL simulations and after-action reviews. Leveraging the latest hardware and software visual simulation technologies, SimSight displays a concise 3D view of the simulated world, giving the HWIL engineer unprecedented power to analyze quickly the progress of a simulation from pre-launch to impact. Providing 3D visualization is only half the solution; data management, distribution, and analysis are the companion problem, which AMCOM MRDEC is addressing with the development of Fulcrum, a cross-platform data capture, distribution, analysis, and display framework of which SimSight will become a component.
As cost becomes an increasingly important factor in the development and testing of infrared sensors and flight computer/processors, the need for accurate hardware-in-the-loop (HWIL) simulations is critical. In the past, expensive and complex dedicated scene generation hardware was needed to attain the fidelity necessary for accurate testing. Recent technological advances and innovative applications of established technologies are beginning to allow development of cost-effective replacements for dedicated scene generators. These new scene generators are mainly constructed from commercial-off-the-shelf (COTS) hardware and software components. At the U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC), researchers have developed such a dynamic IR scene generator (IRSG) built around COTS hardware and software. The IRSG is used to provide dynamic inputs to an IR scene projector for in-band seeker testing and for direct signal injection into the seeker or processor electronics. AMCOM MRDEC has developed a second-generation IRSG, namely IRSG2, using the latest Silicon Graphics Incorporated (SGI) Onyx2 with InfiniteReality graphics. As reported in previous papers, the SGI Onyx Reality Engine 2 is the platform of the original IRSG, now referred to as IRSG1. IRSG1 has been in operation and used daily for the past three years on several IR projection and signal injection HWIL programs. With this second-generation IRSG, frame rates have increased from 120 Hz to 400 Hz and intensity resolution from 12 bits to 16 bits. The key features of the IRSGs are real-time missile frame rates and frame sizes; a dynamic missile-to-target(s) viewpoint updated each frame in real time by a six-degree-of-freedom (6DOF) system under test (SUT) simulation; multiple dynamic objects (e.g., targets, terrain/background, countermeasures, and atmospheric effects); latency compensation; point-to-extended-source anti-aliased targets; and sensor modeling effects. This paper provides a comparison between the IRSG1 and IRSG2 systems and focuses on the IRSG software, real-time features, and database development tools.
As cost becomes an increasingly important factor in the development and testing of infrared (IR) sensors and flight computer/processors, the need for accurate hardware-in-the-loop simulations is critical. In the past, expensive and complex dedicated scene generation hardware was needed to attain the fidelity necessary for accurately testing systems under test (SUT). Recent technological advances and innovative applications of established technologies are beginning to allow development of cost-effective replacements for dedicated scene generators. These new scene generators are mainly constructed from commercial off-the-shelf (COTS) hardware and software components. At the U.S. Army Missile Command (MICOM), researchers have developed such a dynamic IR scene generator (IRSG) built around COTS hardware and software. The IRSG is being used to provide inputs to an IR scene projector for in-band sensor testing and for direct signal injection into the sensor or processor electronics. Using this `baseline' IRSG, up to 120 frames per second (Hz) of 12-bit intensity images are generated at 640 by 640 pixel resolution. The IRSG SUT-to-target viewpoint is dynamically updated in real time by a six-degree-of-freedom SUT simulation executing on a facility simulation computer, synchronized with an external signal from the SUT hardware, and compensates for system latency using a special-purpose hardware component implemented on a single VME card. Multiple dynamic targets, terrain/backgrounds, countermeasures, and atmospheric effects are controlled in real time by the facility simulation computer via a shared-memory interface to the IRSG. The `next generation' IRSG is currently under development at MICOM using `next generation' COTS hardware and software. `Next generation' performance specifications are estimated to yield 16-bit intensity at a 250 - 300 Hz frame rate and 1024 x 1024 pixel resolution.