The Human Resources Laboratory has developed a dynamic IR scene simulation system to support research on simulation requirements for Forward Looking Infrared (FLIR) sensors in fighter aircraft. A primary objective in the development of this system was that FLIR imagery be part of an overall simulation package including navigation and targeting FLIR, Synthetic Aperture Radar (SAR), and out-the-window visual imagery. FLIR imagery is derived from a computer-generated visual scene. To simulate FLIR, color codes for objects in the visual database are replaced with gray-scale values based on predicted IR exitance. The gray-scale values are constructed using software developed by the Applications Research Corporation, which combines feature and surface codes from Defense Mapping Agency source data with wavelength-dependent absorptivity and run-time scenario specifications, including gaming-area latitude and longitude, time of day, season, and weather, to estimate radiant exitance for each object. Exitance values are then converted into gray-scale values. Sensor effects such as gain level, polarity, field of view, and modulation transfer function are added to the FLIR imagery by a General Electric-developed post-processor. Sensor characteristics can be programmed separately for a navigation FLIR, which is presented on a simulated Heads-Up Display (HUD), and for a targeting FLIR. The effects of user inputs to the IR prediction software on radiant exitance, functions of the post-processor, and selection of texture patterns will be discussed. Fidelity requirements for simulation of FLIR in flight training will also be addressed.
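The exitance-prediction and gray-scale conversion pipeline described above can be sketched in simplified form. This is a minimal illustration, not the actual Applications Research Corporation software: it assumes a total-band graybody model (the real system uses wavelength-dependent absorptivity and scenario data), a hypothetical linear exitance-to-gray mapping, and simplified post-processor effects limited to gain, level offset, and polarity.

```python
# Illustrative sketch of an exitance-to-gray-scale pipeline.
# All models here are assumptions; the source does not give the
# actual transfer functions used by the prediction software or
# the General Electric post-processor.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)


def radiant_exitance(emissivity: float, temperature_k: float) -> float:
    """Graybody radiant exitance (W/m^2) via the Stefan-Boltzmann law.

    Assumption: a total-band graybody model stands in for the
    wavelength-dependent prediction described in the text.
    """
    return emissivity * SIGMA * temperature_k ** 4


def exitance_to_gray(exitance: float, e_min: float, e_max: float,
                     levels: int = 256) -> int:
    """Map radiant exitance to a gray-scale value.

    Assumption: a linear map over the scene's dynamic range
    [e_min, e_max], clamped to the displayable range.
    """
    if e_max <= e_min:
        raise ValueError("invalid exitance range")
    frac = (exitance - e_min) / (e_max - e_min)
    frac = min(max(frac, 0.0), 1.0)
    return round(frac * (levels - 1))


def apply_sensor_effects(gray: float, gain: float = 1.0,
                         offset: float = 0.0, white_hot: bool = True,
                         levels: int = 256) -> int:
    """Simplified post-processor: gain, level offset, and polarity.

    White-hot polarity maps warmer (higher-exitance) objects to
    brighter gray values; black-hot inverts the scale.
    """
    g = gray * gain + offset
    g = min(max(g, 0.0), levels - 1)
    if not white_hot:
        g = (levels - 1) - g
    return round(g)
```

For example, an object at 300 K with emissivity 0.9 yields an exitance of about 413 W/m^2, which a linear map over a 0 to 500 W/m^2 scene range places in the upper portion of the gray scale; switching `white_hot` to `False` reverses the polarity, as a black-hot targeting FLIR presentation would.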