In this paper, we describe the development of a semi-synthetic scene generator. It takes parts of real recorded infrared images or videos and combines them anew. All scene parts are essentially two-dimensional textures, which may also contain transparent regions; they can be time-invariant textures or time-variant videos. The scene parts used are the sky, the water surface, and the target ship; additional parts, such as countermeasures, can be included. The sky is placed very far away and orthogonal to the viewing direction. The water surface is placed parallel to the viewing direction and must be cross-faded from time to time during the approach. The target textures are cropped from videos in which the ship is relatively close, in order to obtain fine resolution. The target is placed roughly orthogonal to the viewing direction but closer than the sky, and its distance decreases during the approach. Missile paths can be generated arbitrarily, as long as the viewing angles do not deviate too much from a direct approach. Comparably large numbers of semi-synthetic approaches can be generated in order to assess the threat to the ship. It is also possible to include the semi-synthetic rendering in a control loop together with tracking algorithms such as those assumed to be used in infrared-guided anti-ship missiles. The main advantage compared to fully synthetic closed-loop rendering is the drastically lower computational effort, while the synthesized data remain quite realistic.
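The layered compositing described above, in which transparent two-dimensional scene parts are stacked in front of one another, can be sketched as simple alpha blending. The layer sizes, radiance values, and the `composite` helper below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def composite(background, foreground, alpha):
    """Alpha-blend a foreground layer over a background (float arrays in [0, 1])."""
    return alpha * foreground + (1.0 - alpha) * background

# Hypothetical single-band infrared layers: sky, water surface, and a target crop.
h, w = 240, 320
sky    = np.full((h, w), 0.30)        # distant sky, here a uniform radiance
water  = np.full((h, w), 0.55)        # water-surface texture, here a constant
target = np.full((h, w), 0.90)        # cropped ship texture

# The water occupies the lower half of the frame (horizon at row h // 2).
water_alpha = np.zeros((h, w))
water_alpha[h // 2:, :] = 1.0

# The target texture is opaque only where the ship silhouette is.
target_alpha = np.zeros((h, w))
target_alpha[100:140, 140:180] = 1.0

frame = composite(sky, water, water_alpha)      # water surface over sky
frame = composite(frame, target, target_alpha)  # target ship over both
```

In a full generator, each layer would be a recorded texture or video frame rather than a constant, and the target layer would be rescaled and repositioned per frame as the simulated missile-to-ship distance decreases.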