The development and evaluation of precision strike weaponry require high-fidelity image simulation, as data collections involving moving platforms are difficult to schedule and costly to perform. Furthermore, live data collections in which the weapon is guided by an autonomous target acquisition (ATA) system cannot be performed in dense urban environments. The only practical solution is to develop high-fidelity image and navigation simulations of realistic operating environments. We are currently developing a system that automatically generates a detailed urban scene with minimal user input. Given a set of parameters such as population, terrain, and city style, the system generates a two-dimensional city plan containing features such as road networks, buildings, vehicles, vegetation, and miscellaneous additional urban objects. The two-dimensional city representation is then processed by an interactive scene modeling and simulation environment that generates a textured, high-resolution, three-dimensional representation of the scene in a format compatible with well-known LADAR and IR sensor simulation suites such as IRMA. At each step in the process, the user can interact with the scene, whether to change specific scene parameters or to manually insert, remove, or modify targets and objects of interest.
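To make the parameter-driven pipeline concrete, the sketch below shows how a small set of high-level inputs (population, terrain, city style) might be mapped to a two-dimensional city plan of roads and building footprints. All names here (`CityParams`, `CityPlan`, `generate_plan`) and the scaling heuristics are hypothetical illustrations, not the actual interface or algorithm of the system described above.

```python
from dataclasses import dataclass, field
import random

@dataclass
class CityParams:
    """High-level scene parameters supplied by the user (hypothetical)."""
    population: int
    terrain: str          # e.g. "flat", "hilly"
    style: str            # e.g. "grid", "radial"
    seed: int = 0         # for repeatable plan generation

@dataclass
class CityPlan:
    """Toy 2D plan: road segments and rectangular building footprints."""
    roads: list = field(default_factory=list)      # ((x1, y1), (x2, y2)) segments
    buildings: list = field(default_factory=list)  # (x, y, width, depth) footprints

def generate_plan(p: CityParams) -> CityPlan:
    """Derive a 2D plan from the parameters: a grid-style road network
    whose extent scales with population, with one randomly sized
    building footprint per block (illustrative scaling only)."""
    rng = random.Random(p.seed)
    plan = CityPlan()
    n = max(2, int(p.population ** 0.5) // 100)    # blocks per side (toy heuristic)
    block = 100.0                                  # block size in metres
    for i in range(n + 1):
        plan.roads.append(((i * block, 0.0), (i * block, n * block)))  # N-S roads
        plan.roads.append(((0.0, i * block), (n * block, i * block)))  # E-W roads
    for i in range(n):
        for j in range(n):
            w = rng.uniform(20.0, 60.0)            # footprint width
            d = rng.uniform(20.0, 60.0)            # footprint depth
            plan.buildings.append((i * block + 20.0, j * block + 20.0, w, d))
    return plan

plan = generate_plan(CityParams(population=250_000, terrain="flat", style="grid"))
```

In a full system, such a plan would then feed the interactive modeling stage that extrudes and textures the 3D scene; the point here is only that a handful of scalar parameters can deterministically drive the 2D layout.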