A neural network is used to extract the flight model of guided, short- to medium-range, tripod- and shoulder-fired missile systems; the model is then integrated into a training simulator. The simulator uses injected video to replace the optical sight and is fitted with a multi-axis positioning system that senses the gunner's movement. This movement shifts the displayed image and affects the input data to the missile control algorithm. Accurate flight dynamics are key to effective training, particularly for closed-loop guided systems. However, flight model data is not always available, either because it is proprietary or because it is too complex to embed in a real-time simulator. A solution is to reverse engineer the flight model by analyzing the missile's response when subjected to typical input conditions. Training data can be extracted either from recorded video or from a combination of weapon and missile positioning data. The video camera can be located on the weapon or attached to a through-sight adapter. No knowledge of the missile flight transfer function is used in the process. The data is fed to a three-layer back-propagation neural network. The network is configured within a standard spreadsheet application and is optimized with the built-in solver functions. The structure of the network, the selected inputs and outputs, as well as training data, output data after training, and output data when embedded in the simulator are presented.
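The core idea, fitting a three-layer back-propagation network to observed input/output pairs without any knowledge of the underlying transfer function, can be sketched as follows. This is an illustrative toy, not the paper's spreadsheet implementation: the synthetic data, layer sizes, learning rate, and the assumed mapping from command deflection and current missile position to next position are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the weapon/missile positioning data:
# input = (command deflection, current missile position),
# output = next missile position. The dynamics here are fabricated.
X = rng.uniform(-1, 1, size=(200, 2))
y = (0.8 * X[:, 1] + 0.2 * np.tanh(X[:, 0])).reshape(-1, 1)

# Three-layer network: input, tanh hidden layer, linear output.
n_in, n_hid, n_out = 2, 6, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

lr = 0.1
for epoch in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y  # gradient of mean-squared error w.r.t. the output

    # Backward pass: propagate the error through each layer.
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)   # tanh derivative
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)

    # Gradient-descent update (the role played by the spreadsheet solver).
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.5f}")
```

Once trained, the same forward pass can be evaluated step by step inside a real-time simulator loop, which is far cheaper than integrating the full flight dynamics.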