Change detection is one of the most important tasks when unmanned aerial vehicles (UAVs) are used for video
reconnaissance and surveillance. In this paper, we address changes on a short time scale, i.e., observations taken
within time distances of a few hours. Each observation is a short video sequence corresponding to a near-nadir
overflight of the UAV above the area of interest; relevant changes are, e.g., recently added or removed objects.
The change detection algorithm has to distinguish between relevant and non-relevant changes. Examples of non-relevant
changes are variable objects such as trees, and compression or transmission artifacts. To enable the use of
automatic change detection within the interactive workflow of a UAV video exploitation system, an evaluation and
assessment procedure has to be performed. Large video data sets that contain many relevant objects with varying scene
backgrounds and varying influence parameters (e.g., image quality, sensor and flight parameters), including image
metadata and ground truth data, are necessary for a comprehensive evaluation. Since the acquisition of real video data is
limited by cost and time constraints, from our point of view the generation of synthetic data by simulation tools has to
be considered.
In this paper, the processing chain of Saur et al. (2014) and the interactive workflow for video change detection are
described. We have selected the commercial simulation environment Virtual Battle Space 3 (VBS3) to generate synthetic
data. For an experimental setup, an example scenario “road monitoring” has been defined and several video clips have
been produced with varying flight and sensor parameters and varying objects in the scene. Image registration and change
mask extraction, both components of the processing chain, are applied to corresponding frames of different video clips.
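The change-mask step applied to a pair of registered frames can be sketched as follows. This is a minimal illustration, assuming two already-registered grayscale frames given as NumPy arrays; the threshold value and the 3x3 majority-vote noise suppression are illustrative choices, not the components of the actual processing chain:

```python
import numpy as np

def change_mask(ref, cur, thresh=30):
    """Binary change mask from two registered grayscale uint8 frames.

    Pixels whose absolute intensity difference exceeds `thresh` are
    marked as changed; a majority vote over the 3x3 neighbourhood then
    suppresses isolated pixels (a simple stand-in for the morphological
    filtering a real processing chain would apply to rendering noise).
    """
    diff = np.abs(ref.astype(np.int16) - cur.astype(np.int16))
    raw = diff > thresh

    # Count changed pixels in each 3x3 neighbourhood by summing
    # shifted copies of the padded raw mask.
    votes = np.zeros(raw.shape, dtype=np.int8)
    padded = np.pad(raw, 1)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            votes += padded[dy:dy + raw.shape[0], dx:dx + raw.shape[1]]

    # Keep a changed pixel only if most of its neighbourhood changed too.
    return raw & (votes >= 5)
```

In a full pipeline, the registration step preceding this would warp one frame onto the other (e.g., via a homography estimated from feature correspondences) before the differencing is applied.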
For the selected examples, the images could be registered, the modelled changes could be extracted, and the artifacts of
the image rendering considered as noise (slight differences in heading angles, disparity of vegetation, 3D parallax) could
be suppressed. We conclude that these image data can be considered realistic enough to serve as evaluation data
for the selected processing components. Future work will extend the evaluation to other influence parameters and may
include the human operator for mission planning and sensor control.