In this paper, we describe the problem of efficiently supplying high-level fusion services (situation and impact
assessment) with adequate information using semantic technology, and formulate it as an optimization problem.
We begin by discussing situation awareness and the need for computer tools that assist human analysts and decision
makers with their sense-making. Such tools are necessary in part because of the vast amount of information that is
available for analysis in today's command and control systems: the human operators need help to sort out the relevant
parts. This filtering requirement is, however, not limited to humans: automatic or semi-automatic fusion tools also
need to limit the information they use in their processing. Simple filtering could be based on geographical
location, but as the number of advanced fusion services in the command and control system grows, more
sophisticated techniques are needed. We describe the information supply process when dealing with several (possibly
heterogeneous) sources of differing quality and describe the concepts of information view and information scope. We
describe how semantic queries can be used to achieve such filtering, and in particular describe an implementation of
this approach in Impactorium, a framework tool for situation and impact assessment developed by FOI. The threat models in
Impactorium previously relied solely on simple indicator tags for information supply; adding semantic queries to the
threat models makes the information supply more robust. The paper concludes with a summary and some discussion of future work
in this area.
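The difference between flat indicator tags and semantic queries can be illustrated with a small sketch. The triple store, predicates, and query below are invented stand-ins, not Impactorium's actual data model; they only show how a query over structured relations can select a report that a simple tag match would treat no differently from any other.

```python
# Minimal sketch of semantic filtering for information supply.
# All triples, predicates and the query are hypothetical examples.

TRIPLES = [
    ("report1", "mentions", "truck17"),
    ("truck17", "type", "FuelTruck"),
    ("FuelTruck", "subClassOf", "Vehicle"),
    ("report2", "mentions", "person3"),
    ("person3", "type", "Civilian"),
]

def match(pattern, triple, binding):
    """Unify one (s, p, o) pattern (variables start with '?') with a triple."""
    new = dict(binding)
    for p, t in zip(pattern, triple):
        if p.startswith("?"):
            if new.get(p, t) != t:
                return None
            new[p] = t
        elif p != t:
            return None
    return new

def query(patterns, triples, binding=None):
    """All variable bindings satisfying a conjunction of triple patterns."""
    if binding is None:
        binding = {}
    if not patterns:
        return [binding]
    results = []
    for triple in triples:
        b = match(patterns[0], triple, binding)
        if b is not None:
            results.extend(query(patterns[1:], triples, b))
    return results

# Semantic query: reports mentioning anything whose type is a subclass of Vehicle.
hits = query([("?r", "mentions", "?x"),
              ("?x", "type", "?t"),
              ("?t", "subClassOf", "Vehicle")], TRIPLES)
print(sorted({b["?r"] for b in hits}))
```

A tag-based filter would need the literal tag "vehicle" on each report, whereas the query above reaches report1 through the class hierarchy.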
The aim of maritime surveillance systems is to detect threats early enough to take appropriate actions. We present the
results of a study on maritime domain awareness performed during the fall of 2008. We analyze an identified capability
gap of worldwide surveillance in the maritime domain, and report from a user workshop addressing the identified gap.
We describe a SMARTracIn concept system that integrates information from surveillance systems with background
knowledge on normal conditions to help users detect and visualize anomalies in vessel traffic. We consider land-based
systems that cover the coastal waters as well as airborne, space-borne and ship-based systems that cover the open sea. Sensor data are
combined with intelligence information from ship reporting systems and databases. We describe how information fusion,
anomaly detection and semantic technology can be used to help users achieve more detailed maritime domain awareness.
Human operators are a vital part of this system and should be active components in the fusion process. We focus on the
problem of detecting anomalous behavior in ocean-going traffic. This requires the ability to identify vessels that
enter areas covered by sensors, as well as the use of information management systems that allow us to quickly find all
relevant information.
Sensor allocation and threat analysis are difficult fusion problems that can sometimes be approximately solved using simulations of the
future movement of adversary units. In addition to requiring detailed motion models, such simulations also require large amounts of computational resources, since a large number of possibilities
must be examined. In this paper, we extend our previously introduced framework for performing such simulations more efficiently. The framework is based on defining equivalence classes of future paths of a set of units. In the simplest case, two paths are considered equivalent if they give rise to the same set of observations. For sensor management, each considered sensor plan thus entails an equivalence relation on the set of future paths. This can be used to significantly reduce the number of "alternative futures" that need to be considered in the simulation. For threat analysis, the equivalence relation can instead be based on the perceived threat against own units. We describe how the equivalence classes induced by such relations could be used to improve the visualization of threat analysis systems. User interaction can also be used to refine the equivalence classes; we argue that such interaction will be essential for international operations, where it is difficult to define actors and targets.
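The simplest case described above, paths equivalent when they yield the same observations, can be sketched in a few lines: sample candidate futures and group them by the detections a given sensor plan would produce. The grid world, paths, and sensor footprints below are invented toy data, not the framework's actual representation.

```python
from collections import defaultdict

# Sketch: equivalence classes of future paths under a sensor plan.
# A path is a sequence of grid cells; the sensor plan lists the set of
# cells observed at each time step (toy data, for illustration only).

sensor_plan = [{(0, 0), (0, 1)}, {(1, 1)}]   # cells covered at t=0 and t=1

def observations(path, plan):
    """Detections (t, cell) a path would generate under the sensor plan."""
    return frozenset((t, cell) for t, cell in enumerate(path)
                     if t < len(plan) and cell in plan[t])

def equivalence_classes(paths, plan):
    """Group paths that produce identical observation sets."""
    classes = defaultdict(list)
    for path in paths:
        classes[observations(path, plan)].append(path)
    return classes

paths = [
    [(0, 0), (1, 1)],   # seen at both time steps
    [(0, 0), (2, 2)],   # seen only at t=0
    [(0, 1), (2, 2)],   # seen only at t=0, but in a different cell
    [(2, 2), (2, 2)],   # never seen
    [(3, 3), (2, 2)],   # never seen: equivalent to the path above
]
classes = equivalence_classes(paths, sensor_plan)
print(len(paths), "paths fall into", len(classes), "equivalence classes")
```

Only one representative per class needs to be simulated in detail, which is the source of the computational savings; for threat analysis, `observations` would be replaced by a function returning the perceived threat against own units.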
Many optimization problems that arise in multi-target tracking and fusion applications are known to be NP-complete, i.e., believed to have worst-case complexities that are exponential in problem size. Recently, many such NP-complete problems have been shown to display threshold phenomena: it is possible to define a parameter such that the probability of a random problem instance having a solution jumps from 1 to 0 at a specific value of the parameter. It has also been found that the amount of resources needed to solve a problem instance peaks at the transition point.
Among the problems found to display this behavior are graph coloring (also known as clustering, relevant for multi-target tracking), satisfiability (which occurs in resource allocation and planning problems), and the travelling salesperson problem.
Physicists studying these problems have found intriguing similarities to phase transitions in spin models of statistical mechanics. Many
methods originally developed to analyze spin glasses have been applied to explain some of the properties of the behavior at the transition point. It turns out that the transition happens because the fitness landscape of the problem changes as the parameter is varied. Some algorithms have been introduced that exploit this knowledge of the structure of the fitness landscape. In this paper, we review some of the experimental and theoretical work on threshold phenomena in optimization problems and indicate how optimization problems from tracking and sensor resource allocation could be analyzed using these results.
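The threshold behavior is easy to reproduce at toy scale. The sketch below estimates, by brute force, the probability that a random 3-SAT instance is satisfiable on either side of the well-known critical clause-to-variable ratio of roughly 4.27; the instance size and trial counts are chosen only to keep the run short, and finite-size effects broaden the transition considerably at this scale.

```python
import random

def random_3sat(n_vars, n_clauses, rng):
    """A random 3-SAT instance: clauses of (variable, required sign)."""
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Brute-force check over all 2^n assignments (fine for tiny n)."""
    for bits in range(1 << n_vars):
        if all(any(((bits >> v) & 1) == sign for v, sign in clause)
               for clause in clauses):
            return True
    return False

def fraction_satisfiable(n_vars, ratio, trials, rng):
    """Estimate P(solvable) at a given clause-to-variable ratio."""
    m = int(ratio * n_vars)
    return sum(satisfiable(n_vars, random_3sat(n_vars, m, rng))
               for _ in range(trials)) / trials

rng = random.Random(0)
below = fraction_satisfiable(10, 3.0, 20, rng)   # below the ~4.27 threshold
above = fraction_satisfiable(10, 5.5, 20, rng)   # above it
print(f"P(sat) at ratio 3.0: {below:.2f}, at ratio 5.5: {above:.2f}")
```

The ratio of clauses to variables plays the role of the order parameter; instances near the threshold are also where the brute-force search spends the most effort, mirroring the resource peak noted above.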
Consider the problem of tracking a set of moving targets. Apart from the tracking result, it is often important to know where the tracking fails, either to steer sensors to that part of the state-space, or to inform a human operator about the status and quality of the obtained information. An intuitive quality measure is the correlation between two tracking results based on uncorrelated observations. In the case of Bayesian trackers such a correlation measure could be the Kullback-Leibler divergence. We focus on a scenario with a large number of military units moving in some terrain. The units are observed by several types of sensors and
"meta-sensors" with force aggregation capabilities. The sensors register units of different sizes. Two separate multi-target probability hypothesis density (PHD) particle filters are used to track units of one type (e.g., companies) and their sub-units
(e.g., platoons), respectively, based on observations of units of those sizes. Each observation is used in one filter only. Although the state-space may well be the same in both filters, the posterior
PHD distributions are not directly comparable -- one unit might
correspond to three or four spatially distributed sub-units. Therefore, we introduce a mapping function between distributions for different unit sizes, based on doctrine knowledge of unit configuration. The mapped distributions can now be compared -- locally or globally -- using some measure, which gives the correlation between two PHD distributions in a bounded volume of the state-space. To locate areas where the tracking fails, a discretized quality map of the state-space can be generated by applying the measure locally to different parts of the space.
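The local comparison can be sketched as follows. The toy code discretizes two intensity maps on a grid, maps the company-level intensity to the platoon level with a constant factor, and computes a generalized Kullback-Leibler (I-)divergence cell by cell. The grid, intensities, and the fixed mapping factor are invented stand-ins for the doctrine-based mapping function described above.

```python
import math

# Sketch: a discretized quality map comparing two PHD intensity maps.
# Intensities and the mapping factor are toy values; the actual mapping
# is based on doctrine knowledge of unit configuration.

SUBUNITS_PER_UNIT = 3   # assumption: one company corresponds to ~3 platoons

def map_company_to_platoon(company_phd):
    """Map company-level intensity to an expected platoon-level intensity."""
    return [x * SUBUNITS_PER_UNIT for x in company_phd]

def local_divergence(a, b, eps=1e-12):
    """Generalized KL (I-)divergence per cell, valid for unnormalized
    intensities: a*log(a/b) - a + b, which is >= 0 and 0 iff a == b."""
    return [x * math.log((x + eps) / (y + eps)) - x + y for x, y in zip(a, b)]

company_phd = [0.9, 0.1, 0.0, 0.0]   # company filter, four grid cells
platoon_phd = [2.7, 0.3, 0.0, 1.5]   # platoon filter, same grid

quality_map = local_divergence(map_company_to_platoon(company_phd), platoon_phd)
worst_cell = max(range(len(quality_map)), key=lambda i: quality_map[i])
print("quality map:", [round(q, 3) for q in quality_map], "worst cell:", worst_cell)
```

In this toy example the last cell disagrees most: the platoon filter sees mass there that the mapped company filter does not, which is exactly the kind of cell the quality map would flag for sensor steering or operator attention.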
We describe the recently introduced extremal optimization algorithm and apply it to target detection and association problems arising in pre-processing for multi-target tracking. Extremal optimization
is based on the concept of self-organized criticality, and has been used successfully for a wide variety of hard combinatorial optimization problems. It is an approximate local search algorithm that achieves its success by utilizing avalanches of local changes that allow it to explore a large part of the search space. It is somewhat similar to genetic algorithms, but works on a single candidate solution, selecting and changing the worst-performing components of its representation. The algorithm is based on processes of self-organization
found in nature. The simplest version of it has no free parameters, while the most widely used and most efficient version has one parameter. For extreme values of this parameter, the method reduces to hill-climbing and random-walk searches, respectively. Here we consider the problem of pre-processing for multiple target tracking when the number of sensor reports is very large and reports arrive in large bursts. In this case, it is sometimes necessary to pre-process reports before sending them to tracking modules in the fusion system. The pre-processing step associates reports with known
tracks (or initializes new tracks for reports on objects that have not been seen before). It could also be used as a pre-processing step before clustering, e.g., in order to test how many clusters to use. The pre-processing is done by solving an approximate version of the original problem. In this approximation, not all pair-wise conflicts are calculated. The approximation relies on knowing how many such pair-wise conflicts need to be computed. To determine this, results on phase transitions occurring when coloring (or clustering) large random instances of a particular graph ensemble are used.
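A minimal version of the one-parameter (tau) variant, applied to graph coloring, the clustering formulation mentioned above, might look like the sketch below. The graph, number of colors, parameter value, and step count are illustrative only; report-to-track association would replace vertices with reports and color conflicts with pairwise association conflicts.

```python
import random

def conflicts(edges, colors):
    """Number of edges whose endpoints share a color."""
    return sum(colors[u] == colors[v] for u, v in edges)

def tau_eo_coloring(n, edges, n_colors, steps, tau=1.4, seed=0):
    """Tau-EO sketch: rank vertices from worst (most conflicts) to best,
    pick one with probability ~ rank^-tau, and re-color it unconditionally.
    The unconditional change is what produces the avalanches of updates."""
    rng = random.Random(seed)
    colors = [rng.randrange(n_colors) for _ in range(n)]
    best, best_c = list(colors), conflicts(edges, colors)
    weights = [(rank + 1) ** -tau for rank in range(n)]   # power-law selection
    for _ in range(steps):
        fitness = [0] * n            # -(number of conflicting edges at vertex)
        for u, v in edges:
            if colors[u] == colors[v]:
                fitness[u] -= 1
                fitness[v] -= 1
        worst_first = sorted(range(n), key=lambda i: fitness[i])
        vertex = rng.choices(worst_first, weights=weights)[0]
        colors[vertex] = rng.randrange(n_colors)
        c = conflicts(edges, colors)
        if c < best_c:                # keep the best configuration seen
            best, best_c = list(colors), c
    return best, best_c

# Toy instance: a 5-cycle plus one chord, which is 3-colorable.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
coloring, remaining = tau_eo_coloring(5, edges, n_colors=3, steps=500)
print("remaining conflicts:", remaining)
```

Note that, unlike hill-climbing, the accepted move may worsen the current solution; only the separately stored best configuration is monotone, which is why the search can escape local minima without a temperature schedule.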