Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of
responsibility. This mandate derives from the need to defend sovereignty, protect infrastructure, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade as commercial shipping has turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is
mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect
and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational
facts from a variety of sensor data streams. Unfortunately, many of the facts of interest simply cannot be observed; the
operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As
they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to
support them by inferring the necessary facts, ultimately providing indications and warning on a small number of
anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype
of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.
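The fire-and-explain pattern of such a rule-based system can be illustrated with a minimal sketch; the vessel fields, rule conditions, and thresholds below are illustrative assumptions, not the prototype's actual knowledge base:

```python
# Minimal sketch of rule-based anomaly detection over vessel facts.
# Fields, rules, and thresholds are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Vessel:
    name: str
    speed_kn: float          # reported speed in knots
    ais_on: bool             # AIS transponder transmitting
    in_shipping_lane: bool   # inside a recognized traffic lane
    alerts: list = field(default_factory=list)

# Each rule: (condition over the facts, explanation shown to the analyst)
RULES = [
    (lambda v: not v.ais_on,
     "AIS transponder silent"),
    (lambda v: not v.in_shipping_lane and v.speed_kn < 1.0,
     "loitering outside recognized shipping lanes"),
]

def evaluate(vessel: Vessel) -> Vessel:
    """Fire every matching rule; attach its explanation to the vessel."""
    for condition, explanation in RULES:
        if condition(vessel):
            vessel.alerts.append(explanation)
    return vessel

v = evaluate(Vessel("M/V Example", speed_kn=0.3, ais_on=True,
                    in_shipping_lane=False))
print(v.alerts)  # ['loitering outside recognized shipping lanes']
```

Attaching the explanation alongside the alert is what lets the system justify why a vessel may be of interest rather than only flagging it.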
This paper presents a MISR Visualization Experimental Environment which provides support to the development,
evaluation, experimentation, and transitioning of information visualization approaches to emulate the essential elements
of a future RMP. The environment provides a means of integrating and sharing the output of visualization tools; storing,
accessing and managing showcase examples of visual representations via an underlying visualization reference model;
and providing access to underlying data sources supplied through simulation, representative data, and operational data.
Visualization design and experimentation activities are also briefly introduced. The MISR experimental environment may be used to characterize the various techniques evaluated, and the results of experiments will be fed back into the environment's knowledge base. It is expected that the experimentation undertaken, supported by a MISR
experimental environment, will identify novel visualization approaches to be integrated in the future RMP and have the
potential to enhance maritime domain awareness.
Defence R&D Canada is developing a Collaborative Knowledge Exploitation Framework (CKEF) to support the
analysts in efficiently managing and exploiting relevant knowledge assets to achieve maritime domain awareness in joint
operations centres of the Canadian Forces. While developing the CKEF, anomaly detection has been clearly recognized
as an important aspect requiring R&D. An activity has thus been undertaken to implement, within the CKEF, a proof-of-concept
prototype of a rule-based expert system to support the analysts regarding this aspect. This expert system has to
perform automated reasoning and output recommendations (or alerts) about maritime anomalies, thereby supporting the
identification of vessels of interest and threat analysis. The system must contribute to a lower false alarm rate and a
better probability of detection in drawing the operators' attention to vessels worthy of scrutiny. It must provide
explanations as to why the vessels may be of interest, with links to resources that help the operators dig deeper.
Mechanisms are necessary for the analysts to fine-tune the system, and for the knowledge engineer to maintain the knowledge base as the expertise of the operators evolves. This paper portrays the anomaly detection prototype and describes: the knowledge acquisition and elicitation session conducted to capture the experts' know-how; the formal knowledge representation enablers and the ontology required for the aspects of the maritime domain that are relevant to anomaly detection, vessels of interest, and threat analysis; the prototype's high-level design and implementation on the service-oriented architecture of the CKEF; and other findings and results of this ongoing activity.
Situation awareness has emerged as an important concept in military and public security environments. Situation analysis is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of situation awareness for the decision maker(s). It is well established that information fusion, defined as the process of utilizing one or more information sources over time to assemble a representation of aspects of interest in an environment, is a key enabler to meeting the demanding requirements of situation analysis. However, although information fusion is important, developing and adopting a knowledge-centric view of situation analysis should provide a more holistic perspective of this process. This is based on the notion that <i>awareness</i> ultimately has to do with <i>having knowledge of something</i>. Moreover, not all of the situation elements and relationships of interest are directly observable. Those aspects of interest that cannot be observed must be inferred, i.e., derived as a conclusion from facts or premises, or by reasoning from evidence. This paper discusses aspects of knowledge, and how it can be acquired from experts, formally represented and stored in knowledge bases to be exploited by computer programs, and validated. Knowledge engineering is reviewed, with emphasis given to cognitive and ontological engineering. Facets of reasoning are discussed, along with inferencing methods that can be used in computer applications. Finally, combining elements of information fusion and knowledge-based systems, an overall approach and framework for the building of situation analysis support systems is presented.
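The inferencing discussed above, deriving unobservable situation elements as conclusions from facts or premises, can be sketched as a tiny forward-chaining loop that fires rules until no new fact is derived; the facts and rules here are hypothetical examples, not drawn from the paper:

```python
# Minimal forward-chaining sketch: derive new facts from premises
# until a fixpoint is reached. Facts and rules are illustrative.
def forward_chain(facts, rules):
    """rules: list of (premises frozenset, conclusion). Repeatedly fire
    any rule whose premises all hold until nothing new is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (frozenset({"loitering", "near_port"}), "possible_surveillance"),
    (frozenset({"possible_surveillance", "ais_silent"}), "vessel_of_interest"),
]
derived = forward_chain({"loitering", "near_port", "ais_silent"}, rules)
print(sorted(derived))
# ['ais_silent', 'loitering', 'near_port', 'possible_surveillance',
#  'vessel_of_interest']
```

Note that "vessel_of_interest" is only reachable through the intermediate conclusion "possible_surveillance", which is the chaining behaviour the loop exists to provide.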
Situation analysis is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of situation awareness, for the decision maker. Data fusion is a key enabler to meeting the demanding requirements of military situation analysis support systems. According to the data fusion model maintained by the Joint Directors of Laboratories' Data Fusion Group, impact assessment estimates the effects on situations of planned or estimated/predicted actions by the participants, including interactions between action plans of multiple players. In this framework, the appraisal of actual or potential threats is a necessary capability for impact assessment. This paper reviews and discusses in detail the fundamental concepts of threat analysis. In particular, threat analysis generally attempts to compute some threat value, for the individual tracks, that estimates the degree of severity with which engagement events will potentially occur. Presenting relevant tracks to the decision maker in some threat list, sorted from the most threatening to the least, is clearly in line with the cognitive demands associated with threat evaluation. A key parameter in many threat value evaluation techniques is the Closest Point of Approach (CPA). Along this line of thought, threatening tracks are often prioritized based upon which ones will reach their CPA first. Hence, the Time-to-CPA (TCPA), i.e., the time it will take for a track to reach its CPA, is also a key factor. Unfortunately, a typical assumption for the computation of the CPA/TCPA parameters is that the track velocity will remain constant. When a track is maneuvering, the CPA/TCPA values will change accordingly. These changes will in turn impact the threat value computations and, ultimately, the resulting threat list. This is clearly undesirable from a command decision-making perspective.
In this regard, the paper briefly discusses threat value stabilization approaches based on neural networks and other mathematical techniques.
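Under the constant-velocity assumption discussed above, CPA and TCPA have a closed form: with relative position r and relative velocity v, TCPA = -(r·v)/|v|² and the CPA distance is |r + TCPA·v|. A minimal 2-D sketch (ownship at the origin; names illustrative):

```python
# CPA/TCPA for a constant-velocity track relative to ownship at origin.
import math

def cpa_tcpa(rel_pos, rel_vel):
    """Return (cpa_distance, tcpa) for a track with relative position
    rel_pos and relative velocity rel_vel, both 2-D tuples."""
    rx, ry = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                       # no relative motion
        return math.hypot(rx, ry), 0.0
    tcpa = -(rx * vx + ry * vy) / v2    # time of closest approach
    tcpa = max(tcpa, 0.0)               # CPA already passed -> use now
    cx, cy = rx + tcpa * vx, ry + tcpa * vy
    return math.hypot(cx, cy), tcpa

# Track 10 km east with a 1 km northerly offset, heading due west
# at 0.25 km/s: it passes 1 km abeam in 40 s.
d, t = cpa_tcpa((10.0, 1.0), (-0.25, 0.0))
print(round(d, 3), round(t, 1))  # 1.0 40.0
```

Sorting tracks by ascending TCPA then yields exactly the kind of threat list described above, and it also makes plain why a maneuvering track, which changes `rel_vel` between updates, causes the list ordering to churn.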
Situation Awareness (SAW) is essential for commanders to conduct decision-making (DM) activities. Situation Analysis (SA) is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of SAW for the decision maker. Operational trends in warfare put the situation analysis process under pressure. This emphasizes the need for a real-time computer-based Situation Analysis Support System (SASS) to aid commanders in achieving the appropriate situation awareness, thereby supporting their response to actual or anticipated threats. Data fusion is clearly a key enabler for SA and a SASS. Since data fusion is used for SA in support of dynamic human decision-making, the exploration of SA concepts and the design of data fusion techniques must take into account human factor aspects in order to ensure a cognitive fit of the fusion system with the decision maker. Indeed, the tight integration of the human element with the SA technology is essential. Regarding these issues, this paper provides a description of CODSI (Command Decision Support Interface), an operational-like human-machine interface prototype for investigations in computer-based SA and command decision support. With CODSI, one objective was to apply recent developments in SA theory and information display technology to the problem of enhancing SAW quality. It thus provides a capability to adequately convey tactical information to command decision makers. It also supports the study of human-computer interactions for SA, and methodologies for SAW measurement.
Sensor Management (SM) has to do with how to best manage, coordinate and organize the use of sensing resources in a manner that synergistically improves the process of data fusion. Based on the contextual information, SM develops options for collecting further information, allocates and directs the sensors towards the achievement of the mission goals and/or tunes the parameters for the real-time improvement of the effectiveness of the sensing process. Conscious of the important role that SM has to play in modern data fusion systems, we are currently studying advanced SM concepts that would help increase the survivability of the current Halifax and Iroquois Class ships, as well as their possible future upgrades. For this purpose, a hierarchical scheme has been proposed for data fusion and resource management adaptation, based on control theory and within the process refinement paradigm of the JDL data fusion model, and taking into account the multi-agent model put forward by the SASS Group for the situation analysis process. The novelty of this work lies in the unified framework that has been defined for tackling the adaptation of both the fusion process and the sensor/weapon management.
This paper presents the progress of a collaborative effort between Canada and The Netherlands in analyzing multi-sensor data fusion systems, e.g. for potential application to their respective frigates. In view of the overlapping interest in studying and comparing the applicability and performance of advanced state-of-the-art Multi-Sensor Data Fusion (MSDF) techniques, the two research establishments involved have decided to join their efforts in the development of MSDF testbeds. This resulted in the so-called Joint-FACET, a highly modular and flexible series of applications that is capable of processing both real and synthetic input data. Joint-FACET allows the user to create and edit test scenarios with multiple ships, sensors and targets, generate realistic sensor outputs, and process these outputs with a variety of MSDF algorithms. These MSDF algorithms can also be tested using typical experimental data collected during live military exercises.
This paper presents a detailed discussion of clustering as applied to multiple hypothesis tracking (MHT). The combinatorial problem associated with forming multiple data association hypotheses can be reduced significantly by partitioning the entire set of system tracks and input data elements into separate clusters. Cluster management, a process that deals with cluster formation, merging, splitting and deletion, is thus motivated by the idea that a large tracking problem can be divided into a number of smaller problems that can be solved independently. The paper emphasizes the cluster splitting process, since it is the most difficult aspect of clustering while being an often neglected issue in the target tracking literature. The hypothesis dependencies that must be taken into account when one attempts to split the hypothesis tree of a cluster into two or more independent trees are discussed. This is an important issue since the hypotheses within a cluster must not interact with the hypotheses contained within other clusters for the MHT technique to remain consistent. A very efficient algorithm is described that performs a combined split-merge process simultaneously for all the clusters. The algorithm has been designed to avoid wasting computer resources when splitting clusters that should have been kept merged according to the most recent input data set. The dynamic data structure used to implement the hypothesis tree is described as a key element of the approach's efficiency. An example of cluster management is presented.
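The cluster-formation side of this process can be sketched with a union-find pass: any two tracks that gate on a common measurement must land in the same cluster so their association hypotheses can interact. This is a generic sketch of the partitioning idea only, not the combined split-merge algorithm of the paper; gating and the hypothesis trees themselves are assumed to happen elsewhere:

```python
# Partition tracks into independent MHT clusters via union-find:
# tracks sharing a gated measurement are merged into one cluster.
def form_clusters(associations):
    """associations: dict measurement -> list of gated track ids.
    Returns a list of clusters (sets of track ids) solvable independently."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for tracks in associations.values():
        for t in tracks:
            find(t)                         # register singletons too
        for t in tracks[1:]:
            union(tracks[0], t)             # shared measurement links tracks
    clusters = {}
    for t in parent:
        clusters.setdefault(find(t), set()).add(t)
    return list(clusters.values())

# m1 gates tracks 1 and 2, m2 gates tracks 2 and 3, m3 gates track 4 only;
# track 2 transitively ties 1 and 3 into a single cluster.
print(form_clusters({"m1": [1, 2], "m2": [2, 3], "m3": [4]}))
# [{1, 2, 3}, {4}]
```

Splitting, the hard case the paper focuses on, is the reverse operation: when later data no longer ties two subsets together, the cluster's hypothesis tree must be divided without breaking hypothesis dependencies, which a simple partition pass like this cannot decide on its own.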
This paper presents the results of a quantitative comparison of two architectural options in developing a multi-sensor data fusion system. One option is the centralized architecture: a single track file is maintained and updated using raw sensor measurements. The second option is the autonomous sensor fusion architecture: each sensor maintains its own track file. The sensor tracks are then transmitted to a central processor responsible for fusing this data to form a master track file. Various performance trade-offs will typically be required in the selection of the best multi-sensor data fusion architecture since each approach has different benefits and disadvantages. The emphasis for this study is given to measuring the quality of the fused tracks. The study was conducted with the CASE_ATTI (concept analysis and simulation environment for automatic target tracking and identification) testbed. This testbed provides the algorithm-level test and replacement capability required to conduct this kind of performance study.
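For the autonomous architecture, the step where two sensor-level tracks are combined into a master track can be illustrated, for a scalar state with independent errors, by inverse-covariance (information) weighting; this is a textbook sketch, not the fusion algorithm actually evaluated in CASE_ATTI:

```python
# Fuse two independent scalar track estimates by weighting each with
# the inverse of its variance; the fused variance is always smaller.
def fuse(x1, var1, x2, var2):
    """Return (fused_estimate, fused_variance)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return x, var

# Two equally accurate sensor tracks: the fused estimate is their mean
# and the uncertainty is halved.
x, var = fuse(10.0, 4.0, 12.0, 4.0)
print(x, var)  # 11.0 2.0
```

Real sensor tracks derived from the same target history are correlated rather than independent, which is one source of the architecture-level quality differences such a study has to measure.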
This paper presents simulations of IR and radar surveillance sensors to support an ongoing Multi-Sensor Data Fusion (MSDF) performance evaluation study for potential application to the Canadian Patrol Frigate midlife upgrade. The surveillance sensor models are used in an algorithm-level testbed that allows the investigation of advanced MSDF concepts. The sensor models take into account the sensors' design parameters and external environmental effects such as clutter, clouds, propagation and jamming. The latest findings regarding the dominant perturbing effects affecting sensor detection performance are included. The sensor models can be used to generate contacts and false alarms in scenarios for multi-sensor data fusion studies while a scenario is running.