The paper defines three distinct classes of binary fusion, extending a first-principles theoretical fusion
framework that has been under development for several years. The paper focuses on non-traditional data sources
because of their relevance to the development of a comprehensive fusion theory. The three fusion classes are defined and
discussed relative to both conventional hard-target and text-based information fusion applications. The concept of
entity specificity is then introduced to generalize the three-class binary fusion problem. Finally, class-1 and class-2
fusion products from a prototype fusion system developed for the US Department of the Army are presented to clarify the
concepts; class-3 fusion applications to soft data will be addressed in a future paper.
A fusion system that accommodates both text-based extracted information and
more conventional sensor-derived input has been developed and demonstrated in a terrorist-attack
scenario as part of the Empire Challenge (EC) 09 Exercise. Although the fusion system was
developed to support Army military analysts, the system, based on a set of foundational fusion
principles, has direct applicability to the Department of Homeland Security (DHS), defense, law
enforcement, and other domains.
Several novel fusion technologies and applications were demonstrated in EC09. One such
technology is location normalization, which accommodates both fuzzy semantic expressions, such as
"behind Library A" or "across the street from the marketplace," and traditional spatial
representations. Additionally, the fusion system provides a range of fusion products not
supported by traditional fusion algorithms. Many of these additional capabilities have direct
applicability to DHS.
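The location-normalization idea can be illustrated with a minimal sketch. The landmark coordinates, offset table, and simple prefix grammar below are hypothetical assumptions for exposition, not the fielded system's implementation:

```python
# Illustrative sketch: resolving a fuzzy semantic location expression to an
# approximate coordinate region. All landmarks, offsets, and the grammar are
# invented for this example.

LANDMARKS = {
    "library a": (34.0520, -118.2440),     # illustrative lat/lon
    "the marketplace": (34.0500, -118.2400),
}

# Rough coordinate offsets (degrees) associated with relative-position cues.
RELATION_OFFSETS = {
    "behind": (-0.0010, 0.0),
    "in front of": (0.0010, 0.0),
    "across the street from": (0.0, 0.0005),
}

def normalize_location(expression):
    """Map a fuzzy expression to (lat, lon, radius_deg), or None if unknown."""
    expr = expression.lower().strip()
    for relation, (dlat, dlon) in RELATION_OFFSETS.items():
        if expr.startswith(relation):
            landmark = expr[len(relation):].strip()
            if landmark in LANDMARKS:
                lat, lon = LANDMARKS[landmark]
                # A fuzzy cue yields a region, not a point, so attach a radius.
                return (lat + dlat, lon + dlon, 0.0005)
    return None

print(normalize_location("behind Library A"))
print(normalize_location("somewhere unknown"))
```

Both fuzzy and traditional inputs can then feed the same downstream association logic, since each resolves to a coordinate plus an uncertainty region.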
A formal test of the fusion system was performed during the EC09 exercise. The system
demonstrated that it was able to (1) automatically form tracks, (2) help analysts visualize the
behavior of individuals over time, (3) link key individuals based on both explicit message-based
information and discovered (fusion-derived) implicit relationships, and (4) suggest
possible individuals of interest based on their association with high-value individuals (HVIs) and
user-defined key locations.
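Capability (3), combining explicitly stated links with implicit links discovered through fusion, can be sketched as follows. The entity names, sighting data, and simple co-location rule are illustrative assumptions, not the system's actual data model:

```python
from collections import defaultdict
from itertools import combinations

# Explicit link: a relationship stated directly in a message.
explicit_links = {("Ali", "Badr")}

# Sighting reports: (person, location) pairs extracted from messages.
sightings = [
    ("Ali", "market"), ("Badr", "market"),
    ("Badr", "safehouse"), ("Carl", "safehouse"),
]

def derive_links(explicit, sightings):
    """Union explicit links with implicit links inferred from co-location."""
    people_at = defaultdict(set)
    for person, loc in sightings:
        people_at[loc].add(person)
    implicit = set()
    for people in people_at.values():
        # Any pair seen at the same location becomes an implicit link.
        for pair in combinations(sorted(people), 2):
            implicit.add(pair)
    return set(explicit) | implicit

print(sorted(derive_links(explicit_links, sightings)))
```

Here "Carl" is never mentioned alongside "Ali" or "Badr" in any message, yet a fusion-derived link to "Badr" emerges from the shared safehouse sighting, which is the sense in which implicit relationships are discovered rather than stated.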
The paper presents a formal approach for mapping from an entity-relationship model of a selected application domain
to the functional components of the JDL fusion model. The resultant functional decomposition supports both
traditional sensor input and human-generated text input. To demonstrate the generality of the mapping, examples are
offered for three distinct application domains: (1) Intelligence Fusion, (2) Aircraft Collision Avoidance, and (3)
Robotic Control. The first-principles-based approach begins by viewing fusion as the composition of similar and
dissimilar entities. Next, the fusion triple (entity, location, time) is defined, where entities can be either physical or
non-physical. Coupling the fusion triple with this generalized view of fusion leads to the identification of eight base-level
fusion services that serve as the building blocks of individual composition products.
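A minimal rendering of the fusion triple makes the composition view concrete. The field types and the crude similarity rule below are assumptions for exposition, not the paper's definitions, and the eight base-level services are not reproduced here:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class FusionTriple:
    """The fusion triple (entity, location, time)."""
    entity: str                               # physical ("vehicle-12") or non-physical ("cell-A")
    location: Optional[Tuple[float, float]]   # (lat, lon); None for non-spatial entities
    time: float                               # observation time, e.g. seconds since epoch

def similar(a: FusionTriple, b: FusionTriple,
            max_dist: float = 0.01, max_dt: float = 60.0) -> bool:
    """Crude similarity test: same entity label, or co-located in space and time."""
    if a.entity == b.entity:
        return True
    if a.location is None or b.location is None:
        return False
    dlat = a.location[0] - b.location[0]
    dlon = a.location[1] - b.location[1]
    return (dlat * dlat + dlon * dlon) ** 0.5 <= max_dist and abs(a.time - b.time) <= max_dt
```

Under this view, composing similar triples yields a single fused entity (e.g., a track), while composing dissimilar triples yields a relationship between distinct entities.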
Extensions to a previously developed service-based fusion process model are presented. The model
accommodates (1) traditional sensor data and human-generated input, (2) streaming and non-streaming data feeds, and
(3) the fusion of both physical and non-physical entities. More than a dozen base-level fusion services are identified.
These services provide the foundational functional decomposition of levels 0 through 2 of the JDL fusion model. Concepts
such as clustering, link analysis, and database mining, which have traditionally been only loosely associated with the fusion
process, are shown to play key roles within this fusion framework. Additionally, the proposed formulation extends the
concepts of tracking and cross-entity association to non-physical entities, and supports effective exploitation of a
priori and derived context knowledge. Finally, the proposed framework is shown to support set-theoretic properties, such
as equivalence and transitivity, as well as the development of a pedigree summary metric that characterizes the
informational distance between individual fused products and the source data.
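One way such a pedigree summary metric could be realized is sketched below, under the assumption that each fused product records its inputs and that informational distance is taken as the maximum number of fusion steps back to source data. This tree representation and depth-based measure are simplifying assumptions, not the paper's actual metric:

```python
# A product is either a source leaf, ("source", identifier),
# or a fused node, ("fused", [child products]).

def pedigree_distance(product):
    """Maximum number of fusion steps from this product back to raw source data."""
    kind = product[0]
    if kind == "source":
        return 0
    _, children = product
    return 1 + max(pedigree_distance(c) for c in children)

raw1 = ("source", "radar-07")
raw2 = ("source", "report-113")
track = ("fused", [raw1, raw2])         # first-level composition of two sources
assessment = ("fused", [track, raw2])   # second-level composition reusing a source

print(pedigree_distance(assessment))    # → 2
```

A product fused directly from sensor reports scores 1, while products built from other fused products score higher, giving an analyst a quick indication of how far an assessment sits from the underlying observations.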