The military planning process uses simulation to determine the appropriate course of action (COA) that will achieve a
campaign end state. However, due to the difficulty of developing and generating simulation-level COAs, only a few
COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based
to effects-based strategies, along with the complexities of fourth-generation warfare and asymmetric adversaries, has
placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment,
planners must be able to perform continuous, multiple "what-if" COA analyses. Scenario management and generation
are critical elements in achieving this goal. An effects-based scenario generation research project demonstrated the
feasibility of automated scenario generation techniques that support multiple stove-pipe and emerging broad-scope
simulations. This paper will discuss a case study in which the scenario generation capability was employed to support
COA simulations to identify plan effectiveness. The study demonstrated the value of multiple simulation
runs in evaluating how well alternate COAs achieve the overall campaign (metrics-based) objectives. The
paper will discuss how scenario generation technology can be employed to allow military commanders and mission
planning staff to understand the impact of command decisions on the battlespace of tomorrow.
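The multi-run evaluation described above can be sketched in a few lines. The paper's actual simulations and campaign metrics are not given here, so the COAs, the simulate() model, and the objective weights below are illustrative stand-ins: each COA is run many times under stochastic variation and scored against weighted, metrics-based campaign objectives.

```python
import random
from statistics import mean

# Hypothetical campaign objectives and weights (weights sum to 1.0).
CAMPAIGN_OBJECTIVES = {"infrastructure_degraded": 0.5,
                       "air_superiority": 0.3,
                       "civilian_impact_avoided": 0.2}

def simulate(coa, rng):
    """Stand-in for one stochastic simulation run of a COA.

    Returns a score in [0, 1] for each campaign metric."""
    return {metric: min(1.0, coa["effect"][metric] * rng.uniform(0.7, 1.1))
            for metric in CAMPAIGN_OBJECTIVES}

def evaluate(coa, runs=100, seed=0):
    """Average weighted objective achievement over many simulation runs."""
    rng = random.Random(seed)
    scores = []
    for _ in range(runs):
        outcome = simulate(coa, rng)
        scores.append(sum(w * outcome[m]
                          for m, w in CAMPAIGN_OBJECTIVES.items()))
    return mean(scores)

coas = [
    {"name": "COA-1", "effect": {"infrastructure_degraded": 0.9,
                                 "air_superiority": 0.6,
                                 "civilian_impact_avoided": 0.7}},
    {"name": "COA-2", "effect": {"infrastructure_degraded": 0.7,
                                 "air_superiority": 0.9,
                                 "civilian_impact_avoided": 0.8}},
]
# Rank alternate COAs by average achievement of campaign objectives.
ranked = sorted(coas, key=lambda c: evaluate(c), reverse=True)
for coa in ranked:
    print(coa["name"], round(evaluate(coa), 3))
```

The point of the sketch is the loop structure, not the numbers: automated scenario generation makes it cheap to populate `coas` with many alternatives and rerun the evaluation as conditions change.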
This paper will evaluate the feasibility of constructing a system to support intelligence analysts engaged in counter-terrorism. It will discuss the use of emerging techniques to evaluate a large-scale threat data repository (or Infosphere) and to compare analyst-developed models in order to identify and discover potential threat-related activity, with an uncertainty metric used to evaluate each threat. The system will also employ psychological (or intent) modeling to incorporate combatant (i.e., terrorist) beliefs and intent. The paper will explore the feasibility of constructing a hetero-hierarchical (a hierarchy of more than one kind or type, characterized by loose connection/feedback among elements of the hierarchy) agent-based framework, or "family of agents," to support "evidence retrieval," defined as combing or searching the threat data repository and returning information with an uncertainty metric. The counter-terrorism threat prediction architecture will be guided by a series of models constructed to represent threat operational objectives, potential targets, and terrorist objectives. The approach compares these model representations against information retrieved by the agent family to isolate or identify patterns that match within reasonable measures of proximity. The central areas of discussion will be the construction of an agent framework to search the available threat-related information repository; the evaluation of results against models representing the cultural foundations, mindset, sociology, and emotional drive of typical threat combatants (i.e., the mind and objectives of a terrorist); and the development of evaluation techniques to compare result sets with the models representing threat behavior and threat targets.
The applicability of concepts surrounding Modeling Field Theory (MFT) will be discussed as the basis of this research, both for developing proximity measures between the models and result sets and for providing feedback in support of model adaptation (learning). The increasingly complex demands facing analysts evaluating activity that threatens the security of the United States make agent-based data collection (fusion) a promising area. This paper will discuss a system to support the collection and evaluation of potential threat activity, as well as an approach for the presentation of that information.
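An MFT-style proximity measure with adaptation can be sketched as follows. This is a minimal illustration, not the paper's implementation: each threat model is represented as a Gaussian over a feature space, retrieved evidence items are soft-assigned to models, and each model's parameters adapt toward the evidence it best explains. The feature names and values are hypothetical.

```python
import math

def likelihood(model, item):
    """Gaussian proximity between a model's expected features and an item."""
    d2 = sum((model["mean"][k] - item[k]) ** 2 for k in model["mean"])
    return math.exp(-d2 / (2 * model["sigma"] ** 2))

def associate(models, items):
    """Fuzzy association weights per item, normalized over models."""
    weights = []
    for item in items:
        ls = [likelihood(m, item) for m in models]
        total = sum(ls) or 1.0
        weights.append([l / total for l in ls])
    return weights

def adapt(models, items, lr=0.5):
    """One learning step: move each model's mean toward its weighted evidence."""
    w = associate(models, items)
    for j, model in enumerate(models):
        norm = sum(w[i][j] for i in range(len(items))) or 1.0
        for k in model["mean"]:
            target = sum(w[i][j] * items[i][k] for i in range(len(items))) / norm
            model["mean"][k] += lr * (target - model["mean"][k])

# Two hypothetical threat models and two retrieved evidence items.
models = [{"mean": {"funding": 0.2, "mobility": 0.8}, "sigma": 0.5},
          {"mean": {"funding": 0.9, "mobility": 0.3}, "sigma": 0.5}]
evidence = [{"funding": 0.25, "mobility": 0.75},
            {"funding": 0.85, "mobility": 0.35}]
adapt(models, evidence)
```

The `likelihood` values double as the uncertainty metric attached to retrieved evidence, and repeated calls to `adapt` correspond to the feedback loop that MFT supplies for model learning.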
This paper will discuss a simulation approach based upon a family of agent-based models. As the demands placed upon simulation technology grow, driven by applications such as Effects Based Operations (EBO), the evaluation of indicators and warnings surrounding homeland defense, and commercial needs such as financial risk management, current single-thread simulations will continue to show serious deficiencies. The types of “what if” analysis required to support these applications demand rapidly re-configurable approaches capable of aggregating large models that incorporate multiple viewpoints. Agent technology promises to provide a broad spectrum of models incorporating differing viewpoints through a synthesis of a collection of models. Each model would provide estimates to the overall scenario based upon its particular measure or aspect. An agent framework, denoted the “family,” would provide a common ontology in support of differing aspects of the scenario. This approach permits modeling to move from viewing the problem as a single-thread simulation to taking into account multiple viewpoints from different models. Even as models are updated or replaced, the agent approach permits their rapid inclusion in new or modified simulations. In this approach, the synthesis of a variety of low- and high-resolution information requires a family of models. Each agent “publishes” its support for a given measure, and each model provides its own estimates on the scenario based upon its particular measure or aspect. If more than one agent provides the same measure (e.g., cognitive), the results from those agents are combined to form an aggregate measure response. The objective would be to inform and help calibrate a qualitative model, rather than merely to present highly aggregated statistical information. As each result is processed, the next action can then be determined.
This is done by a top-level decision system that communicates with the family at the ontology level, without any specific understanding of the processes (or models) behind each agent. The increasingly complex demands upon simulation, and the necessity to incorporate the breadth and depth of influencing factors, make a family of agent-based models a promising solution. This paper will discuss that solution, along with the syntax and semantics necessary to support the approach.
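The publish-and-aggregate mechanism described above can be sketched briefly. This is an assumption-laden illustration, not the paper's framework: each agent publishes the measure it supports, the family combines agents sharing a measure into one aggregate response, and the top-level decision layer sees only measure names (the common ontology), never the models behind them. The agent names, measures, and models are invented.

```python
from collections import defaultdict
from statistics import mean

class Agent:
    def __init__(self, name, measure, model):
        self.name = name
        self.measure = measure   # the ontology term this agent publishes
        self.model = model       # opaque callable: scenario -> estimate

    def estimate(self, scenario):
        return self.model(scenario)

class AgentFamily:
    def __init__(self):
        self.by_measure = defaultdict(list)

    def register(self, agent):
        """An agent 'publishes' its support for a given measure."""
        self.by_measure[agent.measure].append(agent)

    def query(self, scenario):
        """Return one aggregate response per measure; agents sharing a
        measure are combined (here, by simple averaging)."""
        return {measure: mean(a.estimate(scenario) for a in agents)
                for measure, agents in self.by_measure.items()}

family = AgentFamily()
# Two agents publish the same "cognitive" measure; their results aggregate.
family.register(Agent("morale-net", "cognitive", lambda s: 0.5))
family.register(Agent("intent-model", "cognitive", lambda s: 0.75))
family.register(Agent("logistics-sim", "physical", lambda s: 0.25))

# The top-level decision system works only with measure names.
report = family.query({"phase": "what-if-1"})
print(report)  # → {'cognitive': 0.625, 'physical': 0.25}
```

Swapping a model in or out is a single `register` call, which is the property that lets new or modified models enter a running simulation rapidly.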
Proc. SPIE 5091, Enabling Technologies for Simulation Science VII
KEYWORDS: Human-machine interfaces, Data modeling, Computer simulations, Associative arrays, Data conversion, Chemical elements, Computer architecture, Systems modeling, Data integration, Standards development
This paper will discuss automated scenario generation (Sgen) techniques to support the development of simulation scenarios. Current techniques for scenario generation are extremely labor intensive, often requiring manual adjustments to data from numerous sources to support increasingly complex simulations. Due to time constraints, this process often prevents the simulation of a large number of data sets and the preferred level of “what if” analysis. The simulation demands of future mission planning approaches, such as Effects Based Operations (EBO), require the rapid development of simulation inputs and multiple simulation runs for those approaches to be effective. This paper will discuss an innovative approach to the automated creation of complete scenarios for mission planning simulation. We will discuss the results of our successful Phase I SBIR effort, which validated our approach to scenario generation and refined how scenario generation technology can be directly applied to the types of problems facing EBO and mission planning. The current stovepipe architecture marries a scenario creation capability to each of the simulation tools. The EBO scenario generation toolset breaks that connection through an approach centered on a robust data model and the ability to tie mission-planning tools and data resources directly to an open Course of Action (COA) analysis framework supporting a number of simulation tools. In this approach, data sources are accessed through XML tools, proprietary database structures, or legacy tools using SQL, and are stored as an instance of Sgen Meta Data. The Sgen Meta Data can be mapped to a wide range of simulation tools using a Meta Data-to-simulation-tool mapping editor that generates an XSLT template describing the required data translation. Once the mapping is created, Sgen will automatically convert the Meta Data instance, using XSLT, to the formats required by specific simulation tools.
The research results presented in this paper will show how the complex demands of mission planning can be met with current simulation tools and technology.
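The final step of the pipeline, converting a Meta Data instance into one simulation tool's input format, can be sketched as follows. In the actual toolset the mapping editor emits an XSLT template; in this illustrative sketch the mapping is a plain dictionary applied in Python, and all element names (`SgenMetaData`, `SimEntity`, the field names) are hypothetical.

```python
import xml.etree.ElementTree as ET

# A toy Meta Data instance, as might be produced from XML, database, or
# SQL-accessed legacy sources.
META_INSTANCE = """\
<SgenMetaData>
  <unit id="u1"><type>armor</type><lat>33.1</lat><lon>44.2</lon></unit>
  <unit id="u2"><type>infantry</type><lat>33.4</lat><lon>44.0</lon></unit>
</SgenMetaData>"""

# The mapping the editor would capture: Meta Data field -> target tool field.
FIELD_MAP = {"type": "UnitClass", "lat": "Latitude", "lon": "Longitude"}

def to_sim_format(meta_xml, field_map):
    """Translate each Meta Data <unit> into a <SimEntity> for the target tool."""
    root = ET.fromstring(meta_xml)
    out = ET.Element("SimScenario")
    for unit in root.findall("unit"):
        entity = ET.SubElement(out, "SimEntity", Id=unit.get("id"))
        for src, dst in field_map.items():
            ET.SubElement(entity, dst).text = unit.findtext(src)
    return ET.tostring(out, encoding="unicode")

print(to_sim_format(META_INSTANCE, FIELD_MAP))
```

Because only `FIELD_MAP` changes per target, supporting an additional simulation tool amounts to authoring a new mapping rather than building a new stovepipe converter, which is the design point the Sgen architecture makes with its generated XSLT templates.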