This PDF file contains the front matter associated with SPIE Proceedings Volume 9831, including the Title Page, Copyright information, Table of Contents, Introduction (if any), and Conference Committee listing.
What interoperability is and why the Army wants it between systems is easily understood. Enabling multiple systems to work together and share data across boundaries in a co-operative manner will benefit the warfighter by allowing for easy access to previously hard-to-reach capabilities. How to achieve interoperability is not as easy to understand due to the numerous different approaches that accomplish the goal. Commonality Based Interoperability (CBI) helps establish how to achieve the goal by extending the existing interoperability definition. CBI is not an implementation, nor is it an architecture; it is a definition of interoperability with a foundation of establishing commonality between systems.
Live sensor data was obtained from an Open Standard for Unattended Sensors (OSUS, formerly Terra Harvest)- based system provided by the Army Research Lab (ARL) and fed into the Communications-Electronics Research, Development and Engineering Center (CERDEC) sponsored Actionable Intelligence Technology Enabled Capabilities Demonstration (AI-TECD) Micro Cloud during the E15 demonstration event that took place at Fort Dix, New Jersey during July 2015. This data was an enabler for other technologies, such as Sensor Assignment to Mission (SAM), Sensor Data Server (SDS), and the AI-TECD Sensor Dashboard, providing rich sensor data (including images) for use by the Company Intel Support Team (CoIST) analyst. This paper describes how the OSUS data was integrated and used in the E15 event to support CoIST operations.
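As a rough illustration of the integration pattern described above, the following Python sketch relays OSUS-style observation records to a downstream data server; the endpoint URLs and field names are hypothetical and do not reflect the actual OSUS or AI-TECD interfaces.

# Illustrative only: endpoint URLs and field names are hypothetical,
# not the actual OSUS or AI-TECD interfaces.
import requests

OSUS_OBS_URL = "http://osus-gateway.example/observations"   # hypothetical
SDS_INGEST_URL = "http://sds.example/ingest"                # hypothetical

def relay_observations():
    """Pull OSUS-style observation records and push them to a data server."""
    obs = requests.get(OSUS_OBS_URL, timeout=10).json()
    for record in obs:
        # Keep only the fields a downstream consumer (e.g. a CoIST dashboard)
        # is assumed to need: sensor id, time, location, and any image link.
        payload = {
            "sensorId": record.get("assetId"),
            "observedTime": record.get("observedTimestamp"),
            "location": record.get("location"),
            "imageUri": record.get("imageUri"),
        }
        requests.post(SDS_INGEST_URL, json=payload, timeout=10)

if __name__ == "__main__":
    relay_observations()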
The Internet of Things (IoT) has come of age, and domestic and industrial devices are all “smart”. But how can they be universally classified and queried? How do we know that the underlying architecture is secure enough to deploy on a defense network? By leveraging existing platforms designed for interoperability, extensibility, and security that can manage data across multiple domains and run on any platform.
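As a minimal sketch of the kind of universal classification and querying the abstract alludes to, the following Python fragment defines an assumed common device descriptor and a uniform query over it; the schema, taxonomy, and catalog entries are invented for illustration.

# Illustrative sketch: the device schema and taxonomy below are invented,
# not a published IoT classification standard.
from dataclasses import dataclass

@dataclass(frozen=True)
class Device:
    device_id: str
    category: str      # e.g. "camera", "thermostat", "acoustic"
    domain: str        # e.g. "domestic", "industrial", "defense"
    protocols: tuple   # transports the device speaks

CATALOG = [
    Device("cam-01", "camera", "industrial", ("mqtt", "rtsp")),
    Device("th-07", "thermostat", "domestic", ("zigbee",)),
    Device("ac-22", "acoustic", "defense", ("mqtt",)),
]

def query(category=None, protocol=None):
    """Return devices matching a uniform classification query."""
    return [d for d in CATALOG
            if (category is None or d.category == category)
            and (protocol is None or protocol in d.protocols)]

print(query(protocol="mqtt"))   # devices reachable over a common transport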
In 2011, the U.S. Army Research Laboratory (ARL) developed a framework for sensor integration and asset discovery. Because this framework continues to be relevant and necessary, ARL will again participate in Enterprise Challenge 2016 to conduct further experimentation and demonstrations. Incorporating an Expeditionary Processing, Exploitation and Dissemination (Ex-PED) model, ARL will demonstrate the utility of tactical wide-area and persistent sensing in a bandwidth constrained environment, with the inclusion of an effective Sensor 3D Common Operating Picture (COP) to enable appropriate sensor management.
Paul A. Thomas, Gillian Marshall, David Faulkner, Philip Kent, Scott Page, Simon Islip, James Oldfield, Toby P. Breckon, Mikolaj E. Kundegorski, et al.
Proceedings Volume Ground/Air Multisensor Interoperability, Integration, and Networking for Persistent ISR VII, 983108 (2016) https://doi.org/10.1117/12.2229720
Currently, most land Intelligence, Surveillance and Reconnaissance (ISR) assets (e.g. EO/IR cameras) are simply data collectors. Understanding, decision making and sensor control are performed by the human operators, involving high cognitive load. Any automation in the system has traditionally involved bespoke design of centralised systems that are highly specific for the assets/targets/environment under consideration, resulting in complex, non-flexible systems that exhibit poor interoperability. We address a concept of Autonomous Sensor Modules (ASMs) for land ISR, where these modules have the ability to make low-level decisions on their own in order to fulfil a higher-level objective, and plug in, with the minimum of preconfiguration, to a High Level Decision Making Module (HLDMM) through a middleware integration layer. The dual requisites of autonomy and interoperability create challenges around information fusion and asset management in an autonomous hierarchical system, which are addressed in this work. This paper presents the results of a demonstration system, known as Sensing for Asset Protection with Integrated Electronic Networked Technology (SAPIENT), which was shown in realistic base protection scenarios with live sensors and targets. The SAPIENT system performed sensor cueing, intelligent fusion, sensor tasking, target hand-off and compensation for compromised sensors, without human control, and enabled rapid integration of ISR assets at the time of system deployment, rather than at design-time. Potential benefits include rapid interoperability for coalition operations, situation understanding with low operator cognitive burden and autonomous sensor management in heterogeneous sensor systems.
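The following Python sketch illustrates, under assumed message and field names (not the actual SAPIENT interface definitions), the register/report/cue exchanges between Autonomous Sensor Modules and a High Level Decision Making Module.

# Hypothetical interface sketch; message and field names are not the actual
# SAPIENT ICD, only an illustration of register/report/cue exchanges.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Detection:
    asm_id: str
    timestamp: float
    location: tuple            # (lat, lon)
    confidence: float

@dataclass
class HLDMM:
    """High Level Decision Making Module: fuses ASM reports and issues tasks."""
    asms: Dict[str, Callable[[dict], None]] = field(default_factory=dict)
    track: List[Detection] = field(default_factory=list)

    def register(self, asm_id, task_callback):
        # ASMs plug in at deployment time, not design time.
        self.asms[asm_id] = task_callback

    def report(self, det: Detection):
        self.track.append(det)
        # Trivial cueing rule: ask every other ASM to look at a
        # high-confidence detection's location.
        if det.confidence > 0.8:
            for asm_id, task in self.asms.items():
                if asm_id != det.asm_id:
                    task({"type": "cue", "location": det.location})

hq = HLDMM()
hq.register("asm-radar-1", lambda task: print("asm-radar-1 tasked:", task))
hq.register("asm-cam-2", lambda task: print("asm-cam-2 tasked:", task))
hq.report(Detection("asm-cam-2", 0.0, (51.5, -0.1), 0.9))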
The analysis of warfare frequently suffers from an absence of logical structure for (a) specifying explicitly the military mission and (b) quantitatively evaluating the mission utility of alternative products and services. In 2003, the Missions and Means Framework (MMF) was developed to redress these shortcomings. The MMF supports multiple combatants and levels of war and is, in fact, a formal embodiment of the Military Decision-Making Process (MDMP). A major effect of incomplete analytic discipline in military systems analyses is that they frequently fall into the category of ill-posed problems in which they are under-specified, under-determined, or under-constrained. Critical context is often missing. This is frequently the result of incomplete materiel requirements analyses which have unclear linkages to higher levels of warfare, system-of-systems linkages, tactics, techniques and procedures, and the effect of opposition forces. In many instances the capabilities of materiel are assumed to be immutable. This is a result of not assessing how platform components morph over time due to damage, logistics, or repair. Though ill-posed issues can be found in many places in military analysis, probably the greatest challenge comes in the disciplines of C4ISR supported by ontologies, in which the formal naming and definition of the types, properties, and interrelationships of entities are fundamental to characterizing mission success. Though the MMF was not conceived as an ontology, over the past decade some workers, particularly in the field of communication, have labelled the MMF as such. This connection will be described and discussed.
The paper explores the use of correlation across features extracted from different sensing channels to help in urban situational understanding. We use real-world datasets to show how such correlation can improve the accuracy of detection of city-wide events by combining metadata analysis with image analysis of Instagram content. We demonstrate this through a case study on the Singapore Haze. We show that simple ontological relationships and reasoning can significantly help in automating such correlation-based understanding of transient urban events.
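A minimal sketch of the correlation step, assuming two already-aligned daily series (a metadata-derived post count and an image-analysis haze score, both invented here):

# Illustrative: the two daily series are invented, standing in for a
# metadata-derived signal (haze-related post counts) and an image-analysis
# signal (mean classifier haze score) over the same days.
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x) ** 0.5
    vary = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (varx * vary)

post_counts = [12, 15, 14, 80, 95, 90, 20]          # metadata channel
haze_scores = [0.1, 0.1, 0.2, 0.7, 0.8, 0.8, 0.3]   # image-analysis channel

r = pearson(post_counts, haze_scores)
print(f"cross-channel correlation r = {r:.2f}")     # high r supports a real event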
Ontologies and semantic systems are necessarily complex but offer great potential in terms of their ability to fuse information from multiple sources in support of situation awareness. Current approaches do not place the ontologies directly into the hands of the end user in the field but instead hide them away behind traditional applications. We have been experimenting with human-friendly ontologies and conversational interactions to enable non-technical business users to interact with and extend these dynamically. In this paper we outline our approach via a worked example, covering: OWL ontologies, ITA Controlled English, Sensor/mission matching and conversational interactions between human and machine agents.
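As an illustration of sensor/mission matching over an ontology, the following Python sketch uses rdflib and SPARQL over a toy graph; the class and property names are invented and do not reproduce the ITA Controlled English syntax or the ontologies used in the paper.

# Toy ontology sketch using rdflib; class/property names are invented.
from rdflib import Graph

TTL = """
@prefix : <http://example.org/isr#> .
:cam1  a :Sensor ; :provides :Imagery .
:mic1  a :Sensor ; :provides :Acoustic .
:taskA a :Task   ; :requires :Imagery .
"""

g = Graph()
g.parse(data=TTL, format="turtle")

MATCH = """
PREFIX : <http://example.org/isr#>
SELECT ?task ?sensor WHERE {
  ?task   :requires ?capability .
  ?sensor :provides ?capability .
}
"""

for task, sensor in g.query(MATCH):
    print(f"{task} can be served by {sensor}")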
Advances in sensing technologies, the acquisition of new sensors, and the use of mobile devices produce an overwhelming amount of sensed data, which compounds the challenge for analysts of retrieving relevant information from heterogeneous collections and of processing and analyzing it in a timely manner to better support decision makers. At the same time, the limited quantity and capabilities of intelligence, surveillance and reconnaissance (ISR) resources relative to the number of requests for information collection require maximizing their utilization in order to increase the accuracy of the information gained and the timeliness of its delivery. Considering the challenges for ISR intelligence requirements and collection management, as well as for information management and exploitation, the paper describes a unified approach for querying available information sources for enhanced ISR information collection and retrieval. The approach leverages and extends semantic models in the ISR domain. Enhanced ISR asset integration and optimized information collection and management should result in more relevant collected data and improve subsequent analysis.
Sensor-mission assignment involves the allocation of sensors and other information-providing resources to missions in order to cover the information needs of the individual tasks within each mission. The importance of efficient and effective means to find appropriate resources for tasks is exacerbated in the coalition context where the operational environment is dynamic and a multitude of critically important tasks need to achieve their collective goals to meet the objectives of the coalition. The Sensor Assignment to Mission (SAM) framework—a research product of the International Technology Alliance in Network and Information Sciences (NIS-ITA) program—provided the first knowledge intensive resource selection approach for the sensor network domain so that contextual information could be used to effectively select resources for tasks in coalition environments. Recently, CUBRC, Inc. was tasked with operationalizing the SAM framework through the use of the I2WD Common Core Ontologies for the Communications-Electronics Research, Development and Engineering Center (CERDEC) sponsored Actionable Intelligence Technology Enabled Capabilities Demonstration (AI-TECD). The demonstration event took place at Fort Dix, New Jersey during July 2015, and this paper discusses the integration and the successful demonstration of the SAM framework within the AI-TECD, lessons learned, and its potential impact in future operations.
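The following is a minimal capability-coverage sketch, not the SAM framework's actual reasoning, showing the flavor of matching sensors to mission tasks by required capabilities (task and sensor entries are invented):

# Minimal capability-coverage sketch; NOT the SAM framework's algorithm.
tasks = {
    "perimeter-watch": {"imagery", "night"},
    "convoy-overwatch": {"imagery", "wide-area"},
}
sensors = {
    "eo-cam-3":   {"imagery"},
    "ir-cam-1":   {"imagery", "night"},
    "wami-pod-2": {"imagery", "wide-area"},
}

def assign(tasks, sensors):
    """Greedily pick, per task, the sensor covering the most required capabilities."""
    plan = {}
    for task, needs in tasks.items():
        best = max(sensors, key=lambda s: len(needs & sensors[s]))
        plan[task] = (best, needs - sensors[best])   # unmet needs, if any
    return plan

print(assign(tasks, sensors))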
ISR Systems, Information Processing, Management, and Analysis
The challenges of providing warfighters with the best possible actionable information from diverse sensing modalities, using advances in big data and machine learning, are addressed in this paper. We start by presenting the intelligence, surveillance, and reconnaissance (ISR) related big-data challenges associated with the Third Offset Strategy. Current approaches to big data are shown to be limited with respect to reasoning and understanding. We present a discussion of what meaning making and understanding require. We posit that, for human-machine collaborative solutions to address the requirements of the strategy, a new approach, Qualia Exploitation of Sensor Technology (QuEST), will be required. The requirements for developing a QuEST theory of knowledge are discussed and, finally, an engineering approach for achieving situation understanding is presented.
Intelligence, surveillance, and reconnaissance (ISR) operations in urban environments can be particularly challenging due in part to the physical proximity and height of the buildings which can occlude sensor coverage. Operational and laboratory settings have shown it is very difficult for a single operator to manually track a moving target using a set of grounded, steerable sensors within an urban environment. Although computer vision technologies are available for autotracking, they are often unreliable due to variations in lighting, visibility, and visual clutter. As a result, the Air Force Research Laboratory (AFRL) is developing novel interface technologies that leverage automation to flexibly assist a human operator with the task of tracking one or more moving targets across an array of fixed pan-tilt-zoom (PTZ) electro-optical (EO) sensors in an urban environment. Automated functions being explored by this research effort focus on maintaining visual momentum and include automated sensor steering and system-recommended perspective switching. These automated functions were compared, in addition to a baseline (no automation) condition, and operator performance improved as the level of automated assistance increased. The results of this examination indicate a necessity for surveillance technologies to incorporate automation. Further research is recommended to identify additional operator functions that could be automated to overcome the common challenges associated with real-time target tracking in an urban environment.
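As a geometry-only sketch of automated sensor steering, the following Python fragment picks the nearest camera from an assumed set of fixed PTZ positions and computes the pan/tilt to drive it toward a ground target; calibration, occlusion handling, and zoom control are omitted.

# Geometry sketch only: flat local coordinates (metres), invented camera
# positions; real systems also need calibration, occlusion and zoom logic.
import math

cameras = {"ptz-1": (0.0, 0.0, 10.0), "ptz-2": (120.0, 40.0, 15.0)}  # x, y, height

def pan_tilt(cam_xyz, target_xy):
    cx, cy, cz = cam_xyz
    tx, ty = target_xy
    dx, dy = tx - cx, ty - cy
    ground = math.hypot(dx, dy)
    pan = math.degrees(math.atan2(dx, dy))        # bearing from north (x=east, y=north)
    tilt = -math.degrees(math.atan2(cz, ground))  # look down at a ground target
    return pan, tilt, ground

def steer_best_camera(target_xy):
    """Pick the closest camera and return the pan/tilt it should be driven to."""
    cam = min(cameras, key=lambda c: pan_tilt(cameras[c], target_xy)[2])
    pan, tilt, _ = pan_tilt(cameras[cam], target_xy)
    return cam, round(pan, 1), round(tilt, 1)

print(steer_best_camera((60.0, 80.0)))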
Autonomous underwater vehicles are vehicles that are entirely or partly independent of human decisions. In order to obtain operational independence, the vehicles have to be equipped with specialized software. The main task of the software is to move the vehicle along a trajectory while avoiding collisions. Moreover, the software also has to manage the devices installed on board the vehicle, e.g. to start and stop cameras, sonars, etc. In addition to the software embedded on board the vehicle, software that allows the operator to manage the vehicle is also necessary. Its tasks are to define the vehicle's mission, to start and stop the mission, to send emergency commands, to monitor vehicle parameters, and to control the vehicle in remotely operated mode. An important objective of the software is also to support the development and testing of other software components. To this end, a simulation environment is necessary, i.e. a simulation model of the vehicle and all of its key devices, a model of the sea environment, and software to visualize the behavior of the vehicle. The paper presents the architecture of the software designed for the biomimetic autonomous underwater vehicle (BAUV) being constructed within the framework of a scientific project financed by the Polish National Centre for Research and Development.
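A simplified surface-plane sketch of the trajectory-following task described above, with invented gains, waypoint, and vehicle model, and with collision avoidance, depth control, and device management omitted:

# Simplified sketch of "move along a trajectory": turn toward the waypoint,
# then move forward. Gains, speed and waypoint are invented.
import math

def heading_to(pos, wp):
    return math.atan2(wp[1] - pos[1], wp[0] - pos[0])

def step(pos, heading, wp, speed=0.5, k_turn=0.8, dt=1.0):
    """One control step: reduce heading error, then advance at constant speed."""
    err = (heading_to(pos, wp) - heading + math.pi) % (2 * math.pi) - math.pi
    heading += k_turn * err * dt
    pos = (pos[0] + speed * math.cos(heading) * dt,
           pos[1] + speed * math.sin(heading) * dt)
    return pos, heading

pos, heading, wp = (0.0, 0.0), 0.0, (10.0, 5.0)
for _ in range(200):                       # safety bound on iterations
    if math.dist(pos, wp) <= 0.5:
        break
    pos, heading = step(pos, heading, wp)
print("waypoint reached near", tuple(round(c, 1) for c in pos))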
Autonomous Biomimetic Underwater Vehicles (BUVs) driven by undulating propulsion are a new branch of underwater robotics. They imitate both the construction and the motion kinematics of underwater living organisms, e.g. fish. Such vehicles have several features that are crucial from the point of view of military applications, e.g. greater covertness and potential range of operation. The paper presents results of the research on BUVs carried out within two projects (Polish and EDA), both led by the Polish Naval Academy. First, the initial efforts in building the Polish BUV called CyberFish are described. Then, selected results of subsystem tests, e.g. the navigation subsystem and the 3D model of the BUV built within the national project, are presented. Next, initial research results achieved in the international project are shown. Finally, the schedule of the research planned within both projects is given. The paper is mainly focused on the hardware development of the BUVs.
The Dual Node Decision Wheels (DNDW) architecture concept was previously described as a novel approach toward integrating analytic and decision-making processes in joint human/automation systems in highly complex sociotechnical settings. In this paper, we extend the DNDW construct with a description of components in this framework, combining structures of the Dual Node Network (DNN) for Information Fusion and Resource Management with extensions of Rasmussen’s Decision Ladder (DL) to provide guidance on constructing information systems that better serve decision-making support requirements. The DNN takes a component-centered approach to system design, decomposing each asset in terms of data inputs and outputs according to their roles and interactions in a fusion network. However, to ensure relevancy to and organizational fit within command and control (C2) processes, principles from cognitive systems engineering emphasize that system design must take a human-centered systems view, integrating information needs and decision-making requirements to drive the architecture design and capabilities of network assets. In the current work, we present an approach for structuring and assessing DNDW systems that uses a unique hybrid of DNN top-down system design with a human-centered process design, combining DNN node decomposition with artifacts from cognitive analysis (i.e., system abstraction decomposition models, decision ladders) to provide work domain and task-level insights at different levels in an example intelligence, surveillance, and reconnaissance (ISR) system setting. This DNDW structure will ensure not only that the information fusion technologies and processes are structured effectively, but that the resulting information products will align with the requirements of human decision makers and be adaptable to different work settings.
The Global Information Network Architecture is an information technology based on Vector Relational Data Modeling, a unique computational paradigm, DoD network certified by the U.S. Army as the Dragon Pulse Information Management System. This network-available modeling environment is used for modeling models: models are configured using domain-relevant semantics, use network-available systems, sensors, databases, and services as loosely coupled component objects, and are executable applications. Solutions are based on mission tactics, techniques, and procedures and on subject-matter input. Three recent Army use cases are discussed: a) an ISR SoS; b) modeling and simulation behavior validation; and c) a networked digital library with behaviors.
Decision Making: Joint Session with conferences 9831 and 9851
Data collection processes supporting Intelligence, Surveillance, and Reconnaissance (ISR) missions have recently undergone a technological transition accomplished by investment in sensor platforms. Various agencies have made these investments to increase the resolution, duration, and quality of data collection and to provide more relevant and recent data to warfighters. However, while sensor improvements have increased the volume of high-resolution data, they often fail to improve situational awareness and actionable intelligence for the warfighter because efficient Processing, Exploitation, and Dissemination (PED) and filtering methods for mission-relevant information needs are lacking. The volume of collected ISR data often overwhelms manual and automated processes in modern analysis enterprises, resulting in underexploited data and in insufficient or missing answers to information requests. The outcome is a significant breakdown in the analytical workflow. To cope with this data overload, many intelligence organizations have sought to reorganize their general staffing requirements and workflows to enhance team communication and coordination, with the hope of exploiting as much high-value data as possible and understanding the value of actionable intelligence well before its relevance has passed. Through this effort we have taken a scholarly approach to this problem by studying the evolution of PED, with a specific focus on the Army’s most recent evolutions, using the Functional Resonance Analysis Method. This method investigates socio-technical processes by analyzing their intended functions and aspects to determine performance variabilities. Gaps are identified, and recommendations about force structure and future R&D priorities to increase the throughput of the intelligence enterprise are discussed.
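For readers unfamiliar with the Functional Resonance Analysis Method, the following toy Python sketch shows its basic bookkeeping: functions described by their aspects, and couplings found where one function's Output feeds another's Input, Precondition, Resource, Control, or Time. The PED functions and aspect values are invented examples, not the paper's model.

# Toy FRAM bookkeeping: find couplings between invented PED functions.
FUNCTIONS = {
    "Collect": {"Output": {"raw imagery"}},
    "Process": {"Input": {"raw imagery"}, "Output": {"georegistered imagery"}},
    "Exploit": {"Input": {"georegistered imagery"}, "Resource": {"analyst time"},
                "Output": {"intelligence report"}},
    "Disseminate": {"Input": {"intelligence report"}, "Output": {"answered RFI"}},
}

def couplings(functions):
    links = []
    for up, up_aspects in functions.items():
        for down, down_aspects in functions.items():
            if up == down:
                continue
            for aspect in ("Input", "Precondition", "Resource", "Control", "Time"):
                shared = up_aspects.get("Output", set()) & down_aspects.get(aspect, set())
                for item in shared:
                    links.append((up, item, aspect, down))
    return links

for up, item, aspect, down in couplings(FUNCTIONS):
    print(f"{up} --{item}--> {aspect} of {down}")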
The Dual Node Decision Wheels (DNDW) architecture is a new approach to information fusion and decision support systems. By combining cognitive systems engineering organizational analysis tools, such as decision trees, with the Dual Node Network (DNN) technical architecture for information fusion, the DNDW can align relevant data and information products with an organization’s decision-making processes. In this paper, we present the Compositional Inference and Machine Learning Environment (CIMLE), a prototype framework based on the principles of the DNDW architecture. CIMLE provides a flexible environment so heterogeneous data sources, messaging frameworks, and analytic processes can interoperate to provide the specific information required for situation understanding and decision making. It was designed to support the creation of modular, distributed solutions over large monolithic systems. With CIMLE, users can repurpose individual analytics to address evolving decision-making requirements or to adapt to new mission contexts; CIMLE’s modular design simplifies integration with new host operating environments. CIMLE’s configurable system design enables model developers to build analytical systems that closely align with organizational structures and processes and support the organization’s information needs.
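As an architectural sketch only (it does not reproduce CIMLE's actual APIs), the following Python fragment shows the kind of analytic registry and mission-specific re-chaining the description implies:

# Sketch of a registry of small, repurposable analytics re-chained per mission.
from typing import Callable, Dict, List

ANALYTICS: Dict[str, Callable[[dict], dict]] = {}

def analytic(name):
    """Register an analytic so it can be composed into different pipelines."""
    def wrap(fn):
        ANALYTICS[name] = fn
        return fn
    return wrap

@analytic("geo-filter")
def geo_filter(msg):
    msg["in_aoi"] = msg.get("lat", 0) > 40.0     # invented area-of-interest rule
    return msg

@analytic("threat-score")
def threat_score(msg):
    msg["score"] = 0.9 if msg.get("in_aoi") else 0.1
    return msg

def run_pipeline(names: List[str], msg: dict) -> dict:
    for name in names:                            # mission-specific ordering
        msg = ANALYTICS[name](msg)
    return msg

print(run_pipeline(["geo-filter", "threat-score"], {"lat": 40.7, "sensor": "eo-1"}))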
Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
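A minimal sketch of the detection stage, treating a keyword's hourly post counts as a noisy sensor signal and flagging spikes by z-score; the counts and threshold are invented:

# Minimal event-detection sketch over invented hourly post counts.
from statistics import mean, pstdev

def detect_events(counts, threshold=3.0):
    """Return indices whose count deviates more than `threshold` sigma from the mean."""
    mu, sigma = mean(counts), pstdev(counts) or 1.0
    return [i for i, c in enumerate(counts) if (c - mu) / sigma > threshold]

hourly_posts = [5, 7, 6, 8, 6, 7, 95, 110, 9, 6]   # spike = candidate event
print("event hours:", detect_events(hourly_posts, threshold=1.5))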
The U.S. Army Research Laboratory (ARL) and McQ Inc. are developing a generic sensor fusion architecture that involves several diverse processes working in combination to create a dynamic task-oriented, real-time informational capability. Processes include sensor data collection, persistent and observational data storage, and multimodal and multisensor fusion that includes the flexibility to modify the fusion program rules for each mission. Such a fusion engine lends itself to a diverse set of sensing applications and architectures while using open-source software technologies. In this paper, we describe a fusion engine architecture that combines multimodal and multi-sensor fusion within an Open Standard for Unattended Sensors (OSUS) framework. The modular, plug-and-play architecture of OSUS allows future fusion plugin methodologies to have seamless integration into the fusion architecture at the conceptual and implementation level. Although beyond the scope of this paper, this architecture allows for data and information manipulation and filtering for an array of applications.
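The following sketch, which is not the ARL/McQ engine itself, illustrates a mission-configurable fusion rule: declare which modalities must agree at a node within a time window before a fused alert is raised (rule, detections, and field names are invented):

# Sketch of a mission-configurable multimodal fusion rule; all values invented.
RULE = {"require": {"acoustic", "seismic"}, "window_s": 30}   # editable per mission

detections = [                      # (modality, unix time, node id)
    ("acoustic", 1000.0, "node-4"),
    ("seismic",  1012.0, "node-4"),
    ("imaging",  1500.0, "node-7"),
]

def fuse(detections, rule):
    alerts = []
    for mod, t, node in detections:
        if mod not in rule["require"]:
            continue
        seen = {m for m, t2, n in detections
                if n == node and abs(t2 - t) <= rule["window_s"] and m in rule["require"]}
        if seen == rule["require"]:
            alerts.append(node)
    return sorted(set(alerts))

print(fuse(detections, RULE))    # -> ['node-4']: the lone imaging hit never fuses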
We address the problem of detecting insider threats before they can do harm. In many cases, co-workers notice indications of suspicious activity prior to insider threat attacks. A partial solution to this problem requires an understanding of how information can better traverse the communication network between human intelligence and insider threat analysts. Our approach employs modern mobile communications technology and scale free network architecture to reduce the network distance between human sensors and analysts. In order to solve this problem, we propose a Vector Relational Data Modeling approach to integrate human “sensors,” geo-location, and existing visual analytics tools. This integration problem is known to be difficult due to quadratic increases in cost associated with complex integration solutions. A scale free network integration approach using vector relational data modeling is proposed as a method for reducing network distance without increasing cost.
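As a back-of-envelope check on the claim that a scale-free topology reduces network distance, the following sketch compares the average shortest path length of a Barabasi-Albert graph against a ring lattice of the same size (sizes and parameters are arbitrary):

# Compare average path length of a scale-free topology vs. a ring lattice.
import networkx as nx

n = 200
scale_free = nx.barabasi_albert_graph(n, m=2, seed=1)
ring = nx.watts_strogatz_graph(n, k=4, p=0.0, seed=1)   # regular ring, no rewiring

print("scale-free avg path:", round(nx.average_shortest_path_length(scale_free), 2))
print("ring lattice avg path:", round(nx.average_shortest_path_length(ring), 2))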