This PDF file contains the Front Matter associated with SPIE Proceedings Volume 7694, including Title page, Copyright information, Table of Contents, Conference Committee listing, and Introduction, if any.
Access to the requested content is limited to institutions that have purchased or subscribe to SPIE eBooks. You are receiving this notice because your organization may not have SPIE eBooks access.* *Shibboleth/Open Athens users: please sign in to access your institution's subscriptions. To obtain this item, you may purchase the complete book in print or electronic format on SPIE.org.
Ultimately, the success of any persistent ISR system will be judged by the quality (timeliness, accuracy and provenance)
of the intelligence products that it delivers. In deploying multiple sensors to gather intelligence there is frequently a
tripartite trade-off between the physical constraints imposed by sensor and platform performance, the requirements of
the mission at hand, and the information needs of other users. Thus, when working with constrained resources, there is
a need to optimise deployment through intelligent tasking, maximising information quality without contradictory or
over-constraining requirements whilst maintaining mission efficiency.
This paper considers recent advancements in defining mission specifications to better facilitate the optimum deployment
of sensors against competing requirements and the needs of different missions. Considerations are based on a
scenario in which a number of airborne vehicles carrying heterogeneous imaging sensors are tasked with mine detection missions.
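As a toy illustration of the tasking optimisation described above (not the authors' method), a greedy routine can assign sensors to mission tasks by a scalar information-quality utility. The sensor names, task names, utility scores and one-task-per-sensor constraint below are all invented for illustration.

```python
# Hypothetical sketch: greedily assign sensors to tasks in order of
# decreasing information-quality utility, one task per sensor.
def greedy_tasking(utility):
    """utility: dict mapping (sensor, task) -> float. Returns sensor -> task."""
    assignment, taken_tasks = {}, set()
    for (sensor, task), u in sorted(utility.items(), key=lambda kv: -kv[1]):
        if sensor not in assignment and task not in taken_tasks:
            assignment[sensor] = task
            taken_tasks.add(task)
    return assignment

# Illustrative utilities for two imaging sensors and two mine-sweep tasks.
utility = {
    ("EO-1", "mine-sweep-A"): 0.9,
    ("EO-1", "mine-sweep-B"): 0.4,
    ("IR-2", "mine-sweep-A"): 0.7,
    ("IR-2", "mine-sweep-B"): 0.6,
}
plan = greedy_tasking(utility)
```

A real tasking optimiser would also encode the mission and cross-user constraints the paper discusses; this sketch shows only the utility-maximising core.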
Within the context of C4ISTAR information "systems of systems", we discuss sensor data fusion aspects aimed at the generation of higher-level information according to the JDL model of data fusion. In particular, two issues are addressed: (1) Tracking-derived Situation Elements: standard target tracking applications gain information related to 'Level 1 Fusion' according to the well-established terminology of the JDL model. Kinematic data of this type, however, are by no means the only information to be derived from target tracks. In many cases, reliable and quantitative higher-level information according to the JDL terminology can be obtained. (2) Anomaly Detection in Tracking Data Bases: anomaly detection can be regarded as a process of information fusion that focuses the attention of human decision makers or decision-making systems on particular events that are "irregular" or may cause harm and thus require special actions.
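As a toy illustration of the anomaly-detection idea (not the method in the paper), one simple way to flag "irregular" entries in a tracking database is a z-score test on a kinematic feature such as mean track speed. The track identifiers, speeds and threshold below are invented.

```python
# Illustrative sketch: flag tracks whose mean speed deviates strongly from
# the population of tracks in a (toy) tracking database.
from statistics import mean, stdev

def anomalous_tracks(speeds, threshold=1.5):
    """speeds: dict mapping track_id -> mean speed. Returns ids with |z| > threshold."""
    mu = mean(speeds.values())
    sigma = stdev(speeds.values())
    return {tid for tid, v in speeds.items()
            if sigma > 0 and abs(v - mu) / sigma > threshold}

# Four ordinary tracks and one much faster one.
speeds = {"t1": 10.0, "t2": 11.0, "t3": 10.5, "t4": 9.5, "t5": 60.0}
flagged = anomalous_tracks(speeds)
```

Real systems would use richer normalcy models than a single global z-score, but the focusing-of-attention pattern is the same: score every track, surface only the outliers.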
Within a coalition environment, ad hoc Communities of Interest (CoIs) come together, perhaps for only a short time,
with different sensors, sensor platforms, data fusion elements, and networks to conduct a task (or set of tasks) with
different coalition members taking different roles. In such a coalition, each organization will have its own inherent
restrictions on how it will interact with the others. These are usually stated as a set of policies, including security and
privacy policies. The capability that we want to enable for a coalition operation is to provide access to information from
any coalition partner in conformance with the policies of all. One of the challenges in supporting such ad-hoc coalition
operations is that of providing efficient access to distributed sources of data, where the applications requiring the data do
not have knowledge of the location of the data within the network. To address this challenge the International
Technology Alliance (ITA) program has been developing the concept of a Dynamic Distributed Federated Database
(DDFD), also known as a Gaian Database. This type of database provides a means for accessing data across a network of
distributed heterogeneous data sources where access to the information is controlled by a mixture of local and global
policies. We describe how a network of disparate ISR elements can be expressed as a DDFD and how this approach
enables sensor and other information sources to be discovered autonomously or semi-autonomously and combined or
fused according to formally defined local and global policies.
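A minimal sketch of the federated-query-with-policies idea, with invented node layouts, row schemas and policies (the actual DDFD/Gaian design is far richer): each node's local policy decides what may leave that node, and a global coalition policy filters the merged result.

```python
# Hedged sketch: query every reachable node, apply each node's *local*
# policy before rows leave the node, then a *global* policy on the merge.
def federated_query(nodes, predicate, global_policy):
    results = []
    for node in nodes:
        for row in node["rows"]:
            # Local policy: the owning organisation controls release.
            if node["local_policy"](row) and predicate(row):
                results.append(row)
    # Global policy: coalition-wide rules applied to the merged result.
    return [r for r in results if global_policy(r)]

# Two illustrative coalition nodes with different release policies.
nodes = [
    {"rows": [{"type": "acoustic", "classification": "unrestricted"},
              {"type": "acoustic", "classification": "national-only"}],
     "local_policy": lambda r: r["classification"] != "national-only"},
    {"rows": [{"type": "imagery", "classification": "unrestricted"}],
     "local_policy": lambda r: True},
]
hits = federated_query(nodes,
                       predicate=lambda r: r["type"] == "acoustic",
                       global_policy=lambda r: r["classification"] == "unrestricted")
```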
The management of sensor networks in coalition settings has been treated in a piecemeal fashion in the current literature
without taking a comprehensive look at the complete life cycle of coalition networks, and determining the different
aspects of network management that need to be taken into account for the management of sensor networks in those
contexts. In this paper, we provide a holistic approach towards managing sensor networks encountered in the context of
coalition operations. We describe how the sensor networks in a coalition ought to be managed at various stages of the
life cycle, and the different operations that need to be taken into account for managing various aspects of the networks.
In particular, we look at the FCAPS model for network management, and assess the applicability of the FCAPS model
to the different aspects of sensor network management in a coalition setting.
Recent experience has demonstrated that adversary activities are difficult to distinguish from background activity. In
order to see hidden activities, it is proposed that a combination of new sensing modalities and better ways of processing
existing modalities is required. We have applied a robust methodology to analyse the strengths and weaknesses of
sensor types at detecting and characterising adversary activities. This has revealed both complementary and synergistic
relationships between sensor types, supporting the hypothesis that judicious combining of data from multiple sensors will
result in a sensing capability significantly greater than that achievable by individual sensors. The challenge to making
this capability a reality is to develop and integrate automatic processing techniques to self-cue sensors and fuse their
information, whilst avoiding additional burden on the users. To facilitate evolution and wide exploitation of this
capability, we are developing an open, scalable architecture in which to integrate the sensors and processing. All of this
work is now being taken forward in the UK's PWAS (Persistent Wide Area Surveillance) S2O demonstrator project,
which is working towards a rapid demonstration of the capability benefits of an integrated multiple-sensor system. The
immediate goal is to integrate mature equipment into a demonstrator system, as a test-bed for current and developing
cueing/fusion processing algorithms. The near-term goal is to evolve the capability through technology updates which
exploit new sensors and improved sensor processing.
In this paper we present a Generic Vehicle Architecture (GVA), developed as part of the UK MOD GVA programme, that addresses the issues of dynamic platform re-roling through modular capability integration and behaviour orchestration. The proposed architecture addresses the need for: a) easy integration with legacy and
future systems and architectures; b) scalability from individual sensors, individual human users, vehicles and patrols
to battle groups and brigades; c) rapid introduction of new capabilities in response to a changing operational
scenario; d) agnosticism to communications systems, devices, operating systems and computer platforms. The
GVA leverages research output and tools developed by the International Technology Alliance (ITA)
in the Network and Information Science programme [1], in particular the ITA Sensor Fabric [2-4], developed to address
the challenges in the areas of sensor identification, classification, interoperability and sensor data sharing, dissemination and consumability commonly present in tactical ISR/ISTAR [5], and the Gaian Dynamic Distributed
Federated Database (DDFD) [6-8], developed to address the challenges of accessing distributed sources of data in an ad-hoc
environment where the consumers do not have knowledge of the location of the data within the network.
The GVA also promotes the use of off-the-shelf hardware and software, which is advantageous in terms of
ease of upgrading, lower cost of support and replacement, and speed of re-deploying platforms through a "fitted
for but not with" approach. The GVA exploits the service-oriented architecture (SOA) environment provided
by the ITA Sensor Fabric to enhance the capability of legacy solutions and applications by enabling information
exchange between them, for example by providing direct near real-time communication between legacy systems.
A prototype implementation demonstrator of this architecture has demonstrated its utility for fusing,
exploiting and sharing situational awareness information for force protection, and platform and device health
and usage information for logistics and deployment management.
We consider the problem of deploying mobile robots to create a mutually-visible formation between stationary and
mobile targets in a known environment. A mutually-visible formation is a placement where each agent or target is
connected to all others through a sequence of visibility pairings. Mutual visibility enhances radio communications
links, and enables other modalities such as optical communications. We discretize the environment in a manner
conducive to visibility calculations, and, as targets shift, we use dynamic programming to find formations that
preserve the visibility topology and minimize movement.
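The connectivity condition above (every agent or target reachable from every other through a chain of visibility pairings) can be checked with a simple graph search. The sketch below assumes the pairwise visibility tests against the known environment have already been done; entity names are illustrative.

```python
# Sketch: a placement is a mutually-visible formation if the graph whose
# edges are pairwise visibility links is connected.
from collections import deque

def is_mutually_visible(entities, visible_pairs):
    adj = {e: set() for e in entities}
    for a, b in visible_pairs:
        adj[a].add(b)
        adj[b].add(a)
    # BFS from an arbitrary entity; valid if every entity is reached.
    start = entities[0]
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in adj[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen == set(entities)

# Chain target-r1-r2 is connected; dropping the r1-r2 link breaks it.
ok = is_mutually_visible(["target", "r1", "r2"], [("target", "r1"), ("r1", "r2")])
bad = is_mutually_visible(["target", "r1", "r2"], [("target", "r1")])
```

The paper's dynamic programming then searches over discretized robot placements that keep this predicate true while minimizing movement; only the feasibility test is sketched here.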
Substantial research has addressed the problems of automatic search, routing, and sensor tasking for UAVs,
producing many good algorithms for each task. But UAV surveillance missions typically include combinations of
these tasks, so an algorithm that can manage and control UAVs through multiple tasks is desired. The algorithm
in this paper employs a cooperative graph-based search when target states are unknown. If target states become
more localized, the algorithm switches to route UAV(s) for target intercept. If a UAV is close to a target,
waypoints and sensor commands are optimized over short horizons to maintain the best sensor-to-target viewing
geometry.
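The task switching described above amounts to a small state machine over target knowledge. The sketch below is illustrative only; the thresholds and mode names are invented, not taken from the algorithm in the paper.

```python
# Illustrative mode selector: search when the target state is unknown,
# route for intercept once it is localized, optimize viewing geometry
# over short horizons when close. Thresholds are invented.
def select_mode(position_uncertainty_m, range_to_target_m):
    if position_uncertainty_m is None or position_uncertainty_m > 1000.0:
        return "SEARCH"   # cooperative graph-based search
    if range_to_target_m > 500.0:
        return "ROUTE"    # route UAV(s) for target intercept
    return "TRACK"        # short-horizon waypoint/sensor optimization

modes = [select_mode(None, 0.0),        # no target estimate yet
         select_mode(200.0, 2000.0),    # localized but distant
         select_mode(50.0, 100.0)]      # localized and close
```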
In future military missions, many sensor assets will collect important information about
the environment. User control over surveillance assets is important to ensure that the specific data collected
is appropriate for the current mission. Unfortunately, previous work has shown that individual users cannot
effectively control more than about four assets, even if the assets have significant autonomy. In the ACCAST
project, we hypothesized that by including autonomous teamwork between the assets and allowing users to
interact by describing what the team as a whole and specific sub-teams should do, we could dramatically scale
up the number of assets an individual user could effectively control. In this paper, we present the results of
an experiment where users controlled up to 30 autonomous assets performing a complex mission. The assets
autonomously worked together using sophisticated teamwork and the user could tell sub-teams to execute team
oriented plans which described the steps required to achieve a team objective without describing exactly which
asset performed which role and without having to specify how the team should handle routine information
sharing, communications and failure circumstances. The users, soldiers from Fort Benning, were surprisingly
good at managing the assets and were all able to complete the complex mission with extremely low friendly and
civilian casualties.
Environmental Awareness for Sensor and Emitter Employment (EASEE) is a flexible, object-oriented software design
for predicting environmental effects on the performance of battlefield sensors and detectability of signal emitters. Its
decision-support framework facilitates many sensor and emitter modalities and can be incorporated into battlespace
command and control (C2) systems. Other potential applications include immersive simulation, force-on-force
simulation, and virtual prototyping of sensor systems and signal-processing algorithms. By identifying and encoding
common characteristics of Army problems involving multimodal signal transmission and sensing into a flexible software
architecture in the Java programming language, EASEE seeks to provide an application interface enabling rapid
integration of diverse signal-generation, propagation, and sensor models that can be implemented in many client-server
environments. Its explicit probabilistic modeling of signals, systematic consideration of many complex environmental
and mission-related factors affecting signal generation and propagation, and computation of statistical metrics
characterizing sensor performance facilitate a highly flexible approach to signal modeling and simulation. EASEE aims
to integrate many disparate statistical formulations for modeling and processing many types of signals, including
infrared, acoustic, seismic, radiofrequency, and chemical/biological. EASEE includes objects for representing sensor
data, inferences for target detection and/or direction, signal transmission and processing, and state information (such as
time and place). Various transmission and processing objects are further grouped into platform objects, which fuse data
to make various probabilistic predictions of interest. Objects representing atmospheric and terrain environments with
varying degrees of fidelity enable modeling of signal generation and propagation in diverse and complex environments.
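The flavour of EASEE's probabilistic sensor-performance prediction can be sketched with a toy detection model: propagate a source level through a propagation loss, then score a probability of detection against a noise floor under a Gaussian signal-excess model. The spherical-spreading loss, thresholds and numbers below are illustrative assumptions, not EASEE's actual models.

```python
# Hedged sketch: Pd from a simple signal-excess model.
from math import erf, log10, sqrt

def prob_detection(source_db, range_m, noise_db, threshold_db=6.0, sigma_db=4.0):
    # Toy propagation: spherical spreading loss of 20*log10(range).
    received_db = source_db - 20.0 * log10(max(range_m, 1.0))
    signal_excess = received_db - noise_db - threshold_db
    # Pd = P(N(signal_excess, sigma_db) > 0), via the Gaussian CDF.
    return 0.5 * (1.0 + erf(signal_excess / (sigma_db * sqrt(2.0))))

near = prob_detection(100.0, 10.0, 40.0)     # strong excess: Pd near 1
far = prob_detection(100.0, 10000.0, 40.0)   # negative excess: Pd near 0
```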
The purpose of the CityBeat @ Tec^Edge program is to improve urban situation awareness through the integration,
visualization and exploitation of geospatial imagery and products with sociocultural information in a layered sensing
architecture. CityBeat applies persistent surveillance from multiple sensors to include wide area airborne and ground
level cameras to learn normal behavior patterns based on object motion. Publicly available GIS and sociocultural
datasets are integrated to provide context for the direct sensor measurements. Anomaly detection algorithms
incorporating normalcy models with observed behavior are being developed to automatically alert an analyst of unusual
behavior for objects of interest.
The Department of Defense (DoD) has established an Unattended Ground Sensor (UGS) Standards Working Group to
address the interoperability of UGS, promote competition, provide enhanced capabilities, and support UGS missions.
Unattended Ground Sensor (UGS) systems have special requirements for long-duration, low-power operation,
exfiltration of sensor reports and imagery over intermittent terrestrial or satellite communications channels, sensor
description, management, discovery, configuration and command-and-control. This paper surveys a number of existing and
proposed software architectures for networked sensors, to include publish/subscribe brokered frameworks, with respect
to the specific features needed for standards and protocols for UGS interoperability.
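The publish/subscribe brokered pattern mentioned above can be reduced to a few lines: producers publish to named topics and the broker fans messages out to subscribers, decoupling sensors from consumers. The topic names and payloads below are invented for illustration.

```python
# Minimal in-process publish/subscribe broker sketch.
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        # Fan the message out to every subscriber of this topic.
        for callback in self._subs[topic]:
            callback(message)

broker = Broker()
reports = []
broker.subscribe("ugs/acoustic", reports.append)
broker.publish("ugs/acoustic", {"sensor": "ugs-17", "event": "vehicle"})
broker.publish("ugs/seismic", {"sensor": "ugs-03", "event": "footstep"})
```

A fielded UGS framework would add the features the survey emphasizes (low-power duty cycling, intermittent links, discovery, security) on top of this core.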
The combination of changing technology in the marketplace and new requirements for UGS will provide a continuing
force for improving the performance of UGS systems in the future. The characterization of UGS as a System has already
transformed UGS from an individual sensor reporting a target detection to a user in proximity to the sensor into the
current paradigm of many UGS units interacting over a network to provide much more target information. As the
technology continues to move forward, the amount of target information and the precision of the information are going
to advance. This paper will provide examples of research underway to meet these future UGS needs and estimates of
when these advances will become deployable systems.
Unattended Ground Sensors (UGS) from a wide range of manufacturers have difficulty interoperating with each other
and common control and dissemination points. Typically, sensor data is transmitted via RF or wired connections to a
central location where the data can be fused together and transmitted further via satellite to a Processing, Exploitation
and Dissemination (PED) system. These PEDs are charged with analyzing the data to create real-time actionable
intelligence for the war fighter. However, when several disparate sensors from different manufacturers are used,
interoperability problems arise. Therefore, a central UGS controller that accepts data from a wide range of sensors and
helps them interoperate is essential. This paper addresses benefits derived from using the Open Geospatial Consortium's
(OGC) Sensor Model Language (SensorML) sensor descriptions for a UGS controller. SensorML 1.0 is an approved
OGC standard and is one of the major components within the OGC Sensor Web Enablement (SWE) suite of standards.
SensorML provides standard models and an XML encoding for describing any process, including the process of
measurement by sensors. By incorporating SensorML, a UGS controller can accept data from various sensors from
different manufacturers, and interpret that data with the SensorML descriptions to allow the controller to take
programmed actions and interoperate between sensors. Furthermore, SensorML can be used to translate the native sensor
formats once the original data has been transmitted to the PED. Therefore, this makes a SensorML enabled UGS
controller an extremely powerful tool that provides situational awareness by combining multiple sensors to form a single
common operational picture (COP).
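The controller idea above, interpreting raw values from an unfamiliar sensor using its machine-readable description, can be sketched as follows. Note the element and attribute names below are a simplified invention, not the actual SensorML 1.0 schema, which is considerably richer.

```python
# Sketch: read a simplified, SensorML-*style* sensor description and use
# it to convert a raw reading into calibrated units. Hypothetical schema.
import xml.etree.ElementTree as ET

DESCRIPTION = """
<sensor id="ugs-acoustic-01">
  <output name="level" units="dB" scale="0.5" offset="30.0"/>
</sensor>
"""

def interpret(description_xml, raw_value):
    out = ET.fromstring(description_xml).find("output")
    calibrated = float(out.get("scale")) * raw_value + float(out.get("offset"))
    return calibrated, out.get("units")

value, units = interpret(DESCRIPTION, 100)
```

The point of the standard is exactly this decoupling: the controller needs no manufacturer-specific code, only the description shipped with the sensor.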
We present an architecture for layered sensing which is constructed on open source and government off-the-shelf
software. This architecture shows how leveraging existing open-source software allows for practical graphical user
interfaces along with the underlying database and messaging architecture to be rapidly assimilated and utilized in real-world
applications. As an example of how this works, we present a system composed of a database and a graphical user
interface which can display wide area motion imagery, ground-based sensor data and overlays from narrow field of view
sensors in one composite image composed of sensor data and other metadata in separate layers on the display. We further
show how the development time is greatly reduced by utilizing open-source software and integrating it into the final
system design. The paper describes the architecture, the pros and cons of the open-source approach, and results for a
layered sensing application with data from multiple disparate sensors.
"The foundation for integrating ISR planning and direction is the information network, including the appropriate ISR
services and applications oriented toward the [commanders] needs. By combining global visibility of available
information and intelligence needs with the tools to maximize platform/sensor/target management, the network will
improve efficiency and maximize persistence. Inherent within this concept is the idea of integrating and synchronizing a
mix of sensing systems and platforms rather than relying on a single system. The second concept embedded within this
concept is the ability to capture the activity/information as it occurs rather than forensically reconstructing after the fact.
This requires the ability for the [commander] to adjust collection priorities of the entire collection suite to a level
appropriate to the activity of interest. Individual sensors, platforms and exploitation nodes will become more efficient as
part of an integrated system. Implementing this fully integrated ISR Enterprise will result in improved persistence, and
ultimately better ISR for the warfighter."[3]
Over the last 6 years, SAIC has been working with CERDEC and AMRDEC to introduce Battle Command aids
supporting (semi) autonomous execution and collaboration of unmanned assets. This paper presents an operational
context and a distributed command and control architecture aiming to reduce workload and increase Persistent ISR
effectiveness. This architecture has been implemented and demonstrated in field tests and as part of the FY'09 C4ISR OTM
testbed.
Sensor Networks and Communications: Joint Session with Conference 7707
Teledyne Scientific Company, the University of California at Santa Barbara (UCSB) and the Army Research Lab
are developing technologies for automated data exfiltration from heterogeneous sensor networks through the Institute
for Collaborative Biotechnologies (ICB). Unmanned air vehicles (UAV) provide an effective means to autonomously
collect data from unattended ground sensors (UGSs) that cannot communicate with each other. UAVs are used to
reduce the system reaction time by generating autonomous data-driven collection routes. Bio-inspired techniques for
search provide a novel strategy to detect, capture and fuse data across heterogeneous sensors. A fast and accurate
method has been developed for routing UAVs and localizing an event by fusing data from a sparse number of UGSs; it
leverages a bio-inspired technique based on chemotaxis or the motion of bacteria seeking nutrients in their environment.
The system was implemented and successfully tested in a high-level simulation environment, using a flight simulator
to emulate a UAV. A field test was also conducted in November 2009 at Camp Roberts, CA using a UAV provided by
AeroMech Engineering. The field test results showed that the system can detect and locate the source of an acoustic
event with an accuracy of about 3 meters average circular error.
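The chemotaxis idea above can be sketched in one dimension: "run" while the sensed intensity keeps increasing, "tumble" (reverse) when it drops, which drives the vehicle toward the source. The intensity field, step size and iteration count below are stand-ins invented for illustration, not the method's actual parameters.

```python
# Bio-inspired run-and-tumble sketch along one axis.
def chemotaxis_1d(intensity, start, step=1.0, iterations=50):
    pos, direction = start, 1.0
    last = intensity(pos)
    for _ in range(iterations):
        pos += direction * step      # run
        now = intensity(pos)
        if now < last:               # reading dropped: tumble (reverse)
            direction = -direction
        last = now
    return pos

source = 20.0
field = lambda x: -abs(x - source)   # toy field peaking at the source
found = chemotaxis_1d(field, start=0.0)
```

In the fielded system the "intensity" comes from fusing sparse UGS readings, and the UAV route is planned in two dimensions; the oscillation around the peak here is the 1-D analogue of the reported few-metre circular error.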
Determination of an optimal configuration (numbers, types, and locations) of a sensor network is an important practical
problem. In most applications, complex signal propagation effects and inhomogeneous coverage preferences lead to an
optimal solution that is highly irregular and nonintuitive. The general optimization problem can be strictly formulated as
a binary linear programming problem. Due to the combinatorial nature of this problem, however, its strict solution
requires significant computational resources (NP-complete class of complexity) and is unobtainable for large spatial
grids of candidate sensor locations. For this reason, a greedy algorithm for approximate solution was recently introduced
[S. N. Vecherin, D. K. Wilson, and C. L. Pettit, "Optimal sensor placement with terrain-based constraints and signal
propagation effects," Unattended Ground, Sea, and Air Sensor Technologies and Applications XI, SPIE Proc. Vol. 7333,
paper 73330S (2009)]. Here further extensions to the developed algorithm are presented to include such practical needs
and constraints as sensor availability, coverage by multiple sensors, and wireless communication of the sensor
information. Both communication and detection are considered in a probabilistic framework. Communication signal and
signature propagation effects are taken into account when calculating probabilities of communication and detection.
Comparison of approximate and strict solutions on reduced-size problems suggests that the approximate algorithm yields quick and good solutions, which thus justifies using that algorithm for full-size problems. Examples of three-dimensional outdoor sensor placement are provided using a terrain-based software analysis tool.
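The greedy approximation referenced above can be sketched with deterministic coverage sets standing in for the probabilistic detection and communication calculations: repeatedly place the candidate that covers the most not-yet-covered grid points until a sensor budget is exhausted. Locations and coverage sets below are invented.

```python
# Greedy max-coverage sketch of approximate sensor placement.
def greedy_placement(candidates, budget):
    """candidates: dict mapping location -> set of covered grid points."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(candidates, key=lambda c: len(candidates[c] - covered))
        if not candidates[best] - covered:
            break  # no remaining candidate adds coverage
        chosen.append(best)
        covered |= candidates[best]
    return chosen, covered

candidates = {
    "hilltop": {1, 2, 3, 4},
    "valley": {3, 4, 5},
    "ridge": {5, 6},
}
chosen, covered = greedy_placement(candidates, budget=2)
```

Note the greedy choice is locally optimal only; the paper's comparison against the strict binary-programming solution is what justifies using it on full-size grids.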
We discuss the development, design, implementation, and demonstration of a robotic UGV (Unmanned Ground Vehicle)
system for networked and non-line-of-sight sensing applications. Our development team comprises AFRL summer
interns, university faculty, and personnel from AFRL. The system concept is based on a previously published technique
known as "Dual-UAV Tandems for Indirect Operator-Assisted Control" [1]. This architecture is based on simulating a
Mini-UAV Helicopter with a building-mounted camera and simulating a low-flying QuadRotor Helicopter with a
Robotics UGV. The Robotics UGV is fitted with a custom-designed sensor boom and a surrogate chem/bio (Carbon
Monoxide) PCB sensor extracted from a COTS (Commercial-Off-The-Shelf) product. The CO Sensor apparatus is co-designed
with the sensor boom and is fitted with a transparent covering for protection and to promote CO (surrogate
chem/bio) flow onto the sensor.
The philosophy behind this non-line-of-sight system is to relay video of the UGV to an Operator station for purposes of
investigating "Indirect Operator-Assisted Control" of the UGV via observation of the relayed EO video at the operator
station. This serves as a form of sensor fusion, giving the operator visual cues about the chemical under detection and
enabling the operator to position the UGV in areas of higher concentration. We recorded this data and analyzed the best approach given a
test matrix of multiple scenarios, with the goal of determining the feasibility of using this layered sensing approach and
the system accuracy in open field tests.
For purposes of collecting scientific data with this system, we developed a Test (data collection) Matrix with the following
three parameters: 1. Chem/Bio detection level with a side-looking sensor boom and a slowly traversing UGV; 2. Chem/Bio
detection level with a panning sensor boom and a slowly traversing UGV; 3. Chem/Bio detection level with a forward-looking
sensor boom and operator-assisted steering based on readings from the UGV's onboard wind vane, overlaid
onto the relayed video. In addition to reporting the trends and results of the analysis of data collected with
this Test Matrix, we discuss potential approaches to upgrading our networked robotics UGV system and also introduce
the concept of "swapping sensors" with this low-cost networked sensor concept.
L-3 Nova's mNet family of miniature networked transceivers provides high data rate (500 kbps) networked
connectivity. The mNet provides coverage from 300 MHz to 2.48 GHz, covering frequencies ideally suited for ground
propagation. mNet offers low power sleep modes and full software control of data rates, modulation settings and RF
power levels. The on-board LNA and PA can be enabled to increase range or disabled to conserve power. The mNet
utilizes ad-hoc mesh networking to form a collaborative sensor network for distributed processing. L-3 Nova offers a
variety of mNet implementation options; some units are no larger than a coin.
The detection and localization of weapon firing on the battlefield can be achieved by means of acoustic waves. The main
objective of this work is to compare various sensing elements that can be integrated into acoustic arrays. Experimental
measurements of sound waves obtained by using some of these elements in Unattended Ground Sensors are presented
for snipers, mortars, and artillery guns. The emphasis is put on the characteristics of the sensing elements needed to
detect and classify the Mach wave generated by a supersonic projectile and the muzzle wave generated by the
combustion of the propellant powder.
Examples of preliminary prototypes are presented to illustrate our topic. We concentrate on a wearable system
intended to improve the soldier's awareness of surrounding threats: this realization consists of a network of three
helmets, each integrating an acoustic array for the detection and localization of snipers.
There exists a current need to rapidly and accurately identify the presence and location of optical imaging devices used
in counter-surveillance activities against U.S. troops deployed abroad. The locations of such devices can be identified
through detection of the optically augmented reflection from them. To address this need, we have developed a novel
optical augmentation (OA) sensor, the Mobile Optical Detection System (MODS), which is uniquely designed to identify
the presence of optical systems of interest. The essential components of the sensor are three spectrally diverse diode
lasers (one ultraviolet, two near-infrared), integrated to produce a single multi-wavelength interrogation beam, and a
charge-coupled-device (CCD) receiver used to detect the retroreflected optical beam returned from a target of interest.
The multi-spectral diode laser illuminator and digital receiver are configured in a pseudo-monostatic arrangement and
are controlled through a customized computer interface.
MODS is unique among OA sensors in that it employs a collection of wavelength-diverse, continuous-wave (CW) diode
laser sources, which facilitates the identification of optical imaging devices used for counter-surveillance activities. In
addition, digital image processing techniques are leveraged to improve clutter rejection concomitant with highly specific
target location (e.g., azimuth and elevation). Moreover, the digital output format makes the sensor amenable to a wide
range of interface options, including computer networks, eyepieces, and remotely located displays linked through
wireless nodes.
Utility-based cross-layer optimization is a valuable tool for resource management in mission-oriented wireless
sensor networks (WSN). The benefits of this technique include the ability to take application- or mission-level
utilities into account and to dynamically adapt to the highly variable environment of tactical WSNs. Recently,
we developed a family of distributed protocols that adapt bandwidth and energy usage in mission-oriented
WSNs in order to optimally allocate resources among multiple missions, which may have specific demands depending
on their priority as well as variable schedules, entering and leaving the network at different times [9-12]. In this
paper, we illustrate the practical applicability of this family of protocols in tactical networks by implementing one
of the protocols, which ensures optimal rate adaptation for congestion control in mission-oriented networks [9], on a
real-time 802.11b network using the ITA Sensor Fabric [13]. The ITA Sensor Fabric is a middleware infrastructure,
developed as part of the International Technology Alliance (ITA) in Network and Information Science [14], to
address the challenges of sensor identification, classification, interoperability, and sensor data sharing,
dissemination, and consumability commonly present in tactical WSNs [15]. Through this implementation, we (i)
study the practical challenges arising from the implementation and (ii) provide a proof of concept regarding
the applicability of this family of protocols for efficient resource management in tactical WSNs amidst
heterogeneous and dynamic sets of sensors, missions, and middleware.
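The rate-adaptation idea behind such protocols can be illustrated with a minimal primal-dual sketch (a toy model, not the paper's protocol; the single shared link, the priority-weighted log utilities, and all parameter values are illustrative assumptions): each mission adjusts its rate purely in response to a congestion price, and the link updates that price from its observed aggregate load.

```python
import numpy as np

def rate_adaptation(priorities, capacity, steps=5000, lr=0.01):
    """Primal-dual sketch of utility-based rate control on one shared link.
    Mission s maximizes w_s * log(x_s) subject to sum(x_s) <= capacity;
    the link's congestion price mu coordinates the distributed sources."""
    w = np.asarray(priorities, dtype=float)
    x = np.ones_like(w)          # initial mission rates
    mu = 0.0                     # link congestion price
    for _ in range(steps):
        # each source reacts only to the price: U'(x) = w/x  =>  x = w/mu
        x = np.clip(w / max(mu, 1e-6), 0.01, capacity)
        # the link raises its price when overloaded, lowers it when idle
        mu = max(0.0, mu + lr * (x.sum() - capacity))
    return x, mu
```

With priorities [1, 2, 3] and capacity 6, the rates converge so that the link is fully used and each mission's share is proportional to its priority, which is the optimal allocation for these utilities.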
The Army Research Laboratory is in partnership with the University of Florida - Electronics Communications
Laboratory to develop compact radar technology and demonstrate that it is scalable to a variety of ultra-lightweight
platforms (<10 lbs.) to meet Army mission needs in persistent surveillance, unattended ground sensor (UGS),
unmanned systems, and man-portable sensor applications. The advantage of this compact radar is its steerable beam
technology and relatively long-range capability compared to other small, battery-powered radar concepts. This
paper reviews the ongoing development of the sensor and presents a sample of the data collected thus far.
This paper details the continued development of a modularized system level model of a sparse detector
sensor system. The assumptions used to simplify the equations describing the effects of individual system
components and characteristics such as target to background properties, collection optics, detectors, and
classifiers will be detailed and modeled. These individual effects will then be combined to provide an
overall system performance model and used to compare two sensor node designs.
The model will facilitate design trade-offs for Unattended Ground Sensors. The size and power restrictions
of these sensors often preclude them from being effective in high-resolution applications such as
target identification. However, these systems are well suited for applications such as broad-scale
classification or differentiation between targets such as humans, animals, or small vehicles. Therefore,
demand for these sensors is increasing in both the military and homeland security domains.
This paper provides a feasibility analysis and details of implementing a classification algorithm on an embedded
controller for use with a profiling sensor. Such a profiling sensor has been shown to be a feasible approach to a low-cost
persistent surveillance sensor for classifying moving objects such as humans, animals, or vehicles. The sensor produces
data that can be used to generate object profiles as crude images or silhouettes, and/or the data can be subsequently
automatically classified. This paper provides a feasibility analysis of a classification algorithm implemented on an
embedded controller, which is packaged with a prototype version of a profiling sensor. Implementation of the embedded
controller is a necessary extension of previous work for fielded profiling sensors and their appropriate applications.
Field data is used to confirm accurate automated classification.
This paper describes ongoing research by Georgia Tech into the challenges of tasking and controlling
heterogeneous teams of unmanned vehicles in mixed indoor/outdoor reconnaissance scenarios. We outline the tools and
techniques necessary for an operator to specify, execute, and monitor such missions. The mission specification
framework used for intelligence gathering during mission execution is first demonstrated in simulations
involving a team of a single autonomous rotorcraft and three ground-based robotic platforms. Preliminary results
with robotic hardware in the loop are also provided.
In order to anticipate the constantly changing landscape of global warfare, the United States Air Force must acquire new
capabilities in the field of Intelligence, Surveillance, and Reconnaissance (ISR). To meet this challenge, the Air Force
Research Laboratory (AFRL) is developing a unifying construct of "Layered Sensing" which will provide military
decision-makers at all levels with the timely, actionable, and trusted information necessary for complete battlespace
awareness. Layered Sensing is characterized by the appropriate combination of sensors and platforms (including those
for persistent sensing), infrastructure, and exploitation capabilities to enable this synergistic awareness.
To achieve the Layered Sensing vision, AFRL is pursuing a Modeling & Simulation (M&S) strategy through the
Layered Sensing Operations Center (LSOC). An experimental ISR system-of-systems test-bed, the LSOC integrates
DoD standard simulation tools with commercial, off-the-shelf video game technology for rapid scenario development
and visualization. These tools will help facilitate sensor management performance characterization, system
development, and operator behavioral analysis. Flexible and cost-effective, the LSOC will implement a non-proprietary,
open-architecture framework with well-defined interfaces. This framework will incentivize the transition of current ISR
performance models to service-oriented software design for maximum re-use and consistency. This paper will present
the LSOC's development and implementation thus far as well as a summary of lessons learned and future plans for the
LSOC.
A current trend that is gaining strength in the wireless sensor network area is the use of heterogeneous sensor nodes in
one coordinated overall network, needed to fulfill the requirements of sophisticated emerging applications, such as area
surveillance systems. One of the main concerns when developing such sensor networks is how to provide coordination
among the heterogeneous nodes so that they can efficiently respond to user needs. This study investigates strategies
for coordinating a set of static sensor nodes on the ground with wirelessly connected
Unmanned Aerial Vehicles (UAVs) carrying a variety of sensors, in order to provide efficient surveillance over an area
of interest. The sensor nodes on the ground issue alarms on the occurrence of a given event of interest, e.g.,
the entrance of an unauthorized vehicle into the area, while the UAVs receive the issued alarms and must decide which of
them is the most suitable to handle each alarm. A bio-inspired coordination strategy based on the concept of
pheromones is presented. As a complement to this strategy, a utility-based decision-making approach is proposed.
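The two mechanisms can be sketched together in a few lines (a toy model under assumed names, formulas, and constants, not the strategy evaluated in this study): an evaporating pheromone map marks recently surveyed cells, and a simple priority-over-distance utility decides which UAV takes an alarm.

```python
import numpy as np

EVAPORATION = 0.9  # assumed per-tick decay of the "visited" pheromone

def tick_pheromone(grid, uav_positions, deposit=1.0):
    """Evaporate the whole pheromone map, then deposit at each UAV's
    current cell; low-pheromone cells are the least recently surveyed."""
    grid *= EVAPORATION
    for r, c in uav_positions:
        grid[r, c] += deposit
    return grid

def assign_alarm(alarm_pos, alarm_priority, uav_positions):
    """Utility-based decision: each UAV scores the alarm as priority over
    travel distance; the best-scoring UAV handles it."""
    scores = [alarm_priority / (1.0 + np.hypot(r - alarm_pos[0], c - alarm_pos[1]))
              for r, c in uav_positions]
    return int(np.argmax(scores))
```

In a full system each UAV would compute its own score and the winner would be chosen by a distributed auction; the centralized argmax here just keeps the sketch short.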
In this paper we present a new approach to the real-time generation and dissemination of steerable video chips from
large volume motion imagery streams. Traditional large frame motion imagery streaming and dissemination systems
employ JPEG 2000 (J2K) compression and associated JPEG 2000 Interactive Protocol (JPIP) streaming to encode and
deliver images over varying bandwidth communication channels. While J2K and JPIP technologies are suitable for many
large frame motion imagery applications, they often struggle to satisfy the needs of certain low power, low bandwidth
users. The J2K format does not currently support inter-frame compression and, therefore, cannot target the lowest
bandwidth motion imagery users. Additionally, J2K decompression and JPIP processing both consume more
computational resources than low-end client systems often have available. This is especially true for handheld and thin-client
devices. We address these issues by integrating region-of-interest J2K compression and JPIP streaming with
MPEG-2 and H.264 video compression technology, taking advantage of the ubiquitous hardware acceleration and client
ingest support for these full motion video product formats. The proposed architecture maintains all the benefits of
incorporating a J2K archival format, while also boasting the ability to disseminate J2K regions-of-interest and low
resolution overviews to an even greater number of simultaneous clients. We illustrate a real-time integration and
implementation of these technologies and show how they can be used to enable interactive and automated tracking and
dissemination of multiple moving objects from wide area persistent surveillance motion imagery.
As the number of sensing assets and tactical command and control (C2) systems grow, the need for a centralized means
of collecting and disseminating the crucial information grows as well. Over the past 5 years, Honeywell has created a
software application known as the Network-Enabled Operator Station (NEOS) to answer this need. NEOS has been
developed from the ground up to be an open-architecture solution which integrates a variety of assets, communications
systems and protocols, and data sharing techniques. The ultimate goals are to increase friendly situational awareness and
increase the effectiveness of field operators.
Multiple sensors with multiple modalities are routinely deployed in forward areas to gain situational
awareness. Some of these are activity detection sensors, such as acoustic, seismic, passive infrared (PIR), and
magnetic sensors, which normally consume little power. These sensors often cue or wake up more power-hungry sensors,
such as imaging sensors (visible and infrared cameras) and radar, to either capture a picture or track a
target of interest. Several airborne sensors routinely gather information on an area of interest using radar and imaging
sensors for intelligence, surveillance, and reconnaissance (ISR) purposes. Recently, Empire Challenge has introduced a new
concept: harvesting ISR data from remotely distributed unattended ground sensors. Here an aerial vehicle
flies over the area occasionally and queries whether the sensors have any data to be harvested. Harvesting large amounts
of data is unnecessary and impractical, so some amount of fusion of the sensor data is essential.
In the modern networked battlefield, network centric warfare (NCW) scenarios need to interoperate between shared
resources and data assets such as sensors, UAVs, satellites, ground vehicles, and command and control (C2/C4I)
systems. By linking and fusing platform routing information, sensor exploitation results, and databases (e.g. Geospatial
Information Systems [GIS]), the shared situation awareness and mission effectiveness will be improved. Within the
information fusion community, various research efforts are looking at open standard approaches to composing the
heterogeneous network components under one framework for future modeling and simulation applications. By utilizing
open-source, service-oriented architecture (SOA)-based sensor web services and GIS visualization services, we
propose a framework that enables fast prototyping of intelligence, surveillance, and reconnaissance (ISR) system
simulations to determine an asset mix for a desired mission effectiveness, to model performance for sensor
management and prediction, and to support user testing of various scenarios.
Unattended Ground Sensors have found widespread use in force and asset protection, border patrol, and drug
enforcement. In recent years their application has extended into ground and air surveillance, providing additional data
from disparate networked Intelligence, Surveillance, and Reconnaissance resources. The consolidation of this data and its
effective presentation through software applications efficiently communicates the critical information that helps the
analyst support persistent surveillance missions. This paper presents the interfaces of such flexible applications with an
emphasis on their presentation elements and information content.
The current bottleneck in wide-area persistent surveillance missions is slow exploitation and analysis (real-time and forensic) by human analysts. We are currently developing an automated data exploitation system that can detect, track, and recognize targets and threats using computer vision. Here we present results from a newly developed target detection process. Depending on target size, target detection can be divided into three detection classes: unresolved targets, small extended targets, and large extended targets. The Matched Filter (MF) method is currently a popular approach for unresolved target detection using IR focal plane arrays and EO (CCD) cameras and sensor detectors. The MF method is much more difficult to apply to the extended target classes, since many different matched filters are needed to match the different target shapes and intensity profiles that can exist, and it does not adequately address non-fixed target shapes (e.g., a walking or running human). We have developed an approach for robust target detection that can detect targets of different sizes and shapes (fixed and non-fixed) using a combination of image frame time-differencing, deep thresholding, and target shape and size analysis with non-linear morphological operations. Applications for ground vehicle detection under heavy urban background clutter are presented.
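A differencing/thresholding/morphology chain of the kind described can be sketched in pure NumPy (a simplified illustration, not the system's actual code; the 3x3 structuring element, the mean-plus-k-sigma threshold, and the minimum-area value are assumptions):

```python
import numpy as np

def detect_moving_targets(prev, curr, k_sigma=3.0, min_area=4):
    """Frame differencing + global threshold + binary opening (erosion
    then dilation with a 3x3 structuring element)."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    thresh = diff.mean() + k_sigma * diff.std()   # adaptive global threshold
    mask = diff > thresh

    def erode(m):
        out = m.copy()
        for dr in (-1, 0, 1):                     # min over 3x3 neighborhood
            for dc in (-1, 0, 1):
                out &= np.roll(np.roll(m, dr, 0), dc, 1)
        return out

    def dilate(m):
        out = m.copy()
        for dr in (-1, 0, 1):                     # max over 3x3 neighborhood
            for dc in (-1, 0, 1):
                out |= np.roll(np.roll(m, dr, 0), dc, 1)
        return out

    opened = dilate(erode(mask))                  # opening removes speckle
    return opened if opened.sum() >= min_area else np.zeros_like(opened)
```

The opening removes isolated clutter pixels while preserving compact target blobs; the size test then discards blobs too small to be targets of interest.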
We present a novel method to quickly detect and track objects of low resolution within an image frame by comparing
dense, oriented gradient features at multiple scales within an object chip. The proposed method uses vector correlation
between sets of oriented Haar filter responses from within a local window and an object library to create similarity
measures, where peaks indicate high object probability. Interest points are chosen based on object shape and size so that
each point represents both a distinct spatial location and the shape segment of the object. Each interest point is then
independently searched in subsequent frames, where multiple similarity maps are fused to create a single object
probability map. This method executes in real time by reducing feature calculations and approximations using box
filters and integral images. We achieve invariance to rotation and illumination, because we calculate interest point
orientation and normalize the feature vector scale. The method creates a feature set from a small and localized area,
allowing for accurate detections in low resolution scenarios. This approach can also be extended to include the detection
of partially occluded objects through calculating individual interest point feature vector correlations and clustering points
together. We have tested the method on a subset of the Columbus Large Image Format (CLIF) 2007 dataset, which
provides various low-pixel-on-object moving and stationary vehicles with varying operating conditions. This method
provides accurate results with minimal parameter tuning for robust implementation on aerial, low pixel-on-object data
sets for automated classification applications.
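The box-filter machinery behind this kind of method can be sketched as follows (an illustrative fragment under assumed conventions, not the authors' implementation): an integral image gives O(1) box sums, two Haar responses approximate oriented gradients over a window, and normalized correlation of feature vectors supplies the illumination-invariant similarity score.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero top row/left column for easy box sums."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(0).cumsum(1)
    return ii

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) from the integral image."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

def haar_features(ii, r, c, s):
    """Two oriented Haar responses (vertical- and horizontal-edge) for an
    s-by-s window at (r, c): box-filter approximations of gradients."""
    h = box_sum(ii, r, c + s // 2, r + s, c + s) - box_sum(ii, r, c, r + s, c + s // 2)
    v = box_sum(ii, r + s // 2, c, r + s, c + s) - box_sum(ii, r, c, r + s // 2, c + s)
    return np.array([h, v])

def similarity(f1, f2, eps=1e-9):
    """Normalized (cosine) correlation of two feature vectors - the scale-
    and illumination-invariant score used to build the object probability map."""
    return float(f1 @ f2 / (np.linalg.norm(f1) * np.linalg.norm(f2) + eps))
```

In the full method, such features are computed at several interest points and scales per object chip, and the per-point similarity maps are fused into a single object probability map.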
Optical flow-based tracking methods offer the promise of precise, accurate, and reliable analysis of motion, but they
suffer from several challenges such as elimination of background movement, estimation of flow velocity, and optimal
feature selection. Wavelet approximations can offer similar benefits and retain spatial information at coarser scales,
while optical flow estimation increases with the reduction of finer details of moving objects. Optical flow methods often
suffer from significant computational overload. In this study, we have investigated the necessary processing steps to
increase detection and estimation accuracy, while effectively reducing computation time through the reduction of the
image frame size. We have implemented an object tracking algorithm using the optical flow calculated from a phase
change between representative coarse wavelet coefficients in subsequent image frames. We have also compared
phase-based optical flow with two versions of intensity-based optical flow to determine which method produces superior
results under specific operational conditions. The investigation demonstrates the feasibility of using phase-based optical
flow with wavelet approximations for object detection and tracking of low resolution aerial vehicles. We also
demonstrate that this method can work in tandem with feature-based tracking methods to increase tracking accuracy.
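The core idea, displacement recovered from a phase change between corresponding coefficients, can be illustrated in one dimension with Fourier phase (a simplified stand-in for the wavelet-domain computation; the dominant-bin choice and all values are assumptions):

```python
import numpy as np

def phase_shift_1d(sig_a, sig_b):
    """Estimate the (possibly sub-pixel) translation between two 1-D
    signals from the phase difference of their dominant Fourier component."""
    A, B = np.fft.rfft(sig_a), np.fft.rfft(sig_b)
    k = np.argmax(np.abs(A[1:])) + 1             # dominant non-DC frequency bin
    dphi = np.angle(B[k]) - np.angle(A[k])
    dphi = (dphi + np.pi) % (2 * np.pi) - np.pi  # wrap phase into (-pi, pi]
    n = len(sig_a)
    return -dphi * n / (2 * np.pi * k)           # phase change -> displacement
```

Because a shift by d multiplies bin k by exp(-2*pi*i*k*d/n), the phase difference converts directly to a displacement; for a sinusoid rolled by 5 samples the estimate recovers the 5-pixel shift.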
This paper presents object profile classification results using range- and speed-independent features from an infrared
profiling sensor. The passive infrared profiling sensor was simulated using an LWIR camera. Field data collected near the
US-Mexico border to yield profiles of humans and animals are reported. Range- and speed-independent features based on
height and width of the objects were extracted from profiles. The profile features were then used to train and test three
classification algorithms to classify objects as humans or animals. The performance of Naïve Bayesian (NB), K-Nearest
Neighbors (K-NN), and Support Vector Machines (SVM) are compared based on their classification accuracy. Results
indicate that for our data set all three algorithms achieve classification rates of over 98%. The field data is also used to
validate our prior data collections from more controlled environments.
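Minimal versions of two of the compared classifiers are easy to sketch (illustrative only; the feature values below are invented stand-ins for the height- and width-based profile features, not the paper's data):

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=3):
    """K-Nearest Neighbors: majority vote of the k closest training samples."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

def naive_bayes_classify(train_X, train_y, x):
    """Gaussian Naive Bayes: per-class mean/variance of each feature,
    pick the class with the highest log-likelihood."""
    best, best_ll = None, -np.inf
    for cls in np.unique(train_y):
        Xc = train_X[train_y == cls]
        mu, var = Xc.mean(0), Xc.var(0) + 1e-6
        ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        if ll > best_ll:
            best, best_ll = cls, ll
    return best
```

Trained on a few hypothetical height-to-width ratios (label 1 = human, 0 = animal), both sketches separate tall, narrow human profiles from wide animal profiles.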