A sophisticated real-time architecture for capturing relevant battlefield information about personnel and terrestrial events from a network of mast-based imaging and unmanned aerial systems (UAS), with target detection, tracking, classification and visualization, is presented. Persistent surveillance of personnel and vehicles is achieved using a unique spatially and temporally invariant motion detection and tracking algorithm for mast-based cameras in combination with aerial remote sensing to autonomously monitor unattended ground-based sensor networks. UAS autonomous routing is achieved using bio-inspired algorithms that mimic how bacteria locate nutrients in their environment. Results include field test data, performance and lessons learned. The technology also has application to detecting and tracking low observables (manned and UAS), counter-MANPADS, airport bird detection and search and rescue operations.
Classifying acoustic signals detected by distributed sensor networks is a difficult problem due to the wide variations
that can occur in the propagation of signals from terrestrial, subterranean, seismic and aerial events. An acoustic event classifier was
developed that uses particle swarm optimization to perform a flexible time correlation of a sensed acoustic signature to
reference data. In order to mitigate the effects from interference such as multipath, the classifier fuses signatures from
multiple sensors to form a composite sensed acoustic signature and then automatically matches the composite signature
with reference data. The approach can classify all types of acoustic events but is particularly well suited to explosive
events such as gunshots, mortar blasts and improvised explosive devices that produce an acoustic signature having a
shock wave component that is aperiodic and non-linear. The classifier was applied to field data and yielded excellent
results in terms of reconstructing degraded acoustic signatures from multiple sensors and in classifying disparate
acoustic events.
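As a concrete illustration of the flexible time correlation, the sketch below uses a particle swarm to search for the time shift that best aligns a sensed signature with a reference signature. It is a minimal sketch assuming a single sensor and a pure shift; the composite multi-sensor fusion and time warping described above are omitted, and all function and parameter names are illustrative.

```python
import numpy as np

def pso_time_align(sensed, reference, n_particles=30, n_iter=60,
                   max_shift=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Particle swarm search for the time shift that best correlates a
    sensed acoustic signature (1-D array) with a reference signature.
    A generic sketch of the idea, not the fielded classifier."""
    rng = np.random.default_rng(seed)

    def score(shift):
        s = np.roll(sensed, int(round(float(shift))))
        num = float(np.dot(s, reference))
        den = np.linalg.norm(s) * np.linalg.norm(reference) + 1e-12
        return num / den                     # normalized correlation

    pos = rng.uniform(-max_shift, max_shift, n_particles)
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([score(p) for p in pos])
    g = pbest[np.argmax(pbest_val)]
    g_val = pbest_val.max()

    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, -max_shift, max_shift)
        vals = np.array([score(p) for p in pos])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        if vals.max() > g_val:
            g, g_val = pos[np.argmax(vals)], vals.max()
    return g, g_val                          # best shift, best match score
```

In practice the best shift and its correlation score for each entry in a library of reference signatures could feed a simple nearest-reference or thresholded decision, which is one way the matching step can be closed out.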
A team consisting of Teledyne Scientific Company, the University of California at Santa Barbara, the Army
Research Laboratory, the Engineer Research and Development Center, and IBM UK is developing technologies in
support of automated data exfiltration from heterogeneous battlefield sensor networks to enhance situational awareness
for dismounts and command echelons. Unmanned aerial vehicles (UAV) provide an effective means to autonomously
collect data from a sparse network of unattended ground sensors (UGSs) that cannot communicate with each other.
UAVs are used to reduce the system reaction time by generating autonomous collection routes that are data-driven. Bio-inspired
techniques for autonomous search provide a novel strategy to detect, capture and fuse data from heterogeneous
sensor networks. The bio-inspired algorithm is based on chemotaxis or the motion of bacteria seeking nutrients in their
environment. Field tests of a bio-inspired system that routed UAVs were conducted in June 2011 at Camp Roberts, CA.
The results showed that such a system can autonomously detect and locate the source of terrestrial events with very
high accuracy and visually verify the event. The tests included the use of multiple autonomously controlled UAVs,
detection and disambiguation of multiple acoustic events occurring in short time frames, optimal sensor placement based
on local phenomenology and the use of the International Technology Alliance (ITA) Sensor Network Fabric. The system
demonstrated TRL 6 performance in the field at Camp Roberts.
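The chemotaxis analogy described above can be made concrete with a run-and-tumble loop: the collector keeps flying on its current heading while the sensed event intensity improves, and picks a new random heading when it does not. The sketch below is a generic illustration under that assumption, not the fielded routing algorithm; the intensity function, step size and names are placeholders.

```python
import math
import random

def chemotaxis_route(intensity, start, step=25.0, n_steps=400):
    """Run-and-tumble search inspired by bacterial chemotaxis.

    intensity(x, y) is any scalar field that grows toward the event
    source (e.g., fused acoustic energy from the UGS network)."""
    x, y = start
    heading = random.uniform(0.0, 2.0 * math.pi)
    last = intensity(x, y)
    path = [(x, y)]
    for _ in range(n_steps):
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        now = intensity(x, y)
        if now <= last:                      # reading got worse: tumble
            heading = random.uniform(0.0, 2.0 * math.pi)
        last = now                           # otherwise keep running
        path.append((x, y))
    return path
```

With a field that peaks at the event source, for example intensity = lambda x, y: -math.hypot(x - 900.0, y - 400.0), the path drifts toward (900, 400) even though only local readings are used, which is the appeal of the approach for data-driven collection routes.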
A team consisting of Teledyne Scientific Company, the University of California at Santa Barbara and the Army
Research Laboratory is developing technologies in support of automated data exfiltration from heterogeneous
battlefield sensor networks to enhance situational awareness for dismounts and command echelons. Unmanned aerial
vehicles (UAV) provide an effective means to autonomously collect data from a sparse network of unattended ground
sensors (UGSs) that cannot communicate with each other. UAVs are used to reduce the system reaction time by
generating autonomous collection routes that are data-driven. Bio-inspired techniques for search provide a novel
strategy to detect, capture and fuse data. A fast and accurate method has been developed to localize an event by fusing
data from a sparse number of UGSs. This technique uses a bio-inspired algorithm based on chemotaxis or the motion of
bacteria seeking nutrients in their environment. A unique acoustic event classification algorithm was also developed
based on swarm optimization. Additional studies addressed the problem of routing multiple UAVs, optimally
placing sensors in the field and locating the source of gunfire directed at helicopters. A field test was conducted in November
2009 at Camp Roberts, CA. The field test results showed that a system controlled by bio-inspired software algorithms
can autonomously detect and locate the source of an acoustic event with very high accuracy and visually verify the
event. In nine independent test runs of a UAV, the system autonomously located the position of an explosion nine
times with an average accuracy of 3 meters. The time required to perform source localization using the UAV was on
the order of a few minutes based on UAV flight times. In June 2011, additional field tests of the system will be
performed and will include multiple acoustic events, optimal sensor placement based on acoustic phenomenology and
the use of the International Technology Alliance (ITA) Sensor Network Fabric (IBM).
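One simple way to picture the localization-by-fusion step described above is to fit an inverse-square amplitude model to the measured event energies from a handful of ground sensors. The grid search below is a hedged, generic sketch of that idea and is not the program's chemotaxis-based localizer; the spreading model, grid extents and names are assumptions.

```python
import numpy as np

def localize_event(sensor_xy, amplitudes, extent=2000.0, spacing=10.0):
    """Grid search for the source position that best explains measured
    amplitudes under an assumed 1/d^2 spreading model.

    sensor_xy  : (n, 2) UGS positions in meters
    amplitudes : (n,) measured event energies at each UGS"""
    xs = np.arange(0.0, extent, spacing)
    ys = np.arange(0.0, extent, spacing)
    best, best_err = None, np.inf
    for x in xs:
        for y in ys:
            d2 = (sensor_xy[:, 0] - x) ** 2 + (sensor_xy[:, 1] - y) ** 2 + 1.0
            pred = 1.0 / d2
            # least-squares source strength for this candidate position
            a = float(np.dot(pred, amplitudes) / np.dot(pred, pred))
            err = float(np.sum((amplitudes - a * pred) ** 2))
            if err < best_err:
                best, best_err = (x, y), err
    return best
```

Even with only three or four sensors reporting, the misfit surface typically has a clear minimum near the true source, which is what makes a sparse UGS deployment workable for cueing a UAV.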
Teledyne Scientific Company, the University of California at Santa Barbara (UCSB) and the Army Research Lab
are developing technologies for automated data exfiltration from heterogeneous sensor networks through the Institute
for Collaborative Biotechnologies (ICB). Unmanned air vehicles (UAV) provide an effective means to autonomously
collect data from unattended ground sensors (UGSs) that cannot communicate with each other. UAVs are used to
reduce the system reaction time by generating autonomous data-driven collection routes. Bio-inspired techniques for
search provide a novel strategy to detect, capture and fuse data across heterogeneous sensors. A fast and accurate
method has been developed for routing UAVs and localizing an event by fusing data from a sparse number of UGSs; it
leverages a bio-inspired technique based on chemotaxis or the motion of bacteria seeking nutrients in their environment.
The system was implemented and successfully tested in a high-level simulation environment that used a flight simulator
to emulate a UAV. A field test was also conducted in November 2009 at Camp Roberts, CA using a UAV provided by
AeroMech Engineering. The field test results showed that the system can detect and locate the source of an acoustic
event with an accuracy of about 3 meters average circular error.
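The reported accuracy figure can be reproduced from logged data as the mean radial miss distance between estimated and surveyed event positions. The helper below assumes that reading of "average circular error"; the exact metric definition used in the test report is not given here.

```python
import numpy as np

def average_circular_error(estimated_xy, truth_xy):
    """Mean radial miss distance, in the same units as the inputs.
    Assumes 'average circular error' means the mean 2-D miss radius."""
    est = np.asarray(estimated_xy, dtype=float)
    tru = np.asarray(truth_xy, dtype=float)
    return float(np.mean(np.linalg.norm(est - tru, axis=1)))
```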
The Army currently employs heterogeneous unattended ground sensors (UGSs) in a sparse deployment to maximize coverage, minimize pilferage and monitor terrain bottlenecks. A team consisting of Teledyne Scientific Company, the University of California at Santa Barbara and the US Army Research Laboratory (ARL) is developing
technologies in support of automated data exfiltration from heterogeneous battlefield sensor networks as part of a US
Army contract with the Institute for Collaborative Biotechnologies (ICB). The ICB program is developing a new system of novel bio-inspired software algorithms for autonomous operations that leverages proven research to monitor sensor networks from extended ranges, collect data in a timely fashion,
collaboratively control the motion of a sparse network of collectors (e.g., UAVs) using bio-inspired sampling,
accurately detect and localize field events, and fuse and classify sensed data. A new bio-inspired event discovery
technique will enable fusion of sensor observations at low SNR without requiring a prior model for the event signature;
this is a first step towards sensor networks that are capable of learning. The program will also provide both laboratory
and field demonstrations of these capabilities supported through ARL by leveraging available resources.
Hyperspectral image sets are three-dimensional data volumes that are difficult to exploit by manual means because they are composed of multiple bands of image data that are not easily visualized or assessed. GTE Government Systems Corporation has developed a system that utilizes evolutionary computing techniques to automatically identify materials in terrain hyperspectral imagery. The system employs sophisticated signature preprocessing and a unique combination of non-parametric search algorithms guided by a model-based cost function to achieve rapid convergence and pattern recognition. The system is scalable and is capable of discriminating and identifying pertinent materials that comprise a specific object of interest in the terrain and estimating the percentage of materials present within a pixel of interest (spectral unmixing). The method has been applied and evaluated against real hyperspectral imagery data from the AVIRIS sensor. In addition, the process has been applied to remotely sensed infrared spectra collected at the microscopic level to assess the amounts of DNA, RNA and protein present in human tissue samples as an aid to the early detection of cancer.
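A minimal sketch of evolutionary spectral unmixing is shown below: a small population of candidate abundance vectors is evolved to minimize the reconstruction error between a pixel's spectrum and a linear mix of library endmembers. It illustrates the general idea only; the GTE system's preprocessing, search operators and model-based cost function are not reproduced, and all names and parameters are placeholders.

```python
import numpy as np

def unmix_pixel(pixel, endmembers, pop=40, gens=200, sigma=0.05, seed=0):
    """Evolutionary estimate of per-pixel material fractions.

    pixel      : (bands,) measured spectrum
    endmembers : (materials, bands) library spectra
    Returns an abundance vector (non-negative, sums to 1) minimizing
    squared reconstruction error -- a generic sketch, not the GTE code."""
    rng = np.random.default_rng(seed)
    m = endmembers.shape[0]

    def normalize(a):
        a = np.clip(a, 0.0, None) + 1e-12
        return a / a.sum(axis=-1, keepdims=True)

    def cost(a):
        recon = a @ endmembers               # (pop, bands) reconstructions
        return np.sum((recon - pixel) ** 2, axis=-1)

    parents = normalize(rng.random((pop, m)))
    for _ in range(gens):
        children = normalize(parents + sigma * rng.standard_normal((pop, m)))
        both = np.vstack([parents, children])
        parents = both[np.argsort(cost(both))[:pop]]   # keep the fittest
    return parents[0]
```

The returned fractions are the "percentage of materials present within a pixel" in the sense used above, up to the linear mixing assumption built into the cost.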
Multispectral and hyperspectral image sets contain large amounts of data that are difficult to exploit by manual means because they are composed of multiple bands of image data that are not easily visualized or assessed. Non-literal imagery exploitation refers to a process that exploits non-spatial information by focusing on individual pixel signatures that span the spectral range of the sensor. GTE has developed a system that utilizes evolutionary computing techniques as a potential aid to imagery analysts to perform automatic object detection, recognition and materials identification on multispectral and hyperspectral imagery. The system employs sophisticated signature preprocessing and a unique combination of non-parametric search algorithms guided by a model-based cost function to achieve rapid convergence and pattern recognition. The system is scalable and is capable of discriminating decoys from real objects, identifying pertinent materials that comprise a specific object of interest and estimating the percentage of materials present within a pixel of interest.
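The non-literal, per-pixel view can be illustrated with a simple library match: each pixel signature is scored against reference material signatures and labeled with the closest one. The spectral angle used below is a common choice shown for illustration only; it stands in for, and is not, the model-based cost function described above.

```python
import numpy as np

def spectral_angle(a, b, eps=1e-12):
    """Angle between two spectra; small angles mean similar materials."""
    c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def label_pixels(cube, library):
    """Assign each pixel the index of the nearest library signature.

    cube    : (rows, cols, bands) image data
    library : (materials, bands) reference signatures
    A brute-force illustration of per-pixel, non-literal exploitation."""
    rows, cols, _ = cube.shape
    labels = np.zeros((rows, cols), dtype=int)
    for r in range(rows):
        for c in range(cols):
            angles = [spectral_angle(cube[r, c], ref) for ref in library]
            labels[r, c] = int(np.argmin(angles))
    return labels
```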
Prior methods for tactical target reacquisition after a loss of track have used tree classifiers and template matchers. Examples of prior techniques include classifiers that are trained with a priori data, which makes them somewhat intolerant to temporal and dynamic changes in the target pattern. Prior methods for reacquisition generally rely on proximity and area-based schemes. The disadvantages of these methods include their dependence on accurate and consistent segmentation in cluttered scenarios and their need for precise target position prediction. The random mapping network reacquisition algorithm (RMNRA) offers a solution using a pattern memory and sparse feature matching technique. RMNRA assists the imaging tracker and improves tracking tenacity by reacquiring a tracked target after a loss of track has occurred. The reacquisition algorithm uses an associative memory to perform target pattern matching. The pattern matching technique is unique in that it is tolerant to some of the ambiguities that occur with classical template pattern matchers. Weighted pattern feature vectors are stored in a memory matrix to facilitate the matching of sensed and reference patterns dynamically over time. In addition, a sophisticated algorithm was designed to update the memory matrix to forget prior patterns as the target signature becomes stale over time and space. The algorithm has been implemented in real-time hardware and flight tested with an infrared sensor. The algorithm is discussed and results using real IR imagery are shown.
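A heavily simplified picture of the memory-matrix idea is given below: weighted feature vectors are accumulated into a matrix by outer products, old contributions decay so stale target signatures are gradually forgotten, and a sensed pattern is matched by recalling through the matrix and comparing with cosine similarity. This is an assumed, generic associative memory sketch, not the RMNRA implementation; the decay constant and dimensions are placeholders.

```python
import numpy as np

class PatternMemory:
    """Linear associative memory with exponential forgetting.

    A generic sketch of storing weighted feature vectors in a memory
    matrix and matching sensed patterns against it (not the RMNRA)."""

    def __init__(self, dim, decay=0.95):
        self.M = np.zeros((dim, dim))
        self.decay = decay                   # < 1: older patterns fade

    def store(self, feature, weight=1.0):
        f = feature / (np.linalg.norm(feature) + 1e-12)
        self.M = self.decay * self.M + weight * np.outer(f, f)

    def match(self, sensed):
        """Cosine similarity between a sensed pattern and its recall;
        a high score suggests the tracked target has been reacquired."""
        s = sensed / (np.linalg.norm(sensed) + 1e-12)
        recall = self.M @ s
        return float(np.dot(recall, s) / (np.linalg.norm(recall) + 1e-12))
```

A tracker could call store() each frame while lock is held and, after a loss of track, call match() on candidate detections, declaring reacquisition when the score exceeds a threshold.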
KEYWORDS: Image filtering, Sensors, Point spread functions, Image processing, Digital image processing, Signal to noise ratio, Signal detection, Image restoration, Image sensors, Infrared sensors
We consider the problem of point source target detection. It is desirable to remove sensor degradation from images to obtain a better object representation. Two techniques are considered: a simple four-pixel average and a more complex maximum entropy deconvolution approach. Each of the methods performs well under certain conditions. Our motivation is based on applications such as: (1) weapon systems for the Strategic Defense Initiative that require a capability to engage a multiple-target threat at long range and (2) star tracking systems for satellite guidance.
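The two techniques can be sketched as follows, with assumptions stated in the comments: the 2x2 stencil is one plausible reading of the "simple four-pixel average", and the deconvolution is a generic entropy-regularized gradient scheme rather than the paper's exact maximum entropy solver.

```python
import numpy as np

def four_pixel_average(img):
    """Average each pixel with its right, lower and lower-right
    neighbours (one plausible 2x2 reading of a 'four-pixel average')."""
    out = img.astype(float).copy()
    out[:-1, :-1] = (img[:-1, :-1] + img[1:, :-1] +
                     img[:-1, 1:] + img[1:, 1:]) / 4.0
    return out

def _convolve(a, k):
    """Same-size circular convolution via FFT with the kernel centred."""
    pad = np.zeros_like(a, dtype=float)
    kh, kw = k.shape
    pad[:kh, :kw] = k
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(pad)))

def mem_deconvolve(g, psf, lam=0.01, step=0.1, n_iter=300, eps=1e-8):
    """Entropy-regularised deconvolution by projected gradient descent.

    Minimises 0.5*||g - f*psf||^2 + lam*sum(f*log f) over f > 0.
    A generic sketch of the idea, not the paper's maximum entropy code."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1, ::-1]
    f = np.full(g.shape, max(float(g.mean()), eps))
    for _ in range(n_iter):
        resid = _convolve(f, psf) - g                    # data misfit
        grad = _convolve(resid, psf_flip) + lam * (np.log(f) + 1.0)
        f = np.maximum(f - step * grad, eps)             # stay positive
    return f
```

The averaging trades a small loss in resolution for improved signal-to-noise ratio on dim point sources, while the entropy term in the deconvolution keeps the restored intensities positive and discourages spurious structure.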