Textron’s Advanced MicroObserver(R) is a next-generation remote unattended ground sensor (UGS) system for border security, infrastructure protection, and small combat unit security. The original MicroObserver(R) is a sophisticated seismic sensor system with multi-node fusion that supports target tracking, and it has been deployed in combat theaters. The system’s seismic sensor nodes are unique in that they can be completely buried (including antennas) for optimal covertness. The advanced version adds a wireless day/night Electro-Optic Infrared (EOIR) system, cued by seismic tracking, with sophisticated target discrimination and automatic frame capture features. Also new is a field-deployable Gateway configurable with a variety of radio systems and flexible networking, an important upgrade that enabled the research described herein. BattleHawk(TM) is a small tube-launched Unmanned Air Vehicle (UAV) with a warhead. Using transmitted video from its EOIR subsystem, an operator can search for and acquire a target day or night, select a target for attack, and execute a terminal dive to destroy the target. It is designed as a lightweight squad-level asset carried by an individual infantryman. Although BattleHawk has the best loiter time in its class, that time is still short compared to large UAVs, and in its munition configuration it is a one-shot asset. Textron Defense Systems therefore conducted internally funded research to determine whether there was military utility in having the highly persistent MicroObserver(R) system cue BattleHawk’s launch and vector it to beyond-visual-range targets for engagement. This paper describes that research: the system configuration implemented and the results of field testing performed on a government range in early 2013. In the integrated system, MicroObserver(R) seismic detections activated the system’s camera, which then automatically captured images of the target.
The geo-referenced and time-tagged MicroObserver(R) target reports and images were then automatically forwarded to the BattleHawk Android-based controller. This allowed the operator to see the intruder (classified and geo-located) on the map-based display, assess the intruder as likely hostile (via the image), and launch BattleHawk with the pre-loaded target coordinates. The operator was thus able to quickly acquire the intended target (without a search) and initiate engagement immediately. System latency was a major concern encountered during the research.
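The cueing flow above can be sketched as a small data structure and handoff function. This is a minimal illustration only: the field names, units, and message layout are assumptions for the sketch, not the actual MicroObserver(R)/BattleHawk report format.

```python
# Hypothetical sketch of a geo-referenced, time-tagged UGS target report
# being turned into pre-loaded launch coordinates. All names are
# illustrative assumptions, not the fielded message format.
from dataclasses import dataclass
import time

@dataclass
class TargetReport:
    """Assumed fields of a UGS cue: position, class, time, image key."""
    lat_deg: float
    lon_deg: float
    classification: str   # e.g. "personnel" or "vehicle"
    timestamp_s: float    # time of the seismic detection (epoch seconds)
    image_id: str         # key of the auto-captured camera frame

def preload_launch_coordinates(report: TargetReport) -> dict:
    """Convert a UGS cue into pre-loaded coordinates for the operator's
    map-based display, noting the cue's age (latency matters)."""
    return {
        "target_lat": report.lat_deg,
        "target_lon": report.lon_deg,
        "label": report.classification,
        "cue_age_s": time.time() - report.timestamp_s,
    }
```

The `cue_age_s` field reflects the latency concern noted above: a stale cue degrades the value of pre-loaded coordinates for a moving target.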
We show design and performance results for an Unattended Ground Sensors (UGS) Automatic Target Recognition
(ATR) target classifier using infrared (IR) imagery. Our goal was to develop a basic ATR capability to separate human
vs. animal vs. vehicle vs. non-target. Our current UGS video capability accurately detects and tracks targets and transmits target-centered
long-wave infrared and visible imagery to a base station. We demonstrate an ATR capability to classify and
transmit only targets of interest to the user while excluding others. We describe the ATR development process, which
includes data collection, construction of a truthed dataset, feature development, classifier training, and performance evaluation.
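The train-and-evaluate loop described above can be sketched with a deliberately simple stand-in classifier. The nearest-centroid method, the four class labels, and the synthetic feature vectors here are illustrative assumptions for the sketch, not the fielded ATR design.

```python
# Sketch of the ATR development loop: train a classifier on a truthed
# dataset, then evaluate it with a confusion matrix. Nearest-centroid is
# an assumed, minimal stand-in for the real classifier.
import numpy as np

CLASSES = ["human", "animal", "vehicle", "non-target"]

def train_centroids(features, labels):
    """Per-class mean feature vector learned from the truthed dataset."""
    return {c: features[labels == i].mean(axis=0)
            for i, c in enumerate(CLASSES)}

def classify(centroids, x):
    """Assign the class whose centroid is nearest in feature space."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

def confusion_matrix(centroids, features, labels):
    """Rows = truth, columns = prediction; the basis of evaluation."""
    cm = np.zeros((len(CLASSES), len(CLASSES)), dtype=int)
    for x, yi in zip(features, labels):
        cm[yi, CLASSES.index(classify(centroids, x))] += 1
    return cm
```

In practice the feature-development step would replace the raw vectors here with image-derived features (shape, thermal contrast, motion statistics), but the train/evaluate structure is the same.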
Helicopters present a serious threat to high security facilities such as prisons, nuclear sites, armories, and VIP
compounds. They have the ability to instantly bypass conventional security measures focused on ground threats such as
fences, check-points, and intrusion sensors. Leveraging the strong acoustic signature inherent in all helicopters, this
system would automatically detect, classify, and accurately track helicopters using multi-node acoustic sensor fusion. An
alert would be generated once the threat entered a predefined three-dimensional security zone, in time for security personnel to
repel the assault. In addition, the system can precisely identify the landing point on the facility grounds.
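The zone-entry alert logic above can be sketched as follows. The cylindrical zone shape, the dimensions, and the function names are illustrative assumptions; the actual system would test the fused acoustic track against whatever zone geometry the site defines.

```python
# Minimal sketch of the 3-D security-zone alert: fire when the fused
# acoustic track first enters a predefined zone. A cylinder (radius +
# altitude ceiling) is an assumed zone shape for illustration.
import math

def in_security_zone(x_m, y_m, alt_m, center_xy=(0.0, 0.0),
                     radius_m=1000.0, ceiling_m=500.0):
    """Cylindrical zone: within horizontal radius and below the ceiling."""
    dx, dy = x_m - center_xy[0], y_m - center_xy[1]
    return math.hypot(dx, dy) <= radius_m and 0.0 <= alt_m <= ceiling_m

def first_alert(track):
    """Index of the first track point inside the zone, or None.
    `track` is a sequence of (x_m, y_m, alt_m) position estimates."""
    for i, (x, y, alt) in enumerate(track):
        if in_security_zone(x, y, alt):
            return i
    return None
```

Because the check runs on every track update, the alert fires on the first fused position estimate inside the zone, giving personnel the maximum available reaction time.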
Over the past decade, technological advances have enabled the use of increasingly intelligent systems for battlefield
surveillance. These systems are triggered by a combination of external devices including acoustic and seismic sensors.
Such products are mainly used to detect vehicles and personnel.
These systems often use infra-red imagery to record environmental information, but Textron Defense Systems' Terrain
Commander is one of a small number of systems which analyze these images for the presence of targets. The Terrain
Commander combines acoustic, infrared, magnetic, seismic, and visible spectrum sensors to detect nearby targets in
military scenarios. When targets are detected by these sensors, the cameras are triggered and images are captured in the
infrared and visible spectrum.
In this paper we discuss a method through which such systems can perform target tracking in order to record and
transmit only the most pertinent surveillance images. This saves bandwidth, which is crucial because these systems often
use communication links with throughputs below 2400 bps. This method is expected to be executable on low-power
processors at frame rates exceeding 10 Hz.
We accomplish this by applying target-activated frame capture algorithms to infrared video data. These
algorithms combine edge detection and motion detection to determine the best frames to transmit to
the end user. This keeps power consumption and bandwidth requirements low. Finally, the results of the algorithm are presented.
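The edge-plus-motion frame selection can be sketched as a per-frame score. The specific measures (mean gradient magnitude for edge content, mean absolute frame difference for motion), the weighting, and the top-k selection are illustrative assumptions, not the fielded algorithm.

```python
# Sketch of target-activated frame capture: score each frame on edge
# content and motion, then transmit only the top-scoring frames. The
# scoring details are assumed for illustration.
import numpy as np

def edge_score(frame):
    """Mean gradient magnitude as a crude edge-content measure."""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.hypot(gx, gy).mean())

def motion_score(frame, prev_frame):
    """Mean absolute frame difference as a crude motion measure."""
    return float(np.abs(frame.astype(float) - prev_frame.astype(float)).mean())

def best_frames(frames, k=2, w_edge=1.0, w_motion=1.0):
    """Rank frames by a weighted edge+motion score; keep only the top k.
    Returns indices into `frames` (the first frame has no predecessor)."""
    scores = [w_edge * edge_score(f) + w_motion * motion_score(f, p)
              for p, f in zip(frames, frames[1:])]
    order = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
    return [i + 1 for i in order[:k]]
```

Both measures are cheap array passes with no per-pixel branching, which is what makes the low-power, real-time frame-rate target plausible.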
This project focuses on developing electro-optic algorithms which rank images by their likelihood of containing
vehicles and people. These algorithms have been applied to images obtained from Textron's Terrain Commander 2
(TC2) Unattended Ground Sensor system.
The TC2 is a multi-sensor surveillance system used in military applications. It combines infrared, acoustic, seismic,
magnetic, and electro-optic sensors to detect nearby targets. When targets are detected by the seismic and acoustic
sensors, the system is triggered and images are taken in the visible and infrared spectrum.
The original Terrain Commander system occasionally captured and transmitted an excessive number of images,
sometimes triggered by undesirable targets such as swaying trees. This wasted communications bandwidth, increased
power consumption, and resulted in a large amount of end-user time being spent evaluating unimportant images. The
algorithms discussed here help alleviate these problems.
These algorithms are currently optimized for infrared images, which give the best visibility in a wide range of
environments, but could be adapted to visible imagery as well. It is important that the algorithms be robust, with minimal
dependency on user input. They should be effective when tracking varying numbers of targets of different sizes and
orientations, despite the low resolutions of the images used. Most importantly, the algorithms must be appropriate for
implementation on a low-power processor in real time. This would enable us to maintain frame rates of 2 Hz for
effective surveillance operations.
Throughout the project we implemented several algorithms and applied a consistent methodology to
quantitatively compare their performance; both are discussed in this paper.
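One simple form such a quantitative comparison can take is precision and recall of "image contains a target" against a truthed image set. The metric choice here is an illustrative assumption; the paper's own methodology may differ.

```python
# Sketch of a quantitative comparison metric for image-ranking
# algorithms: precision/recall against a truthed set of image IDs.
def precision_recall(predicted, truth):
    """`predicted` and `truth` are sets of image IDs flagged as
    containing targets; returns (precision, recall)."""
    tp = len(predicted & truth)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall
```

Running the same truthed set through each candidate algorithm yields directly comparable numbers, which is what allows the algorithms to be ranked rather than judged anecdotally.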
Operational trials of Textron Systems’ Terrain Commander unattended ground sensor (UGS) system are described. Terrain Commander is a powerful new concept in surveillance and remote situational awareness. It leverages a diverse suite of sophisticated unattended ground sensors, day/night electro-optics, satellite data communications, and an advanced Windows-based graphical user interface. Terrain Commander OASIS (Optical Acoustic SATCOM Integrated Sensor) provides next-generation target detection, classification, and tracking through smart sensor fusion of beam-forming acoustic, seismic, passive infrared, and magnetic sensors. With its fully integrated SATCOM system using internet protocols, virtually any site in the world can be monitored from almost any other location. Multiple remote sites such as airfields, landing zones, base perimeters, road junctions, flanks, and border crossings are monitored with ease from a central location. Intruding personnel or vehicles are automatically detected, classified, and imaged. Results from early operational trials in the outback of Australia and in various locations in the US are described. Probabilities of detection and recognition against a wide variety of targets, including personnel, military and civilian vehicles, in-shore watercraft, and low-altitude aircraft, are discussed. Environments include snow cover, tropical savannah, rainforest, and woodlands. Experience with alternative SATCOM systems during the trials is also touched upon.