Machine learning (ML) and artificial intelligence (AI) have increased the potential for automation in defense applications such as border protection, compound security, and surveillance. Advances in low size, weight, and power (SWaP) computing platforms and unmanned aerial systems (UAS) have enabled autonomous systems to meet the critical needs of future defense systems. Recent academic advances in deep-learning-aided computer vision have yielded impressive results in object detection and recognition, capabilities necessary for autonomy in defense applications. These advances, often open-sourced, enable the opportunistic integration of state-of-the-art (SOTA) algorithms. However, these systems require a large amount of object-relevant data to transfer from general academic domains to operationally relevant situations. Additionally, UAS require costly verification and validation of autonomy logic. These challenges can drive high costs in both training-data generation and field autonomy integration and testing. To address these challenges, Elbit America, in conjunction with partners, has developed a multipurpose synthetic simulation environment capable of generating synthetic training data and of prototyping, verifying, and validating autonomous distributed behaviors. We integrated a thermal modeling capability into Unreal Engine to create realistic training data by enabling real-time simulation of SWIR, MWIR, and LWIR sensors. This radiometrically correct sensor model enables simulation-based training-data generation for our object recognition and classification pipeline, Rapid Algorithm Development and Deployment (RADD). Several drones were instantiated with emulated flight controllers to enable end-to-end autonomy training and development before hardware availability. Herein, we provide an overview of the simulation environment and its relevance to detection, classification, and distributed autonomous decision-making.
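To make "radiometrically correct" concrete: the core quantity such a sensor model must reproduce is band-integrated Planck radiance. The sketch below is our own minimal illustration of that calculation, not Elbit's implementation; the band edges and temperatures are arbitrary examples.

    import numpy as np

    H = 6.626e-34    # Planck constant, J*s
    C = 2.998e8      # speed of light, m/s
    KB = 1.381e-23   # Boltzmann constant, J/K

    def planck_radiance(wavelength_m, temp_k):
        """Blackbody spectral radiance, W / (m^2 * sr * m)."""
        a = 2.0 * H * C**2 / wavelength_m**5
        return a / np.expm1(H * C / (wavelength_m * KB * temp_k))

    def band_radiance(temp_k, lo_um=8.0, hi_um=12.0, n=512):
        """Radiance integrated over a sensor band (LWIR, 8-12 um, by default)."""
        wl = np.linspace(lo_um * 1e-6, hi_um * 1e-6, n)
        return float(planck_radiance(wl, temp_k).sum() * (wl[1] - wl[0]))

    # A 300 K background vs. a 310 K vehicle-like target in the LWIR band
    print(band_radiance(300.0), band_radiance(310.0))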
A multi-modal (hyperspectral, multispectral, and LIDAR) imaging data collection campaign was conducted just south of Rochester, New York, in Avon, NY, on September 20, 2012, by the Rochester Institute of Technology (RIT) in conjunction with SpecTIR, LLC, the Air Force Research Lab (AFRL), the Naval Research Lab (NRL), United Technologies Aerospace Systems (UTAS), and MITRE. The campaign was a follow-on to the 2010 SpecTIR Hyperspectral Airborne Rochester Experiment (SHARE). Data were collected in support of the eleven simultaneous experiments described here. The airborne imagery was collected over four sites with hyperspectral, multispectral, and LIDAR sensors: Avon, NY; Conesus Lake; Hemlock Lake and its surrounding forest; and a nearby quarry. Experiments covered topics such as target unmixing, subpixel detection, material identification, the impact of illumination on materials, forest health, and in-water target detection. An extensive ground-truthing effort was conducted in addition to the collection of the airborne imagery. The ultimate goal of the campaign is to provide the remote sensing community with a shareable resource to support future research. This paper details the experiments conducted and the data collected during the campaign.
Commercial multispectral satellite sensors spend much of their time over the oceans. NRL has demonstrated an automatic processing system for finding ships at sea using commercially available multispectral data. To distinguish ships from whitecaps and clouds, a water/cloud clutter subspace is estimated and a continuum-fusion-derived anomaly detection algorithm is applied. This provides a maritime awareness capability with an acceptable detection rate while maintaining a low false alarm rate. The system also provides a confidence metric, which can be used to further limit the false alarm rate.
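The abstract does not give the detector's form, but the clutter-subspace idea can be sketched: estimate the dominant water/cloud modes from background pixels, then score each pixel by its energy outside that subspace. All names below are our own, and this is a generic subspace anomaly detector, not the paper's continuum-fusion-derived algorithm.

    import numpy as np

    def clutter_subspace(background, k):
        """Estimate a k-dimensional water/cloud clutter subspace from
        background pixels, shape (N, bands)."""
        mu = background.mean(axis=0)
        _, _, vt = np.linalg.svd(background - mu, full_matrices=False)
        return mu, vt[:k].T                     # (bands, k) orthonormal basis

    def anomaly_score(x, mu, basis):
        """Pixel energy outside the clutter subspace; large values are
        ship-like rather than whitecap- or cloud-like."""
        r = x - mu
        return float(np.sum((r - basis @ (basis.T @ r)) ** 2))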
The continuum fusion (CF) methodology produces new classes of multivariate detection algorithms, some of which have been used in spectral applications. CF principles apply to model-based problems in which not all parameter values are known, a common circumstance in hyperspectral operations. We review the principal theoretical and applied CF results devised to date, summarize recent experimental results, and discuss in detail an important class of algorithms that illustrates the design freedom CF affords. Finally, we review the fundamental CF principles as applied to a new category of model parameters only recently considered, involving a distinction in the form of a constraint that is not recognized by older methods.
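For readers new to CF, the contrast with the conventional generalized likelihood ratio (GLR) is the essential point. With an unknown parameter $\theta$ under the target hypothesis, the GLR maximizes it out,

\[ \Lambda_{\mathrm{GLR}}(x) = \max_{\theta} \frac{p(x; H_1, \theta)}{p(x; H_0)} \gtrless \eta, \]

whereas CF forms a continuum of clamped tests $\{\Lambda_\theta(x) \gtrless \eta(\theta)\}$, one per candidate value of $\theta$, and fuses their decision regions; different fusion rules over the continuum yield different detectors from the same model. (The notation here is ours, summarizing the standard CF construction.)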
The Naval Research Laboratory has developed and demonstrated an autonomous multi-sensor motion-tracking and interrogation system that reduces the workload for analysts by automatically finding moving objects and then presenting high-resolution images of those objects with little-to-no human input. Intelligence, surveillance, and reconnaissance (ISR) assets in the field generate vast amounts of data that can overwhelm human operators and severely limit an analyst's ability to generate intelligence reports in operationally relevant timeframes. This multi-user tracking capability enables the system to manage the collection of imagery without continuous monitoring by a ground or airborne operator, thus requiring fewer personnel and freeing up operational assets. During flight tests in March 2011, multiple real-time moving-target-indicator (MTI) tracks generated by a wide-area persistent surveillance sensor (WAPSS) were autonomously cross-cued to a high-resolution, narrow field-of-view interrogation sensor via an airborne network. Both sensors were networked by the high-speed Tactical Reachback Extended Communications (TREC) data link provided by the NRL Information Technology Division.
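As a rough illustration of what "cross-cued" involves geometrically, the sketch below converts a track's ground position into azimuth/elevation pointing commands for an interrogation sensor. The shared local East-North-Up frame and all names are our own assumptions, not details of the WAPSS/TREC system.

    import numpy as np

    def cross_cue(track_enu, sensor_enu):
        """Pointing solution from a sensor position to a tracked target,
        both given in a shared local East-North-Up frame (meters)."""
        d = np.asarray(track_enu, float) - np.asarray(sensor_enu, float)
        rng = float(np.linalg.norm(d))
        az = float(np.degrees(np.arctan2(d[0], d[1])) % 360.0)  # clockwise from north
        el = float(np.degrees(np.arcsin(d[2] / rng)))           # negative when looking down
        return az, el, rng

    # A ground target 1.3 km north-east of a sensor flying at 3 km altitude
    print(cross_cue([500.0, 1200.0, 0.0], [0.0, 0.0, 3000.0]))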
The availability of imagery simultaneously collected from sensors of disparate modalities enhances an image analyst's situational awareness and expands the overall detection capability to a larger array of target classes. Dynamic cooperation between sensors is increasingly important for the collection of coincident data from multiple sensors, either on the same platform or on different platforms suitable for UAV deployment. Of particular interest is autonomous collaboration between wide-area survey detection, high-resolution inspection, and RF sensors that span large segments of the electromagnetic spectrum. The Naval Research Laboratory (NRL), in conjunction with the Space Dynamics Laboratory (SDL), is building sensors with such networked communications capability and is conducting field tests to demonstrate the feasibility of collaborative sensor data collection and exploitation. Example survey/detection sensors include: NuSAR (NRL Unmanned SAR), a UAV-compatible synthetic aperture radar system; microHSI, an NRL-developed lightweight hyperspectral imager; RASAR (Real-time Autonomous SAR), a lightweight podded synthetic aperture radar; and N-WAPSS-16 (Nighttime Wide-Area Persistent Surveillance Sensor-16Mpix), a MWIR large-array gimbaled system. From these sensors, detected target cues are automatically sent to the NRL/SDL-developed EyePod, a high-resolution, narrow-FOV EO/IR sensor, for target inspection. In addition to this cooperative data collection, EyePod's real-time, autonomous target tracking capabilities will be demonstrated. Preliminary results and target analysis will be presented.
The potential of a new class of detection algorithms is demonstrated on an object of practical interest. The continuum fusion (CF) [1] methodology is applied to a linear subspace model. A new algorithm results from first invoking a fusion interpretation of a conventional GLR test and then modifying it with CF methods. Performance is enhanced in two ways. First, the Gaussian clutter model is replaced by a Laplacian distribution, which is not only more realistic in its tail behavior but, when used in a hypothesis test, also creates decision surfaces more selective than the hyperplanes associated with linear matched filters. Second, a fusion flavor is devised that generalizes the adaptive coherence estimator (ACE) [2, 3] algorithm but has more design flexibility. An IDL/ENVI user interface has been developed and will be described.
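The tail-behavior argument can be made concrete. After whitening with the clutter statistics, a Gaussian log-likelihood penalizes the squared norm of the residual (which is what makes matched-filter decision surfaces hyperplanes), while a Laplacian-style model penalizes the norm itself, giving heavier tails and curved decision surfaces. The sketch below illustrates only that contrast; it is not the paper's CF algorithm.

    import numpy as np

    def whiten(x, mu, cov):
        """Whiten a pixel with the background mean and covariance."""
        return np.linalg.solve(np.linalg.cholesky(cov), x - mu)

    def gaussian_neg_loglik(z):
        """Gaussian clutter: quadratic penalty on the whitened residual."""
        return 0.5 * float(z @ z)

    def laplacian_neg_loglik(z):
        """Laplacian-style clutter: linear penalty, hence heavier tails."""
        return float(np.linalg.norm(z))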
Recent successes in detecting specific materials with hyperspectral sensing systems belie the challenging problems that loom. Easy targets, spectrally distinct from clutter, challenge neither detection algorithms nor the methods used to translate laboratory signatures into field spectra. The full promise of wide-area autonomous detection with hyperspectral systems will not be met using rudimentary algorithms such as the linear matched filter or the ACE algorithm. Nor will signature translation methods that produce a single radiance estimate suffice. This paper suggests a new methodology for addressing future challenges, along with signature characterization protocols that would enable advanced detection capabilities.
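For reference, the two detectors dismissed above as rudimentary, in standard notation with background mean $\mu$, covariance $\Sigma$, target signature $s$, and pixel $x$:

\[ \mathrm{MF}(x) = \frac{s^{\top}\Sigma^{-1}(x-\mu)}{\sqrt{s^{\top}\Sigma^{-1}s}}, \qquad \mathrm{ACE}(x) = \frac{\left(s^{\top}\Sigma^{-1}(x-\mu)\right)^{2}}{\left(s^{\top}\Sigma^{-1}s\right)\left((x-\mu)^{\top}\Sigma^{-1}(x-\mu)\right)}. \]

Both commit to a single signature vector $s$, which is the limitation targeted by the paper's critique of single-radiance-estimate translation.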
The most widespread methods of anomaly detection in hyperspectral imagery (HSI) are the RX algorithm and its variants (e.g., Subspace RX). RX is optimal for any unimodal elliptically contoured distribution (ECD), and in certain data sets it misinterprets any deviations from this model as true anomalies. Singleton outliers are by definition anomalous, but other RX detections can arise from less severe departures from the ECD, in the form of spectral "prominences." We describe a method that mitigates such persistent false alarms by augmenting RX in a recursive process with truncated versions of the Adaptive Cosine Estimator (ACE). ACE is applied to RX exceedances that arise from prominences, bulges appearing in the whitened clutter distribution that indicate anisotropy. The ACE-augmented RX decision surface resembles a sea urchin.
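A minimal sketch of the two standard building blocks named above (the paper's recursive truncation logic is not reproduced here); in the augmentation scheme, as we read the abstract, s would be the direction of a prominence found among the RX exceedances.

    import numpy as np

    def fit_background(pixels):
        """Background mean and inverse covariance from (N, bands) samples."""
        mu = pixels.mean(axis=0)
        return mu, np.linalg.inv(np.cov(pixels, rowvar=False))

    def rx(x, mu, icov):
        """RX anomaly score: squared Mahalanobis distance to the background."""
        d = x - mu
        return float(d @ icov @ d)

    def ace(x, s, mu, icov):
        """ACE: squared cosine of the angle between pixel and direction s
        in whitened space; near 1 when the pixel lies along a prominence."""
        d = x - mu
        return float((s @ icov @ d) ** 2 / ((s @ icov @ s) * (d @ icov @ d)))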
KEYWORDS: Signal to noise ratio, Point spread functions, Sensors, Remote sensing, RGB color model, Phase retrieval, Space telescopes, Telescopes, Optical transfer functions, Mirrors
Sparse-aperture (SA) telescopes are a technology of interest in the field of remote sensing. Significant optical resolution can be achieved by an array of sub-apertures, mitigating the size and weight limitations of full-aperture space-deployed sensors. Much of the analysis to date has been done under the assumption that an extended scene is spectrally flat and each pixel has the same spectrum (the gray-world assumption). Previous work has found that the gray-world assumption is not valid when imaging a spectrally diverse scene and/or when the optical configuration is heavily aberrated. Broadband phase diversity (BPD) is an image-based method for estimating the aberrations of a system; it also assumes a gray world. Digital simulations are presented that quantify the limitations of BPD with respect to the spectral diversity of the extended scene, the RMS of the optical path difference (OPD), system noise, and sensor bandwidth.
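As context for the simulations, the monochromatic building block of phase diversity is the mapping from an OPD map to a PSF; the broadband image model sums such PSFs over the band with spectral weights, and the gray-world assumption takes those weights to be the same for every pixel. A minimal sketch, where the grid size, defocus model, and all names are our own choices:

    import numpy as np

    def psf_from_opd(opd, pupil_mask, wavelength):
        """Incoherent PSF of a pupil with OPD map `opd` (same units as
        `wavelength`), via the Fraunhofer propagation model."""
        pupil = pupil_mask * np.exp(2j * np.pi * opd / wavelength)
        psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
        return psf / psf.sum()

    def defocus_opd(n, peak):
        """Quadratic defocus map with the given peak OPD, for the known
        diversity channel of a phase-diversity pair."""
        y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
        return peak * (x ** 2 + y ** 2)

    # Focused and diversity PSFs for a circular pupil on a 64x64 grid
    n, wl = 64, 0.5e-6
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    mask = (x ** 2 + y ** 2 <= 1.0).astype(float)
    p_focus = psf_from_opd(np.zeros((n, n)), mask, wl)
    p_divers = psf_from_opd(defocus_opd(n, wl / 4), mask, wl)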