Convolutional neural networks (CNNs) are state-of-the-art techniques for image classification; however, CNNs require extensive training data to achieve high accuracy. This demand presents a challenge because the available measured synthetic aperture radar (SAR) data is typically limited to just a few examples and does not account for articulations, clutter, and other target or scene variability. Therefore, this research aimed to assess the feasibility of combining synthetic and measured SAR images to produce a classification network that is robust to operating conditions not present in measured data and that may adapt to new targets without necessarily training on measured SAR images. A network adapted from the CIFAR-10 LeNet architecture in the MATLAB Convolutional Neural Network toolbox (MatConvNet) was first trained on a database of multiple synthetic Moving and Stationary Target Acquisition and Recognition (MSTAR) targets. Once the network classified with near-perfect accuracy, the synthetic data were replaced with corresponding measured data. Only the first layer of filters was permitted to change, creating a translation layer between synthetic and measured data. The low error rate of this experiment demonstrates that diverse clutter and target types not represented in measured training data may be introduced in synthetic training data and later recognized in measured test data.
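The transfer strategy above, updating only the first layer while the deeper weights stay fixed, can be sketched independently of any deep learning framework. The toy network, parameter names, and learning rate below are hypothetical; the sketch only illustrates gating updates with a per-parameter trainable flag, and is not the authors' MatConvNet code.

```python
# Toy illustration (hypothetical names and values): a gradient step that
# updates only layers flagged as trainable, so the first layer can adapt
# while the deeper, frozen layers are left unchanged.

def sgd_step(params, grads, trainable, lr=0.1):
    """Apply one SGD update, skipping parameters whose flag is False."""
    return {
        name: (value - lr * grads[name]) if trainable[name] else value
        for name, value in params.items()
    }

# Two-layer stand-in: only the first layer is allowed to change.
params    = {"layer1.w": 1.0, "layer2.w": 2.0}
grads     = {"layer1.w": 0.5, "layer2.w": 0.5}
trainable = {"layer1.w": True, "layer2.w": False}

updated = sgd_step(params, grads, trainable)
# layer1.w is updated by the gradient; layer2.w is unchanged.
```

The same gating idea applies to any optimizer; in practice it corresponds to zeroing (or never computing) the learning rate for the frozen layers.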
Traditional synthetic aperture radar (SAR) systems tend to discard phase information of formed complex radar imagery
prior to automatic target recognition (ATR). This practice has historically been driven by available hardware storage,
processing capabilities, and data link capacity. Recent advances in high performance computing (HPC) have enabled
extremely dense storage and processing solutions; consequently, the historical motivations for discarding radar phase information in ATR applications have largely been eliminated. First, we characterize the value of phase in one-dimensional (1-D) radar range
profiles with respect to the ability to correctly estimate target features, which are currently employed in ATR algorithms
for target discrimination. These features correspond to physical characteristics of targets through radio frequency (RF)
scattering phenomenology. Physics-based electromagnetic scattering models developed from the geometrical theory of
diffraction are utilized for the information analysis presented here. Information is quantified by the error of target parameter
estimates from noisy radar signals when phase is either retained or discarded. Operating conditions (OCs) of signal-to-noise ratio (SNR) and bandwidth are considered. Second, we investigate the value of phase in 1-D radar returns with
respect to the ability to correctly classify canonical targets. Classification performance is evaluated via logistic regression
for three targets (sphere, plate, tophat). Phase information is demonstrated to improve radar target classification rates,
particularly at low SNRs and low bandwidths.
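One simple way to see the value of phase in 1-D radar returns: for stepped-frequency samples of a single point scatterer, range is encoded in the phase slope across frequency, while the magnitudes are constant and carry no range information. A minimal numeric sketch follows; the frequencies and range are hypothetical, and this is not the paper's GTD-based scattering model.

```python
import cmath, math

c = 3e8                     # propagation speed (m/s)
r_true = 12.0               # hypothetical one-way target range (m)
freqs = [1e9 + k * 2e6 for k in range(64)]   # stepped-frequency samples (Hz)

# Complex returns from a unit point scatterer: unit magnitude, with the
# range carried entirely by the phase.
returns = [cmath.exp(-4j * math.pi * f * r_true / c) for f in freqs]

# Phase retained: range follows from the phase slope between adjacent
# frequency steps (a simple estimator; the paper uses model-based fits).
dphi = cmath.phase(returns[1] / returns[0])
df = freqs[1] - freqs[0]
r_est = -dphi * c / (4 * math.pi * df)

# Phase discarded: the magnitudes are identical across frequency, so they
# alone cannot localize the target.
mags = [abs(s) for s in returns]
```

With noise added to the returns, the same contrast appears as an accuracy gap rather than an all-or-nothing one, which is the regime the abstract quantifies.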
Proc. SPIE. 8746, Algorithms for Synthetic Aperture Radar Imagery XX
KEYWORDS: MATLAB, Detection and tracking algorithms, Sensors, Synthetic aperture radar, Image processing, Digital filtering, Digital imaging, Signal processing, Data centers, Filtering (signal processing)
Legacy synthetic aperture radar (SAR) exploitation algorithms were image-based algorithms, designed to exploit
complex and/or detected SAR imagery. In order to improve the efficiency of the algorithms, image chips, or region
of interest (ROI) chips, containing candidate targets were extracted. These image chips were then used directly by
exploitation algorithms for the purposes of target discrimination or identification. Recent exploitation research
has suggested that performance can be improved by processing the underlying phase history data instead of
standard SAR imagery. Digital Spotlighting takes the phase history data of a large image and extracts the phase
history data corresponding to a smaller spatial subset of the image. In a typical scenario, this spotlighted phase
history data will contain far fewer samples than the original data but will still result in an alias-free image of
the ROI. The Digital Spotlight algorithm can be considered the first stage in a “two-stage backprojection” image
formation process. As the first stage in two-stage backprojection, Digital Spotlighting filters the original phase
history data into a number of “pseudo”-phase histories that segment the scene into patches, each of which contains
a reduced number of samples compared to the original data. The second stage of the imaging process consists
of standard backprojection. The data rate reduction offered by Digital Spotlighting improves the computational
efficiency of the overall imaging process by significantly reducing the total number of backprojection operations.
This paper describes the Digital Spotlight algorithm in detail and provides an implementation in MATLAB.
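The computational saving follows from the operation count of backprojection, which scales as the number of phase-history samples times the number of image pixels. A back-of-the-envelope sketch with illustrative (not paper-specific) sizes, ignoring the cost of the spotlighting filters themselves:

```python
# Back-of-the-envelope operation counts for direct vs. two-stage
# backprojection. All sizes are hypothetical, chosen only to show scaling.

n_samples = 4096          # phase-history samples per pulse (full scene)
n_pixels  = 2048 * 2048   # full-scene image size in pixels

# Direct backprojection: one interpolate/accumulate per (sample, pixel).
direct_ops = n_samples * n_pixels

# Digital Spotlighting: split the scene into 64 patches, with each patch's
# pseudo-phase history decimated to 1/64 of the original samples.
n_patches         = 64
samples_per_patch = n_samples // n_patches
pixels_per_patch  = n_pixels // n_patches

staged_ops = n_patches * samples_per_patch * pixels_per_patch
speedup    = direct_ops / staged_ops   # equals n_patches in this sketch
```

The speedup equals the patch count here because each patch trades a proportional reduction in both samples and pixels; real savings depend on how far the spotlighted data can be decimated while remaining alias-free.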
Three-dimensional (3-D) spotlight-mode synthetic aperture radar (SAR) images of point scatterers provide insight
into the achievable effectiveness of exploitation algorithms given a variety of operating parameters such
as elevation angle, azimuth or synthetic aperture extent, and frequency bandwidth. Circular SAR, using 360
degrees of azimuth, offers the benefit of persistent surveillance and the potential for 3-D image reconstruction
improvement compared with limited aperture SAR due in part to the increase in favorable viewing angles of
unknown objects. The response of a point scatterer at the origin, or center of the imaging scene, is known and has
been quantified for circular SAR in prior literature by a closed-form solution. The behavior of a point scatterer
radially displaced from the origin has been previously characterized for circular SAR through implementation of
backprojection image reconstructions. Here, we derive a closed-form expression for the response of an arbitrarily
located point scatterer given a circular flight path. In addition, the behavior of the response of an off-center point
target is compared to that of a point scatterer at the origin. Symmetries within the 3-D point spread functions
(PSFs), or impulse response functions (IPRs), are also noted to provide knowledge of the minimum subset of
SAR images required to fully characterize the response of a particular point scatterer. Understanding of simple
scattering behavior can provide insight into the response of more complex targets, given that complicated targets
may sometimes be modeled as an arrangement of geometrically simple scattering objects.
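The symmetry noted above can be checked numerically: for a full 360-degree circular aperture, the response of a point scatterer at the origin depends only on the radial distance of the evaluation point, not its azimuth. A small sketch under simplifying assumptions (single frequency, scalar far-field model, hypothetical parameter values):

```python
import cmath, math

def circ_sar_response(x, y, k=20.0, elev=math.radians(30), n=720):
    """Far-field response at ground offset (x, y) for a point scatterer at
    the origin, integrated over a full 360-degree circular aperture.
    Single-frequency scalar model; all values are illustrative."""
    total = 0.0 + 0.0j
    for i in range(n):
        phi = 2 * math.pi * i / n
        # Projection of the pixel offset onto the radar line of sight.
        proj = math.cos(elev) * (x * math.cos(phi) + y * math.sin(phi))
        total += cmath.exp(2j * k * proj)
    return total / n

# Rotational symmetry: the response depends only on the radial distance
# from the scene center, not on the direction of the offset.
r_x = circ_sar_response(0.3, 0.0)
r_y = circ_sar_response(0.0, 0.3)
r_d = circ_sar_response(0.3 / math.sqrt(2), 0.3 / math.sqrt(2))
```

Exploiting such symmetries is exactly what reduces the subset of SAR images needed to characterize a given scatterer's response.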
This document describes a challenge problem whose scope is two-fold. The first aspect is to develop SAR CCD
algorithms that are applicable for X-band SAR imagery collected in an urban environment. The second aspect relates to
effective data compression of these complex SAR images, where SAR CCD quality is the metric of performance.
A set of X-band SAR imagery is being provided to support this development. To focus research on specific areas of
interest to AFRL, a number of challenge problems are defined.
The data provided is complex SAR imagery from an AFRL airborne X-band SAR sensor. Some key features of this data
set are: 10 repeat passes, single phase center, and single polarization (HH). In the scene observed, there are multiple
buildings, vehicles, and trees. Note that the imagery has been coherently aligned to a single reference.
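SAR CCD is commonly built on the sample coherence between two coregistered complex images, where a drop in coherence flags change. A minimal sketch of that standard estimator follows; the pixel values are made up, and this is not code from the challenge problem.

```python
def sample_coherence(f, g):
    """Magnitude of the sample coherence between two coregistered complex
    image chips f and g (given here as flat lists of complex pixels)."""
    num = sum(a * b.conjugate() for a, b in zip(f, g))
    den = (sum(abs(a) ** 2 for a in f) * sum(abs(b) ** 2 for b in g)) ** 0.5
    return abs(num) / den

# Hypothetical pixel values: an unchanged chip stays fully coherent with
# the reference, while a changed chip shows a coherence loss.
ref     = [1 + 1j, 2 - 1j, -1 + 0.5j, 0.5 + 2j]
same    = list(ref)
changed = [2 - 1j, -1 - 1j, 1 + 2j, -0.5 - 0.5j]

c_same    = sample_coherence(ref, same)
c_changed = sample_coherence(ref, changed)
```

Because lossy compression perturbs the complex pixel values, this coherence statistic is a natural end-to-end metric for judging how much compression the CCD product can tolerate.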
While many synthetic aperture radar (SAR) image formation techniques exist, two of the most intuitive methods
for implementation by SAR novices are the matched filter and backprojection algorithms. The matched filter and
(non-optimized) backprojection algorithms are admittedly computationally expensive. However, the backprojection algorithm may be successfully employed for many SAR research endeavors that neither involve especially large data sets nor require time-critical image formation. Execution of both image reconstruction algorithms
in MATLAB is explicitly addressed. In particular, a manipulation of the backprojection imaging equations is
supplied to show how common MATLAB functions, ifft and interp1, may be used for straightforward SAR
image formation. In addition, limits for scene size and pixel spacing are derived to aid in the selection of an
appropriate imaging grid to avoid aliasing. Example SAR images generated through use of the backprojection
algorithm are provided given four publicly available SAR datasets. Finally, MATLAB code for SAR image
reconstruction using the matched filter and backprojection algorithms is provided.
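The matched-filter reconstruction can be sketched in a few lines: simulate the phase history of one point target, then correlate the data at each candidate pixel against the expected response of a scatterer there. The geometry, frequencies, and grid below are illustrative, and plain Python stands in for the MATLAB implementation described in the paper.

```python
import cmath, math

# Tiny point-target simulation plus matched-filter image formation.
# All parameters (geometry, frequencies, grid) are illustrative only.
c = 3e8
f0, bw, nf = 1e9, 300e6, 32
freqs = [f0 + k * bw / nf for k in range(nf)]

radius = 1000.0                                    # circular flight path (m)
pulses = [2 * math.pi * i / 128 for i in range(128)]
target = (2.0, -1.0)                               # true target position (m)

def ranges(px, py):
    """Range from each pulse position to ground point (px, py)."""
    return [math.hypot(radius * math.cos(a) - px,
                       radius * math.sin(a) - py) for a in pulses]

# Simulated phase history: one complex sample per (pulse, frequency).
history = [[cmath.exp(-4j * math.pi * f * r / c) for f in freqs]
           for r in ranges(*target)]

def image(px, py):
    """Matched-filter image magnitude at pixel (px, py): correlate the
    data against the expected response of a scatterer at that pixel."""
    val = 0.0 + 0.0j
    for samples, r in zip(history, ranges(px, py)):
        for f, s in zip(freqs, samples):
            val += s * cmath.exp(4j * math.pi * f * r / c)
    return abs(val)

# The image peaks at the true target location.
grid = [(x, y) for x in range(-3, 4) for y in range(-3, 4)]
peak = max(grid, key=lambda p: image(*p))
```

Backprojection reorganizes this same computation: the inner frequency sum becomes a precomputed range profile (via ifft) that is then interpolated (via interp1) at each pixel's range, which is where the efficiency gain comes from.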
Radar resolution in three dimensions is considered for circular synthetic apertures at a constant elevation angle.
A closed-form expression is derived for the far-field 3-D point spread function for a circular aperture of 360 degrees
azimuth and is used to revisit the traditional measures of resolution along the x, y and z spatial axes. However,
the limited angular persistence of reflectors encountered in practice renders the traditional measures inadequate
for circular synthetic aperture radar imaging. Two alternative measures for 3-D resolution are presented: a
nonparametric measure based on level sets of a reflector's signature and a statistical measure using the Cramér-Rao lower bound on location estimation error. Both proposed measures provide a quantitative evaluation of
3-D resolution as a function of scattering persistence and radar system parameters. The analysis shows that
3-D localization of a reflector requires a combination of large radar cross section and large angular persistence.
In addition, multiple elevations or a priori target scattering models, if available, may be used to significantly
enhance 3-D resolution.
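The level-set idea generalizes the usual mainlobe-width notion of resolution: take the extent of the region where the response exceeds a chosen fraction of its peak. A one-dimensional sketch of that measure follows; the sinc-squared signature and thresholds are illustrative, not the paper's 3-D formulation.

```python
import math

def response(x):
    """Illustrative 1-D reflector signature: a sinc-squared mainlobe."""
    if x == 0.0:
        return 1.0
    t = math.pi * x
    return (math.sin(t) / t) ** 2

def level_set_width(level, span=3.0, n=6001):
    """Approximate total length of the region where the response exceeds
    `level`, measured on a dense grid over [-span, span]."""
    step = 2 * span / (n - 1)
    return sum(step for i in range(n) if response(-span + i * step) > level)

w_half  = level_set_width(0.5)   # conventional -3 dB (half-power) width
w_tenth = level_set_width(0.1)   # a looser level set is necessarily wider
```

In three dimensions the same construction yields a volume rather than a length, and its dependence on angular persistence and radar cross section is what the proposed measure quantifies.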