The detection and characterization of dismount activity are of increasing interest, particularly using radar to allow for
day/night operation at long range. Current RF dismount sensing concepts employ either short coherent intervals with
fine range resolution or long coherent intervals with fine Doppler resolution. We propose the use of both fine range
resolution and long coherent intervals to achieve fine Doppler resolution. When dismounts are moving, this introduces
the added complication of micro-range/Doppler signature drift through range-Doppler resolution cells. In this paper, we
describe potential methods for focusing the signatures of moving dismounts, and then analyze the focused signature for
potential features that might lead to the automatic classification of the dismounts into several categories.
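The drift problem can be illustrated with a minimal slow-time sketch (all parameters here are hypothetical X-band values, not taken from the paper): an accelerating dismount smears its return across Doppler cells over a long coherent interval, and compensating the bulk motion before the long FFT re-concentrates the energy.

```python
import numpy as np

# Hypothetical parameters (not from the paper): X-band, 1 s coherent interval.
fc = 10e9                 # carrier frequency (Hz)
c = 3e8                   # speed of light (m/s)
prf = 1000.0              # pulse repetition frequency (Hz)
n = 1000                  # pulses in the long coherent interval
t = np.arange(n) / prf    # slow time (s)

# Dismount bulk motion: constant speed plus a small acceleration, which makes
# the Doppler return drift through resolution cells over the long interval.
v, a = 1.5, 0.5                                   # m/s, m/s^2 (assumed)
r = v * t + 0.5 * a * t**2                        # range history (m)
sig = np.exp(-4j * np.pi * fc / c * r)            # slow-time phase history

# Unfocused long-interval spectrum: the drifting Doppler smears the energy.
unfocused = np.abs(np.fft.fft(sig))

# Focusing: compensate the (here, known) acceleration term before the long
# FFT; in practice this would come from a search over candidate motions.
comp = sig * np.exp(4j * np.pi * fc / c * 0.5 * a * t**2)
focused = np.abs(np.fft.fft(comp))

print(unfocused.max(), focused.max())
```

After compensation the energy collapses back into a single Doppler cell, which is the behavior a focusing method must achieve before signature features can be extracted.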
This paper presents a distributed multi-modality sensor network concept for vehicle classification within the perimeter of a
surveillance system. This perimeter surveillance concept represents a "Virtual RF Fence" consisting of remotely
located electro-optic surveillance cameras and a standoff-range radar system. The perimeter surveillance system
vigilantly monitors the field, and each time a vehicle crosses the virtual RF fence it cues the surveillance cameras to
actively monitor the vehicle as it passes through the field. This paper describes the methodologies applied
for processing the EO imagery data, including segmentation of the target vehicle from the background, vehicle shadow
elimination, vehicle feature vector generation, and a neural network approach to vehicle classification. A metric is also
proposed for evaluating the performance of the vehicle classification technique.
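The classification stage can be sketched as follows; the feature values, network size, and training setup are illustrative assumptions on our part, not the paper's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vectors (length, width, area in normalized units);
# real features would come from the segmented, shadow-free vehicle blob.
n_per = 50
cars   = rng.normal([0.3, 0.2, 0.06], 0.02, size=(n_per, 3))
trucks = rng.normal([0.7, 0.3, 0.21], 0.02, size=(n_per, 3))
X = np.vstack([cars, trucks])
y = np.concatenate([np.zeros(n_per), np.ones(n_per)])

# Minimal one-hidden-layer network trained with gradient descent
# (a stand-in for the paper's neural-network classifier).
W1 = rng.normal(0, 0.5, (3, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, 4);      b2 = 0.0
lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                    # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output
    g = (p - y) / len(y)                        # dLoss/dlogit (cross-entropy)
    W2 -= lr * h.T @ g; b2 -= lr * g.sum()
    gh = np.outer(g, W2) * (1 - h**2)           # backprop through tanh
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(axis=0)

accuracy = np.mean((p > 0.5) == (y == 1))
print(accuracy)
```

On cleanly separated synthetic features such as these, even a small network converges quickly; the evaluation metric proposed in the paper would be applied to the resulting predictions.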
This paper outlines a concept for exploiting UAV (Unmanned Aerial Vehicle) trajectories for detecting slowly moving
targets. All the analysis and simulation results are reported under the assumption of a circular UAV trajectory with
various degrees of localized perturbations in the neighborhood of a given circular trajectory. These trajectory
perturbations are introduced and investigated in order to develop intelligent processing algorithms for purposes of
detecting slowly moving targets. The basic concept is to collect sub-apertures of data over a given set of
localized trajectories and to intelligently parse the collected data based on time-varying angle estimates between the
localized UAV trajectory and subsets of a collection of moving point targets. The parsed data are then intelligently combined
over large SAR integration sub-intervals and intervals to develop a novel approach to detecting moving targets with large
variations in speed and target trajectory. Simulation results are reported for three different trajectory perturbation cases.
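A minimal sketch of the geometry (all parameters hypothetical): a circular orbit with a localized radial perturbation, and the time-varying look angle from the UAV to a slowly moving target evaluated per sub-aperture.

```python
import numpy as np

# Hypothetical geometry: 5 km circular orbit with a localized radial
# perturbation (Gaussian bump around t = 30 s), sampled for 60 s.
t = np.arange(600) * 0.1                       # slow time (s)
omega = 2 * np.pi / 120.0                      # one orbit per 120 s
radius = 5000.0 + 50.0 * np.exp(-((t - 30.0) / 5.0) ** 2)
ux, uy = radius * np.cos(omega * t), radius * np.sin(omega * t)

# One slowly moving ground target (2 m/s along x, assumed).
tx, ty = 100.0 + 2.0 * t, np.zeros_like(t)

# Parse the collection into sub-apertures and estimate the time-varying
# look angle from the UAV to the target in each one.
n_sub = 6
mean_sub = lambda z: z.reshape(n_sub, -1).mean(axis=1)
look = np.degrees(np.arctan2(mean_sub(ty) - mean_sub(uy),
                             mean_sub(tx) - mean_sub(ux)))
print(look)
```

The per-sub-aperture angle estimates are the quantities on which the parsing and recombination logic would operate; how they are combined over the larger integration intervals is the subject of the paper.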
It has recently become apparent that dismount tracking from non-EO based sources will have a
large positive impact on urban operations. EO/camera imaging is subject to line-of-sight and
weather conditions, which makes it a non-robust source for dismount tracking. Other sensors
exist (e.g. radar) to track dismount targets; however, little radar dismount data exists. This paper
examines the capability to generate synthetic and measured dismount data sets for radio
frequency (RF) processing. For synthetic data, we used the Poser™ program to generate 500
facet models of a walking human dismount. Then we used these facet models with Xpatch to
generate synthetic wideband radar data. For measured dismount data, we used a multimode (X-Band
and Ku-Band) radar system to collect RF data of volunteer human (dismount) targets.
We propose a novel approach to focus and geolocate moving targets in synthetic aperture radar imagery. The initial step is to detect the position of the target using an automatic target detection algorithm. The next step is to estimate the target cross-range velocity using sequential sub-apertures; this is done by forming low-resolution images and estimating position as a function of sub-aperture, thus yielding an estimate of the cross-range velocity. This cross-range estimate is then used to bound the search range for a bank of focusing filters. Determining the velocity that yields the best-focused target defines one equation for the target velocity; however, both components of the target's velocity cannot be determined from a single equation. Therefore, a second image with a slightly different heading is needed to yield a second focusing velocity, giving a system of two equations in two unknowns from which a solution can be obtained. Once the target velocity is known, the proper position can be determined from the range velocity. Synthetic data will be used with a point-source target, with both background clutter and noise added. The results support the development of staring radar applications with much larger synthetic aperture integration times than existing SAR modes. The basic idea of this approach is to trade off the development of expensive phased-array technology for GMTI applications against the potential development of advanced processing methods that show promise for processing data over very large aperture integration intervals, obtaining similar GMTI geolocation results compatible with current radar technology.
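The two-heading solve can be illustrated as follows; the working assumption (consistent with the description above, though the exact measurement model is ours) is that the best-focusing velocity from each pass constrains one linear combination of the two velocity components.

```python
import numpy as np

# Ground-truth target velocity (m/s) -- used only to synthesize the two
# "measured" focusing velocities; a real system would not know it.
v_true = np.array([3.0, 4.0])

# Two passes with slightly different headings (assumed angles). Each pass
# constrains the component of (vx, vy) along a heading-dependent direction.
thetas = np.radians([0.0, 10.0])
A = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)  # one row per heading
m = A @ v_true                        # the two measured focusing velocities

v_est = np.linalg.solve(A, m)         # two equations, two unknowns
print(v_est)
```

Note that as the heading difference shrinks, the matrix A becomes ill-conditioned, which is why the second pass must have a genuinely different heading for the velocity components to separate cleanly.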
This paper extends simulation and target detection results from an investigation entitled "Self-Training Algorithms for Ultra-wideband SAR Target Detection" that was conducted last year and presented at the 2003 SPIE Aerosense Conference on "Algorithms for Synthetic Aperture Radar Imagery." Under this approach, simulated SAR impulse clutter data was generated by modulating a tophat model for the SAR video phase history with K-distributed data models. Targets were synthesized and "instanced" within the SAR image via the application of a dihedral model to represent broadside targets. For this paper, these models are extended and generalized by developing a set of models that approximate major scattering mechanisms due to terrain relief and approximate major scattering mechanisms due to scattering from off-angle targets. Off-angle targets are difficult to detect at typical ultra-wideband radar frequencies and are denoted as "diffuse scatterers." Potential approaches for detecting synthetic off-angle targets that demonstrate this type of "diffuse scattering" are developed and described in the algorithms and results section of the paper. A preliminary set of analysis outputs are presented with synthetic data from the resulting simulation testbed.
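A minimal sketch of K-distributed clutter generation via the standard compound (texture × speckle) representation; the shape parameter is an illustrative assumption, and no claim is made that this matches the paper's tophat modulation details.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
nu = 1.5   # hypothetical shape parameter; small nu gives spikier clutter

# Compound representation of K-distributed clutter: Gamma-distributed texture
# (unit mean) modulating complex Gaussian speckle.
texture = rng.gamma(shape=nu, scale=1.0 / nu, size=n)
speckle = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
amp = np.abs(np.sqrt(texture) * speckle)

# Pure Rayleigh reference (texture fixed at its mean)
rayleigh = np.abs((rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2))

# Spiky K-distributed clutter exceeds a high threshold far more often than
# Rayleigh clutter of the same mean power -- the source of the false alarms
# that the detection algorithms must contend with.
thr = 3.0
p_k = np.mean(amp > thr)
p_r = np.mean(rayleigh > thr)
print(p_k, p_r)
```

The heavy tail of the K model is what produces the "spiky" clutter character described above.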
An ultra-wideband (UWB) synthetic aperture radar (SAR) simulation technique that employs physical and statistical models is developed and presented. This joint physics/statistics based technique generates images that have many of the "blob-like" and "spiky" clutter characteristics of UWB radar data in forested regions while avoiding the intensive computations required for the implementation of low-frequency numerical electromagnetic simulation techniques.
Approaches towards developing "self-training" algorithms for UWB radar target detection are investigated using the results of this simulation process. These adaptive approaches employ some form of modified singular value decomposition (SVD) algorithm, where small blocks of data in the neighborhood of a sliding test window are processed in real time in an effort to estimate localized clutter characteristics. These real-time local clutter models are then used to cancel clutter in the sliding test window. Comparative results from three SVD-based approaches to adaptive, "self-trained" target detection algorithms are reported. These approaches are denoted as "Energy-Normalized SVD", "Condition-Statistic SVD", and "Terrain-Filtered SVD". The results indicate that the "Terrain-Filtered SVD" approach, where a pre-filter is applied in an effort to eliminate severe clutter discretes that adversely affect performance, appears promising for the purposes of developing "self-training" algorithms for applications that may require localized "on-the-fly" training due to a lack of accurate off-line training data.
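The clutter-cancellation idea common to these schemes can be sketched as a rank-reduction step; this toy example (synthetic rank-1 clutter, a single assumed clutter rank) is ours and is not any of the three named algorithms.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical local scene: clutter with strong low-rank structure (e.g. a
# dominant terrain response) plus noise, arranged as range x aperture samples.
n_rng, n_ap = 64, 64
u = rng.normal(size=(n_rng, 1))
v = rng.normal(size=(1, n_ap))
clutter = 5.0 * u @ v + 0.1 * rng.normal(size=(n_rng, n_ap))

scene = clutter.copy()
scene[32, 32] += 3.0          # weak point target buried in the clutter

# "Self-training": estimate the clutter subspace from the data itself and
# project it out (a simplified stand-in for the sliding-window SVD schemes).
U, s, Vt = np.linalg.svd(scene, full_matrices=False)
r = 1                          # assumed clutter rank
cleaned = scene - (U[:, :r] * s[:r]) @ Vt[:r, :]

# Target-to-background contrast before and after cancellation.
before = np.abs(scene[32, 32]) / np.abs(scene).mean()
after = np.abs(cleaned[32, 32]) / np.abs(cleaned).mean()
print(before, after)
```

After the dominant clutter subspace is removed, the target cell stands out from the residual, which is the effect the sliding-window variants pursue with locally estimated subspaces.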
A number of aspects of ultra-wideband radar target detection analysis and algorithm development are addressed. The first portion of the paper describes a bi-modal technique for modeling ultra-wideband radar clutter. This technique was developed based on an analysis of ultra-wideband radar phenomenology. Synthetic image samples that were generated by this modeling process are presented. This sample set is characterized by a number of physical parameters. The second portion of this paper describes an approach to developing a class of filters, known as rank-order filters, for ultra-wideband radar target detection applications. The development of a new rank-order filter denoted as a discontinuity filter is presented. Comparative target detection results are presented as a function of data model parameters. The comparative results include discontinuity filter performance versus the performance of median filtering and CFAR filtering.
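For illustration, the sketch below implements two of the baseline detectors mentioned in the comparison, a cell-averaging CFAR and a median (rank-order) background estimator, on hypothetical 1-D clutter; the window sizes and threshold scales are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 1-D range profile: exponential clutter power with one target.
n = 512
x = rng.exponential(1.0, n)
x[256] += 30.0                      # injected target return

def ca_cfar(data, guard=2, train=16, scale=8.0):
    """Cell-averaging CFAR: threshold each cell against its local mean."""
    hits = []
    for i in range(train + guard, len(data) - train - guard):
        left = data[i - guard - train : i - guard]
        right = data[i + guard + 1 : i + guard + 1 + train]
        if data[i] > scale * np.mean(np.concatenate([left, right])):
            hits.append(i)
    return hits

def median_detect(data, win=17, scale=12.0):
    """Rank-order (median) background estimate, thresholded per cell."""
    hits = []
    h = win // 2
    for i in range(h, len(data) - h):
        bg = np.median(np.r_[data[i - h : i], data[i + 1 : i + h + 1]])
        if data[i] > scale * bg:
            hits.append(i)
    return hits

print(ca_cfar(x), median_detect(x))
```

A discontinuity filter, as developed in the paper, replaces these background estimates with a rank-order statistic tuned to abrupt local changes; the comparison framework is the same.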
A number of spectral feature computations for purposes of discriminating military targets from clutter are currently under investigation within the Radar Branch at the Air Force Research Laboratory. Results from a comparative performance analysis of these features are reported. The development and analysis of spectral phase computations are of particular interest since, for some 'hard clutter' environments, the use of amplitude-based discriminants does not generate a sufficiently low false alarm rate. These phase computations are based on analysis of the Fourier phase function, the phase spectral density, and the bispectrum. Additional spectral features, such as features based on angular diversity, are also included within the scope of this investigation. The data for this investigation comprise SAR images and image chips that were collected and generated under the DARPA/Air Force Moving and Stationary Target Acquisition and Recognition (MSTAR) Program.
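As an illustration of a Fourier-phase discriminant (a toy feature of our own construction, not one of the investigated computations), the sketch below measures how concentrated the Fourier phase increments of a chip are: a dominant scatterer yields nearly linear spectral phase, while speckle does not.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical chips: a dominant point scatterer (target-like) vs. speckle
# (clutter-like), each a 1-D slice of 128 samples.
n = 128
target = np.zeros(n)
target[40] = 1.0
target = target + 0.01 * rng.normal(size=n)
clutter = rng.normal(size=n)

def phase_linearity(chip):
    """Spectral-phase feature: concentration of Fourier phase increments.

    A dominant scatterer gives a nearly linear Fourier phase, so successive
    phase differences cluster; clutter phase is erratic. Returns a value in
    [0, 1]; higher means more phase-coherent.
    """
    ph = np.angle(np.fft.fft(chip))
    d = np.diff(ph)
    return np.abs(np.mean(np.exp(1j * d)))   # circular mean resultant length

f_t = phase_linearity(target)
f_c = phase_linearity(clutter)
print(f_t, f_c)
```

Because this statistic ignores amplitude entirely, it is the kind of discriminant that can remain useful in 'hard clutter' environments where amplitude-based features saturate the false alarm rate.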