The US Army Research Laboratory (ARL) has recently developed the Spectrally Agile Frequency-Incrementing Reconfigurable (SAFIRE) radar system as part of its ongoing research into ground-vehicular standoff detection and classification of obscured and/or buried explosive hazards. The system is a stepped-frequency radar (SFR) that can be reconfigured to omit operation within specific sub-bands of its 1700 MHz operating band (300 MHz to 2000 MHz). It employs two transmit antennas and an array of 16 receive antennas; the antenna types are quad-ridged horn and Vivaldi, respectively. The system is vehicle-mounted and can be interchanged between forward- and side-looking configurations. To assess the performance of the SAFIRE radar system in a realistic deployment scenario, ARL has collected SAFIRE data using militarily relevant threats at an arid US Army test site. This paper presents an examination of radar imagery from these data collection campaigns. The image formation techniques are discussed and recently processed radar imagery is provided, followed by a summary of the radar's performance and recommendations for further improvements.
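The abstract above does not detail the image formation chain, but a stepped-frequency radar such as SAFIRE conventionally recovers a coarse range profile from its per-step returns before any imaging step. The following is a minimal sketch of that range-compression idea, assuming uniformly spaced steps across the quoted 300 MHz to 2000 MHz band; the function name, variable names, and synthetic data are illustrative assumptions rather than part of the SAFIRE processing software.

```python
import numpy as np

def range_profile(step_returns, f_start_hz=300e6, f_stop_hz=2000e6):
    """Convert complex stepped-frequency returns (one sample per frequency
    step) into a coarse range profile via an inverse DFT.

    Assumes uniformly spaced steps across the quoted 300 MHz-2 GHz band;
    excised (omitted) steps can be passed as zeros.
    """
    n = len(step_returns)
    df = (f_stop_hz - f_start_hz) / (n - 1)      # frequency step size (Hz)
    c = 299_792_458.0                            # speed of light (m/s)
    profile = np.fft.ifft(step_returns)          # range compression
    max_range = c / (2 * df)                     # unambiguous range (m)
    ranges = np.linspace(0.0, max_range, n, endpoint=False)
    return ranges, np.abs(profile)

# Example with 1701 steps of 1 MHz and synthetic unit-amplitude returns
returns = np.exp(2j * np.pi * np.random.rand(1701))
ranges, magnitude = range_profile(returns)
```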
The Spectrally Agile Frequency-Incrementing Reconfigurable (SAFIRE) mobile radar system was developed and exercised at an arid U.S. test site. The system can detect hidden targets using radar, a global positioning system (GPS), dual stereo color cameras, and dual stereo thermal cameras. An Augmented Reality (AR) software interface allows the user to see a single fused video stream containing the synthetic aperture radar (SAR), color, and thermal imagery. The stereo sensors allow the AR system to display both fused 2D imagery and 3D metric reconstructions, in which the user can "fly" around the 3D model and switch between the modalities.
The U.S. Army Research Laboratory has developed the Spectrally Agile Frequency-Incrementing Reconfigurable (SAFIRE) radar, which is capable of imaging concealed/buried targets using forward- and side-looking configurations. The SAFIRE radar is vehicle-mounted and operates from 300 MHz to 2 GHz; the frequency step size can be adjusted in multiples of 1 MHz. It is also spectrally agile and capable of excising frequency bands, which makes it ideal for operation in congested and/or contested radio frequency (RF) environments. Furthermore, the SAFIRE radar receiver has a super-heterodyne architecture, designed so that intermodulation products caused by interfering signals can be easily filtered from the desired received signal. The SAFIRE system also includes electro-optical (EO) and infrared (IR) cameras, whose imagery can be fused with the radar data and displayed in a stereoscopic augmented reality user interface. In this paper, recent upgrades to the SAFIRE system are discussed and results from the SAFIRE's initial field tests are presented.
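To make the spectral agility concrete, the sketch below builds a stepped-frequency transmit plan over the 300 MHz to 2 GHz band in 1 MHz multiples while excising caller-specified sub-bands, in the spirit of the band-excision capability described above. It is a hypothetical illustration, not the SAFIRE control software, and the excised bands shown are arbitrary examples.

```python
import numpy as np

def build_step_plan(step_mhz=1, excised_bands_mhz=((960, 1215),)):
    """Return the transmit frequencies (MHz) for a stepped-frequency sweep
    from 300 MHz to 2000 MHz, skipping any excised sub-bands.

    step_mhz is a multiple of 1 MHz, matching the step granularity quoted
    above; the excised bands used here are arbitrary examples.
    """
    freqs = np.arange(300, 2000 + step_mhz, step_mhz, dtype=float)
    for lo, hi in excised_bands_mhz:
        freqs = freqs[(freqs < lo) | (freqs > hi)]   # drop the excised band
    return freqs

plan = build_step_plan(step_mhz=1,
                       excised_bands_mhz=[(960, 1215), (1559, 1610)])
print(f"{plan.size} frequency steps retained")
```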
The Synchronous Impulse Reconstruction (SIRE) forward-looking radar, developed by the U.S. Army Research Laboratory (ARL), can detect concealed targets using ultra-wideband synthetic aperture technology. The SIRE radar has been mounted on a Ford Expedition and combined with other sensors, including a pan/tilt/zoom camera, to test its concealed-target detection capabilities in a realistic environment. Augmented Reality (AR) can be used to combine the SIRE radar image with the live camera stream into one view, which provides the user with information that is quicker to assess and easier to understand than either source viewed separately.
In this paper we present an AR system that uses a global positioning system (GPS) and inertial measurement unit (IMU) to overlay a SIRE radar image onto a live video stream. We describe a method for transforming 3D world points in the UTM coordinate system onto the video stream by calibrating for the intrinsic parameters of the camera. This calibration is performed offline to save computation time and achieve real-time performance. Because the intrinsic parameters depend on the camera's zoom, we calibrate at eleven different zoom settings and interpolate between them. We show the results of a real-time transformation of the SAR imagery onto the video stream. Finally, we quantify both the 2D error and 3D residue associated with our transformation and show that the amount of error is reasonable for our application.
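The overlay described above amounts to the standard pinhole projection of UTM world points through zoom-dependent intrinsics. The sketch below illustrates that idea under stated assumptions: intrinsic matrices pre-calibrated offline at discrete zoom settings, linear interpolation between them, and a GPS/IMU-derived rotation and translation; all function names and values are hypothetical and not drawn from the authors' implementation.

```python
import numpy as np

def interp_intrinsics(zoom, zoom_levels, K_calibrated):
    """Linearly interpolate a 3x3 intrinsic matrix between the nearest
    calibrated zoom settings (the calibration itself is done offline)."""
    zoom_levels = np.asarray(zoom_levels, dtype=float)
    Ks = np.asarray(K_calibrated, dtype=float)        # shape (N, 3, 3)
    # Interpolate each matrix entry independently over zoom.
    return np.array([[np.interp(zoom, zoom_levels, Ks[:, i, j])
                      for j in range(3)] for i in range(3)])

def project_utm_points(points_utm, K, R, t):
    """Project Nx3 UTM world points into pixel coordinates with the
    pinhole model: x ~ K [R | t] X."""
    pts_cam = (R @ points_utm.T).T + t                # world -> camera frame
    uvw = (K @ pts_cam.T).T                           # camera -> image plane
    return uvw[:, :2] / uvw[:, 2:3]                   # perspective divide

# Hypothetical usage: two calibrated zoom settings, identity pose at the origin
zoom_levels = [1.0, 2.0]
K_calibrated = [np.diag([1000.0, 1000.0, 1.0]), np.diag([2000.0, 2000.0, 1.0])]
K = interp_intrinsics(1.5, zoom_levels, K_calibrated)
R, t = np.eye(3), np.zeros(3)
pixels = project_utm_points(np.array([[1.0, 2.0, 10.0]]), K, R, t)
```

Entry-wise linear interpolation of the intrinsic matrix is only a simple stand-in for whatever interpolation scheme the authors use across their eleven calibrated zooms; it is shown here solely to convey the structure of the projection.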