The U.S. Army Research Laboratory (ARL) has been investigating the utility of ultra-wideband (UWB) synthetic
aperture radar (SAR) technology for detecting concealed targets in various applications. We have designed and built a
vehicle-based, low-frequency UWB SAR for proof-of-concept demonstrations of detecting obstacles for
autonomous navigation, detecting concealed targets (e.g., mines), and mapping internal building structures to locate
enemy activity. Although low-frequency UWB radar provides valuable information that complements other
technologies because of its penetration capability, the radar imagery is difficult to interpret, and correlating the
radar's detection list with objects in the real world is challenging.
Using augmented reality (AR) technology, we can superimpose information from the radar onto a video
image of the real world in real time. With such a system, Soldiers would view the environment and the superimposed graphics
(SAR imagery, detection locations, a digital map, etc.) on a standard display or a head-mounted display. The
superimposed information would be continuously updated to match the user's perspective and movement.
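The core of this per-view adjustment is geo-registration: mapping a detection's fixed world coordinates into the pixel coordinates of the current video frame. The sketch below is purely illustrative and is not ARL's implementation; it assumes a simple pinhole camera model with a yaw-only (heading) orientation, and all function and parameter names are hypothetical.

```python
import math

def project_detection(p_world, cam_pos, yaw, f_px, cx, cy):
    """Project a world-coordinate radar detection into pixel coordinates.

    p_world, cam_pos : (x, y, z) world positions of the detection and camera
    yaw              : camera heading in radians about the vertical axis
    f_px             : focal length in pixels
    cx, cy           : principal point (image center) in pixels
    Returns (u, v) pixel coordinates, or None if the point is behind the camera.
    """
    # Offset of the detection relative to the camera, in world coordinates.
    dx = p_world[0] - cam_pos[0]
    dy = p_world[1] - cam_pos[1]
    dz = p_world[2] - cam_pos[2]

    # Rotate by -yaw so the camera's optical axis lies along +x.
    c, s = math.cos(-yaw), math.sin(-yaw)
    x_cam = c * dx - s * dy   # depth along the optical axis
    y_cam = s * dx + c * dy   # lateral offset
    z_cam = dz                # vertical offset

    if x_cam <= 0:
        return None           # behind the camera; nothing to overlay

    # Pinhole projection: pixel offset scales with focal length / depth.
    u = cx - f_px * (y_cam / x_cam)
    v = cy - f_px * (z_cam / x_cam)
    return (u, v)
```

Re-running this projection for every detection on every frame, as the camera pose changes, is what keeps the overlaid graphics locked to their real-world positions. A full system would use a complete 6-DOF pose (roll and pitch as well as yaw) and lens-distortion correction, which are omitted here for brevity.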
ARL has been collaborating with ITT Industries to implement an AR system that integrates video data captured from
the real world with information from the UWB radar. ARL conducted an experiment that demonstrated real-time
geo-registration of the two independent data streams, and integration of the AR subsystem into the radar system is
underway. This paper presents the integration of the AR and SAR systems and shows results that include the real-time
embedding of SAR imagery and other information into the video data stream.