The problem of retrieving the spectral reflectance of surfaces via remote hyperspectral imaging is challenging even in benign scenarios, and becomes dramatically more difficult under complex illumination conditions. Shadows, reflections from nearby structures, and atmospheric scattering can all severely impact the radiance observed from ground-level surfaces. To study this problem, MIT Lincoln Laboratory recently conducted an airborne data collection experiment that included hyperspectral, laser radar, and panchromatic modalities. A comprehensive ground truth data set and extensive sensor characterization efforts make this data set ideal for the development of hyperspectral exploitation algorithms.
Our interest is in data registration, object recognition, and object tracking using 3D point clouds. Our feature matching system has three steps: detection, description, and matching; our focus is on the feature description step. We describe new rotation-invariant 3D feature descriptors that adapt techniques from the successful 2D SIFT descriptor. We experiment with a variety of synthetic and real data to show how well our newly developed descriptors perform relative to a commonly used 3D descriptor, spin images. Our results show that our descriptors are more distinctive than spin images while remaining rotation and translation invariant. The improvement in performance in comparison to spin images is most evident when an object has features that are mirror images of each other due to symmetry.
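The spin-image baseline mentioned above can be sketched compactly: each point is described by a 2D histogram of its neighbors' radial and axial distances relative to the point's surface normal, which makes the descriptor invariant to rotation and translation. The sketch below is illustrative only; the function name, bin counts, and support radius are our own choices, not taken from the original work.

```python
import numpy as np

def spin_image(points, p, n, bins=8, radius=1.0):
    """Minimal spin-image-style descriptor at basis point p with unit
    surface normal n. Illustrative parameters, not the paper's settings."""
    d = points - p                       # vectors from the basis point
    beta = d @ n                         # signed distance along the normal
    # Radial distance from the normal axis (clamped to avoid fp negatives).
    alpha = np.sqrt(np.maximum((d * d).sum(axis=1) - beta**2, 0.0))
    # Keep only points inside the support cylinder around the basis point.
    mask = (alpha <= radius) & (np.abs(beta) <= radius)
    hist, _, _ = np.histogram2d(alpha[mask], beta[mask], bins=bins,
                                range=[[0.0, radius], [-radius, radius]])
    return hist / max(hist.sum(), 1)     # normalize for comparability
```

Because alpha and beta depend only on distances and the projection onto the normal, rotating the whole cloud (and the basis point and normal with it) leaves the descriptor unchanged; this is also why symmetric, mirror-image features can produce identical spin images.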
Situation awareness and accurate Target Identification (TID) are critical requirements for successful battle management. Ground vehicles can be detected, tracked, and in some cases imaged using airborne or space-borne microwave radar. Obscurants such as camouflage nets and/or tree canopy foliage can degrade the performance of such radars. Foliage can be penetrated with long-wavelength microwave radar, but generally at the expense of imaging resolution. The goals of the DARPA Jigsaw program include the development and demonstration of high-resolution 3-D imaging laser radar (ladar) sensor technology and systems that can be used from airborne platforms to image and identify military ground vehicles that may be hiding under camouflage or foliage such as tree canopy. With DARPA support, MIT Lincoln Laboratory has developed a rugged and compact 3-D imaging ladar system that has successfully demonstrated the feasibility and utility of this application. The sensor system has been integrated into a UH-1 helicopter for winter and summer flight campaigns. The sensor operates day or night and produces high-resolution 3-D spatial images using short laser pulses and a focal plane array of Geiger-mode avalanche photodiode (APD) detectors with independent digital time-of-flight counting circuits at each pixel. The sensor technology includes Lincoln Laboratory developments of the microchip laser and novel focal plane arrays. The microchip laser is a passively Q-switched, frequency-doubled, solid-state Nd:YAG laser transmitting short laser pulses (300 ps FWHM) at a 16 kHz pulse rate and a 532 nm wavelength. The single-photon detection efficiency has been measured to be > 20% using these 32x32 silicon Geiger-mode APDs at room temperature. The APD saturates while providing a gain of typically > 10^6. The pulse out of the detector is used to stop a 500 MHz digital clock register integrated within the focal-plane array at each pixel.
Using the detector in this binary response mode simplifies the signal processing by eliminating the need for analog-to-digital converters and non-linearity corrections. With appropriate optics, the 32x32 array of digital time values represents a 3-D spatial image frame of the scene. Successive image frames, illuminated with the multi-kilohertz pulse-repetition-rate laser, are accumulated into range histograms to provide 3-D volume and intensity information. In this article, we describe the Jigsaw program goals, our demonstration sensor system, and the data collection campaigns, and show examples of 3-D imaging with foliage and camouflage penetration. Other applications for this 3-D imaging direct-detection ladar technology include robotic vision, navigation of autonomous vehicles, manufacturing quality control, industrial security, and topography.
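The per-pixel timing scheme described above reduces to two simple operations: converting a stop-clock count to a round-trip range, and accumulating many single-photon frames into a per-pixel range histogram whose peak gives the range estimate. The sketch below is illustrative only; the 500 MHz clock rate and 32x32 array size come from the text, while the function names, the `-1` no-detection convention, and the histogram depth are our own assumptions.

```python
import numpy as np

C = 2.99792458e8      # speed of light, m/s
F_CLK = 500e6         # per-pixel digital clock rate from the text, Hz

def counts_to_range(counts):
    """Convert stop-clock counts to one-way range in metres.
    Round trip: r = c * t / 2, with t = counts / F_CLK."""
    return C * (counts / F_CLK) / 2.0

def accumulate_range_histogram(frames, max_count=512):
    """Accumulate single-photon frames (each a 32x32 array of clock
    counts, -1 meaning no detection) into a per-pixel histogram."""
    hist = np.zeros(frames[0].shape + (max_count,), dtype=np.int32)
    for f in frames:
        ys, xs = np.nonzero(f >= 0)          # pixels that fired
        hist[ys, xs, f[f >= 0]] += 1         # tally the count value
    return hist

def range_image(hist):
    """Per-pixel range estimate: the peak of each pixel's histogram."""
    return counts_to_range(hist.argmax(axis=-1))
```

Note that at 500 MHz one clock count corresponds to roughly 0.3 m of range, which is why many frames are histogrammed together rather than trusting any single detection.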
MIT Lincoln Laboratory continues the development of novel high-resolution 3D imaging laser radar technology and sensor systems. The sensor system described in detail here uses a passively Q-switched, frequency-doubled, solid-state Nd:YAG laser to transmit short laser pulses (~700 ps FWHM) at a 532 nm wavelength and derives the range to each target surface element by measuring the time-of-flight at each pixel. The single-photoelectron detection efficiency has been measured to be > 20% using these silicon Geiger-mode APDs at room temperature. The pulse out of the detector is used to stop a > 500 MHz digital clock integrated within the focal-plane array. With appropriate optics, the 32x32 array of digital time values represents a 3D spatial image frame of the scene. Successive image frames from the multi-kilohertz pulse-repetition-rate laser are accumulated into range histograms to provide 3D volume and intensity information. In this paper, we report on a prototype sensor system that has recently been developed using new 32x32 arrays of Geiger-mode APDs with 0.35 μm CMOS digital timing circuits at each pixel. Here we describe the sensor system development and present recent measurements of laboratory test data and field imagery.
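As a back-of-the-envelope check on why frame accumulation works despite a ~20% single-photoelectron detection efficiency, one can treat each laser pulse as an independent detection opportunity. This is a simplifying assumption of ours (it ignores background noise, detector dead time, and return-signal variation), but it shows how quickly the cumulative detection probability climbs:

```python
def cumulative_detection_probability(pde, n_pulses):
    """Probability of at least one detection over n_pulses independent
    laser pulses, given a per-pulse detection probability pde.
    Simplified model: ignores noise, dead time, and signal variation."""
    return 1.0 - (1.0 - pde) ** n_pulses
```

At 20% per pulse, about 21 pulses already give better than 99% probability of at least one detection per pixel; at a multi-kilohertz pulse rate that corresponds to only a few milliseconds of dwell.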