Multi-sensor fusion over the World Trade Center disaster site
1 September 2002
Abstract
The immense size and scope of the rescue and clean-up of the World Trade Center site created a need for data that would provide a total overview of the disaster area. To fulfill this need, the New York State Office for Technology (NYSOFT) contracted with EarthData International to collect airborne remote sensing data over Ground Zero with an airborne light detection and ranging (LIDAR) sensor, a high-resolution digital camera, and a thermal camera. The LIDAR data provided a three-dimensional elevation model of the ground surface that was used for volumetric calculations and also in the orthorectification of the digital images. The digital camera provided high-resolution imagery over the site to aid the rescuers in the placement of equipment and other assets. In addition, the digital imagery was used to georeference the thermal imagery and also provided the visual background for the thermal data. The thermal camera aided in the location and tracking of underground fires. The combination of data from these three sensors provided the emergency crews with a timely, accurate overview containing a wealth of information about the rapidly changing disaster site. Because of the dynamic nature of the site, the data were acquired on a daily basis, processed, and turned over to NYSOFT within twelve hours of collection. During processing, the three datasets were combined and georeferenced so that they could be inserted into the client's geographic information systems.
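The volumetric calculations mentioned above, performed on the LIDAR-derived elevation model, can be illustrated with a minimal sketch. This is not the authors' method; it is a generic grid-based volume estimate, in which each cell's height above a reference level is multiplied by the cell footprint area and summed. The function name and parameters are hypothetical.

```python
def grid_volume(elevations, base_level, cell_area):
    """Estimate volume above a reference plane from a gridded
    elevation model (a simple sum of cell-height * cell-area).

    elevations: 2-D list of elevation values (same units as base_level)
    base_level: reference elevation; material below it is ignored
    cell_area:  horizontal footprint of one grid cell (e.g. m^2)
    """
    total = 0.0
    for row in elevations:
        for z in row:
            # Only count material above the reference plane.
            total += max(z - base_level, 0.0) * cell_area
    return total
```

Differencing two such estimates from surveys on successive days would give a rough measure of debris removed, which is the kind of rapidly changing quantity the daily acquisitions were designed to track.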
© (2002) Society of Photo-Optical Instrumentation Engineers (SPIE)
Craig Rodarmel, Lawrence Scott, Deborah A. Simerlink, Jeffrey Walker, "Multi-sensor fusion over the World Trade Center disaster site," Optical Engineering 41(9), (1 September 2002). https://doi.org/10.1117/1.1497984
Journal article, 9 pages.

