Conversion of sensor data for real-time scene generation
1 August 1991
To perform real-time signal processing and data analysis on fused sensor data while optimizing the use of existing hardware, the scene information usually has to be converted into a particular format. This format conversion is generally viewed as part of the sensor fusion process, but in this paper it is treated as a separate entity. In other words, the focus is on building a representation of the environment that lends itself directly to real-time, truly three-dimensional processing with higher-level path planning in mind. An example scenario is the use of the data as input to terrain-following and terrain-avoidance algorithms, where the output of the sensor data processing is a world model that applies directly to intersect analysis and evaluation. The intersect processing is performed in a hardware unit called the TIGER (three-dimensional intersect and geometrical evaluator in real time). The TIGER is based on VHSIC (very-high-speed integrated-circuit) technology and performs intersect calculations at rates on the order of millions of objects per second using a particular three-dimensional object format. This hardware subsystem is designed to be useful for a wide range of airborne, underwater, and space applications. To accommodate a broad range of sensor types, the architecture is generic, and it has potential applications in relieving selected sensor-fusion computational bottlenecks as well. Standard interfaces simplify coupling the subsystem to a variety of host processor systems. The TIGER hardware has been built and tested extensively.
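The abstract does not disclose TIGER's actual three-dimensional object format or its intersect algorithm, so as a purely illustrative sketch, one might assume terrain cells stored as axis-aligned bounding boxes (AABBs) and test a candidate flight-path segment against each cell with the standard slab method; the function and parameter names below are hypothetical, not taken from the paper.

```python
def segment_intersects_aabb(p0, p1, box_min, box_max):
    """Return True if the line segment from p0 to p1 intersects the
    axis-aligned box [box_min, box_max] (slab method).

    Illustrative only: stands in for the kind of intersect test a
    terrain-avoidance evaluator might run per object; TIGER's real
    format and algorithm are not described in the abstract.
    """
    t_enter, t_exit = 0.0, 1.0  # parametric interval along the segment
    for axis in range(3):
        d = p1[axis] - p0[axis]
        lo = box_min[axis] - p0[axis]
        hi = box_max[axis] - p0[axis]
        if abs(d) < 1e-12:
            # Segment parallel to this slab: reject unless it lies
            # between the two bounding planes on this axis.
            if lo > 0.0 or hi < 0.0:
                return False
        else:
            t0, t1 = lo / d, hi / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_enter = max(t_enter, t0)
            t_exit = min(t_exit, t1)
            if t_enter > t_exit:
                return False  # slabs' intervals do not overlap
    return True
```

In a terrain-avoidance setting, a host processor would stream candidate path segments and terrain objects through such a test and flag any segment that intersects an obstacle; the appeal of a dedicated hardware evaluator is that this inner loop, trivially parallel across objects, can run at the millions-of-objects-per-second rates the abstract cites.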
© 1991 Society of Photo-Optical Instrumentation Engineers (SPIE).
Vibeke Libby and R. Keith Bardin, "Conversion of sensor data for real-time scene generation", Proc. SPIE 1470, Data Structures and Target Classification, (1 August 1991); doi: 10.1117/12.28805; https://doi.org/10.1117/12.28805

