1 May 2019 Disparate sensor real-time scene fusion for tactical environments
Current degraded visual environment (DVE) solutions primarily support aviation by augmenting a pilot’s ability to operate in a degraded environment, but the information a pilot needs to navigate safely is not necessarily in the format that should be presented to dismounted operators or to mission planners in the command and control center. There is a need for a system that generates a real-time 3D common operating picture (COP), providing enhanced mission planning capabilities as well as the real-time status and location of operating forces within that 3D COP. The capabilities and challenges that will be addressed are: 1) real-time processing of all disparate sensor data; 2) a database implementation that allows clients to query the COP for specific users, devices, timeframes, and locations; and 3) representation of the 3D COP to command and control elements as well as to forward-deployed users of HoloLens, flat-panel display, and iOS devices. The proposed Real-time Intelligence Fusion Service (RIFS) will operate in real time by receiving disparate data streams from sensors such as LiDARs, radars, and various localization methods. RIFS will then fuse these streams into a COP and send the COP to requesting clients. RIFS would allow forward-deployed personnel and commanders to maintain a high degree of real-time passive situational awareness in 3D space, ultimately increasing operational tempo and significantly mitigating risk to forward-deployed forces.
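The abstract does not specify an API for capability (2), the queryable COP database. As a hypothetical illustration only, the following sketch shows what filtering COP entries by user, device, timeframe, and location could look like; all names here (`CopEntry`, `CopStore`, `query`) are invented for this example and are not part of RIFS.

```python
from dataclasses import dataclass, field

@dataclass
class CopEntry:
    """One fused observation in the common operating picture (hypothetical schema)."""
    user: str          # operator or unit identifier
    device: str        # e.g., "hololens", "ios", "flat-panel"
    timestamp: float   # seconds since epoch
    lat: float
    lon: float
    payload: dict = field(default_factory=dict)  # fused sensor data (e.g., LiDAR/radar points)

class CopStore:
    """Hypothetical in-memory COP store illustrating the query capability."""

    def __init__(self):
        self._entries = []

    def add(self, entry: CopEntry) -> None:
        self._entries.append(entry)

    def query(self, user=None, device=None, t_start=None, t_end=None, bbox=None):
        """Return entries matching the given filters.

        bbox is (lat_min, lon_min, lat_max, lon_max); any filter left as
        None is ignored.
        """
        results = []
        for e in self._entries:
            if user is not None and e.user != user:
                continue
            if device is not None and e.device != device:
                continue
            if t_start is not None and e.timestamp < t_start:
                continue
            if t_end is not None and e.timestamp > t_end:
                continue
            if bbox is not None:
                lat_min, lon_min, lat_max, lon_max = bbox
                if not (lat_min <= e.lat <= lat_max and lon_min <= e.lon <= lon_max):
                    continue
            results.append(e)
        return results

# Example: query for one user's track within a timeframe and area.
store = CopStore()
store.add(CopEntry("alpha1", "hololens", 100.0, 36.60, -121.90))
store.add(CopEntry("alpha1", "hololens", 200.0, 36.61, -121.89))
store.add(CopEntry("bravo2", "ios",      150.0, 36.70, -121.80))
track = store.query(user="alpha1", t_start=0.0, t_end=300.0,
                    bbox=(36.5, -122.0, 36.7, -121.8))
```

A production implementation would instead use a spatiotemporal index (e.g., a spatial database) rather than a linear scan, but the filter semantics would be the same.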
Conference Presentation
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Matthew B. Selleck, David Burke, and Chase Johnston "Disparate sensor real-time scene fusion for tactical environments", Proc. SPIE 11019, Situation Awareness in Degraded Environments 2019, 110190C (1 May 2019);
