From Event: SPIE Defense + Commercial Sensing, 2023
This paper discusses how augmented reality (AR) and game engines can be used to increase situational awareness both within and beyond the battlefield. It presents a combination of software and hardware that allows forward-deployed operational forces and ground force commanders to automatically collect, share, and process data within their environment. Sharing this data produces a virtual representation of the real environment common to multiple users, from which a common operating picture can be generated and processed. The AR headset not only presents the data in a heads-up, eyes-out manner but also allows interaction within the synthetic environment. Through this virtual interaction, troops can annotate areas and people, improving non-verbal communication within a group: areas can be marked as destinations and people can be marked as enemies. The annotations and the virtualized environmental information are then processed together to determine enemy lines of sight and paths of least visibility to a destination point for all individuals wearing AR devices. This processed data is displayed back to the troops so that they can make quick and informed decisions.
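The abstract describes combining enemy annotations with a shared environment model to compute enemy lines of sight and a path of least visibility to a destination. A minimal sketch of how such a computation might work is given below, reduced to a 2-D occupancy grid for illustration; the function names, the Bresenham-based visibility test, and the Dijkstra visibility penalty are assumptions for this sketch, not the paper's actual implementation:

```python
import heapq

def line_of_sight(grid, a, b):
    # True if cell b is visible from cell a on a 0/1 occupancy grid
    # (1 = wall), traversing the line with integer Bresenham steps.
    (x0, y0), (x1, y1) = a, b
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        if (x, y) != a and grid[y][x] == 1:
            return False  # an obstacle blocks the ray
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return grid[y1][x1] == 0

def visibility_map(grid, enemies):
    # For each free cell, count how many annotated enemy positions see it.
    h, w = len(grid), len(grid[0])
    vis = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 0:
                vis[y][x] = sum(line_of_sight(grid, e, (x, y))
                                for e in enemies)
    return vis

def least_visible_path(grid, vis, start, goal, penalty=10):
    # Dijkstra where each step costs 1 plus a penalty per observing enemy,
    # yielding the path of least exposure. Assumes the goal is reachable.
    h, w = len(grid), len(grid[0])
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, (x, y) = heapq.heappop(pq)
        if (x, y) == goal:
            break
        if d > dist[(x, y)]:
            continue  # stale queue entry
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0:
                nd = d + 1 + penalty * vis[ny][nx]
                if nd < dist.get((nx, ny), float("inf")):
                    dist[(nx, ny)] = nd
                    prev[(nx, ny)] = (x, y)
                    heapq.heappush(pq, (nd, (nx, ny)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```

In the system the abstract describes, the enemy positions would come from user annotations, the occupancy grid from the shared virtual environment, and the resulting path would be rendered back into each user's headset; the `penalty` weight trades route length against exposure.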
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Emily Strube and Kayla Oates, "Seeing through the enemy's eyes," Proc. SPIE 12538, Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications V, 125380T (Presented at SPIE Defense + Commercial Sensing: May 02, 2023; Published: 12 June 2023); https://doi.org/10.1117/12.2659166.