Game engines and immersive displays
Benjamin Chang, Marc Destefano
Abstract
While virtual reality and digital games share many core technologies, the programming environments, toolkits, and workflows for developing games and VR environments are often distinct. VR toolkits designed for visualization and simulation applications often have a different feature set or design philosophy from game engines, while popular game engines often lack support for VR hardware. Extending a game engine to support systems such as the CAVE gives developers a unified development environment and the ability to easily port projects, but involves challenges beyond just adding stereo 3D visuals. In this paper we outline the issues involved in adapting a game engine for use with an immersive display system, including stereoscopy, tracking, and clustering, and present example implementation details using Unity3D. We discuss application development and workflow approaches, including camera management, rendering synchronization, GUI design, and issues specific to Unity3D, and present examples of projects created for a multi-wall, clustered, stereoscopic display.
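The camera management the abstract refers to for a tracked, multi-wall stereoscopic display is essentially the off-axis (asymmetric-frustum) projection problem. The sketch below is a minimal Unity C# illustration of that idea, not the authors' implementation: the component name, the trackedHead transform, and the wall-corner fields (lowerLeft, lowerRight, upperLeft) are assumptions for illustration, and the script simply rebuilds an OpenGL-style off-center projection matrix each frame from the tracked eye position.

using UnityEngine;

// Minimal sketch (not the paper's code): off-axis projection for one CAVE wall in Unity,
// driven by a tracked head position. Field names are assumptions for illustration.
[RequireComponent(typeof(Camera))]
public class OffAxisWallCamera : MonoBehaviour
{
    public Transform trackedHead;                         // head pose from the tracking system (assumed)
    public Vector3 lowerLeft, lowerRight, upperLeft;      // wall corners in world space (assumed)

    Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
    }

    void LateUpdate()
    {
        if (trackedHead == null) return;
        Vector3 eye = trackedHead.position;

        // Orthonormal basis of the screen (wall) plane.
        Vector3 vr = (lowerRight - lowerLeft).normalized; // screen right
        Vector3 vu = (upperLeft - lowerLeft).normalized;  // screen up
        Vector3 vn = Vector3.Cross(vr, vu).normalized;    // screen normal, pointing into the wall

        // Vectors from the eye to three screen corners.
        Vector3 va = lowerLeft - eye;
        Vector3 vb = lowerRight - eye;
        Vector3 vc = upperLeft - eye;

        float d = Mathf.Abs(Vector3.Dot(va, vn));         // perpendicular distance from eye to wall
        float n = cam.nearClipPlane;
        float f = cam.farClipPlane;

        // Asymmetric frustum extents projected onto the near plane.
        float l = Vector3.Dot(vr, va) * n / d;
        float r = Vector3.Dot(vr, vb) * n / d;
        float b = Vector3.Dot(vu, va) * n / d;
        float t = Vector3.Dot(vu, vc) * n / d;

        cam.projectionMatrix = PerspectiveOffCenter(l, r, b, t, n, f);

        // Place the camera at the tracked eye point, facing the wall plane.
        transform.position = eye;
        transform.rotation = Quaternion.LookRotation(vn, vu);
    }

    // Standard OpenGL-style off-center (glFrustum) projection matrix.
    static Matrix4x4 PerspectiveOffCenter(float l, float r, float b, float t, float n, float f)
    {
        var m = Matrix4x4.zero;
        m[0, 0] = 2f * n / (r - l);
        m[0, 2] = (r + l) / (r - l);
        m[1, 1] = 2f * n / (t - b);
        m[1, 2] = (t + b) / (t - b);
        m[2, 2] = -(f + n) / (f - n);
        m[2, 3] = -2f * f * n / (f - n);
        m[3, 2] = -1f;
        return m;
    }
}

In a clustered configuration of the kind the abstract describes, one such camera would typically be attached per wall (and per eye for stereo), with the tracked head pose distributed to each render node so that all frusta stay consistent across the display.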
© 2014 Society of Photo-Optical Instrumentation Engineers (SPIE).
Benjamin Chang and Marc Destefano, "Game engines and immersive displays", Proc. SPIE 9012, The Engineering Reality of Virtual Reality 2014, 90120G (28 February 2014); https://doi.org/10.1117/12.2042626
Proceedings paper, 10 pages.