Multimodal event streams for virtual reality
Published: 29 January 2007
Proceedings Volume 6504, Multimedia Computing and Networking 2007; 65040M (2007)
Event: Electronic Imaging 2007, San Jose, CA, United States
Applications in the fields of virtual and augmented reality, as well as image-guided medical applications, make use of a wide variety of hardware devices. Existing frameworks for interconnecting low-level devices and high-level application programs do not exploit the full potential for processing events coming from arbitrary sources and are not easily generalizable. In this paper, we introduce a new multimodal event processing methodology that uses dynamically typed event attributes for event passing between multiple devices and systems. The existing OpenTracker framework was modified to incorporate a highly flexible and extensible event model, which can store data that is dynamically created and arbitrarily typed at runtime. The main factors impacting the library's throughput were determined, and the performance was shown to be sufficient for most typical applications. Several sample applications were developed to take advantage of the new dynamic event model provided by the library, thereby demonstrating its flexibility and expressive power.
© (2007) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
J. von Spiczak, E. Samset, S. DiMaio, G. Reitmayr, D. Schmalstieg, C. Burghart, and R. Kikinis, "Multimodal event streams for virtual reality", Proc. SPIE 6504, Multimedia Computing and Networking 2007, 65040M (29 January 2007).
