Gestural interaction in a virtual environment
15 April 1994
Abstract
This paper discusses the use of hand gestures (i.e., changing finger flexion) within a virtual environment (VE). Many systems now employ static hand postures (i.e., static finger flexion), often coupled with hand translations and rotations, as a method of interacting with a VE. However, few systems currently use dynamically changing finger flexion for interacting with VEs. In our system, the user wears an electronically instrumented glove. We have developed a simple algorithm for recognizing gestures for use in two applications: automotive design and visualization of atmospheric data. In addition to recognizing the gestures, we also calculate the rate at which the gestures are made and the rate and direction of hand movement while making the gestures. We report on our experiences with the algorithm design and implementation, and the use of the gestures in our applications. We also describe our background work in user calibration of the glove, as well as learned and innate posture recognition (postures recognized with and without training, respectively).
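The abstract describes recognizing dynamic gestures from changing finger flexion while also tracking the rate of the gesture and the hand's velocity. The following is a minimal sketch of that idea, not the authors' actual algorithm: it assumes the glove reports per-finger flexion values normalized to [0, 1] and a hand position sampled at a fixed interval, and flags a gesture when the mean flexion changes faster than a threshold.

```python
def mean(xs):
    return sum(xs) / len(xs)

def detect_dynamic_gesture(frames, dt, rate_threshold=0.5):
    """Scan consecutive glove samples for a dynamic gesture.

    frames: list of (flexions, position) tuples, where flexions is a
            list of per-finger values in [0, 1] and position is (x, y, z).
    dt:     sampling interval in seconds.

    Returns (frame_index, flexion_rate, hand_velocity) for the first
    frame where mean flexion rises faster than rate_threshold
    (units: flexion per second), or None if no gesture is found.
    """
    for i in range(1, len(frames)):
        flex_prev, pos_prev = frames[i - 1]
        flex_cur, pos_cur = frames[i]
        # Rate of change of overall finger flexion (finite difference).
        flex_rate = (mean(flex_cur) - mean(flex_prev)) / dt
        if flex_rate > rate_threshold:
            # Hand velocity over the same interval gives the rate and
            # direction of hand movement while the gesture is made.
            velocity = tuple((c - p) / dt for c, p in zip(pos_cur, pos_prev))
            return i, flex_rate, velocity
    return None

# Hypothetical data: five fingers closing quickly while the hand moves +x.
frames = [
    ([0.1] * 5, (0.00, 0.0, 0.0)),
    ([0.2] * 5, (0.01, 0.0, 0.0)),
    ([0.8] * 5, (0.05, 0.0, 0.0)),
]
result = detect_dynamic_gesture(frames, dt=0.1)
```

A real system would filter sensor noise and match per-finger flexion trajectories against learned templates rather than a single mean-flexion threshold; this sketch only illustrates how gesture rate and hand velocity can be derived together from the same sample stream.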
© (1994) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Richard H. Jacoby, Mark Ferneau, Jim Humphries, "Gestural interaction in a virtual environment", Proc. SPIE 2177, Stereoscopic Displays and Virtual Reality Systems, (15 April 1994); https://doi.org/10.1117/12.173892
Proceedings paper, 10 pages.

