Acoustic-tactile rendering of visual information
8 March 2012
Abstract
In previous work, we have proposed a dynamic, interactive system for conveying visual information via hearing and touch. The system is implemented with a touch screen that allows the user to interrogate a two-dimensional (2-D) object layout by active finger scanning while listening to spatialized auditory feedback. Sound is used as the primary source of information for object localization and identification, while touch is used both for pointing and for kinesthetic feedback. Our previous work considered shape and size perception of simple objects via hearing and touch. The focus of this paper is on the perception of a 2-D layout of simple objects with identical size and shape. We consider the selection and rendition of sounds for object identification and localization. We rely on the head-related transfer function for rendering sound directionality, and consider variations of sound intensity and tempo as two alternative approaches for rendering proximity. Subjective experiments with visually-blocked subjects are used to evaluate the effectiveness of the proposed approaches. Our results indicate that intensity outperforms tempo as a proximity cue, and that the overall system for conveying a 2-D layout is quite promising.
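The abstract contrasts two candidate proximity cues: sound intensity and tempo. As a minimal sketch of that idea (not the paper's implementation; the parameter names, distance ranges, and mapping functions below are illustrative assumptions), one could map the finger-to-object distance to an intensity attenuation in dB and to a repetition rate in beats per second:

```python
import math

def proximity_cues(distance, d_ref=0.05, d_max=1.0,
                   tempo_min=2.0, tempo_max=10.0):
    """Map finger-to-object distance (normalized screen units) to the
    two proximity cues compared in the paper: intensity and tempo.
    All parameter values here are illustrative assumptions."""
    d = max(d_ref, min(distance, d_max))
    # Intensity cue: inverse-distance attenuation, in dB relative to
    # the level at the reference distance d_ref (0 dB when touching).
    gain_db = -20.0 * math.log10(d / d_ref)
    # Tempo cue: repetition rate (beats/s) increases linearly as the
    # finger approaches the object.
    closeness = (d_max - d) / (d_max - d_ref)
    tempo_bps = tempo_min + (tempo_max - tempo_min) * closeness
    return gain_db, tempo_bps

if __name__ == "__main__":
    print(proximity_cues(0.05))  # at the object: loudest, fastest
    print(proximity_cues(1.0))   # farthest: quietest, slowest
```

Either cue alone is monotone in distance; the experiments reported in the paper ask which mapping listeners actually decode more accurately, and find intensity the better of the two.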
© (2012) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Pubudu Madhawa Silva, Thrasyvoulos N. Pappas, Joshua Atkins, James E. West, William M. Hartmann, "Acoustic-tactile rendering of visual information", Proc. SPIE 8291, Human Vision and Electronic Imaging XVII, 82910M (8 March 2012); doi: 10.1117/12.916166; https://doi.org/10.1117/12.916166
Proceedings paper, 12 pages