Neural networks for simultaneous classification and parameter estimation in musical instrument control
20 August 1992
Michael Lee, Adrian Freed, David Wessel
Abstract
In this report we present our tools for prototyping adaptive user interfaces in the context of real-time musical instrument control. Characteristic of most human communication is the simultaneous use of classified events and estimated parameters. We have integrated a neural network object into the MAX language to explore adaptive user interfaces that consider both facets of human communication. By placing the neural processing in the context of a flexible real-time musical programming environment, we can rapidly prototype experiments on applications of adaptive interfaces and learning systems to musical problems. We have trained networks to recognize gestures from a Mathews radio baton, a Nintendo Power Glove™, and a MIDI keyboard as gestural input devices. In one experiment, a network successfully extracted both classification and attribute data from gestural contours transduced by a continuous space controller, suggesting the application of such networks to the interpretation of conducting gestures and musical instrument control. We discuss network architectures, the low-level features extracted for the networks to operate on, training methods, and musical applications of adaptive techniques.
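The abstract describes a single network producing a discrete classification and a continuous parameter estimate at the same time. The sketch below is not the authors' MAX neural network object; it is a minimal stand-in in Python/NumPy showing one way such a joint output can be realized, with a shared hidden layer feeding a softmax classification head and a linear regression head. The feature count, class count, layer sizes, and synthetic training data are all invented for illustration.

```python
# Minimal sketch (not the authors' MAX object): one network that jointly
# classifies a gesture and estimates a continuous parameter.
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 8   # hypothetical low-level features from a gestural contour
N_CLASSES = 4    # hypothetical gesture categories
HIDDEN = 16

# Shared hidden layer feeding two output heads.
W1 = rng.normal(0, 0.1, (N_FEATURES, HIDDEN)); b1 = np.zeros(HIDDEN)
Wc = rng.normal(0, 0.1, (HIDDEN, N_CLASSES));  bc = np.zeros(N_CLASSES)  # class head
Wr = rng.normal(0, 0.1, (HIDDEN, 1));          br = np.zeros(1)          # parameter head

def forward(x):
    h = np.tanh(x @ W1 + b1)
    logits = h @ Wc + bc
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    param = h @ Wr + br
    return h, probs, param

# Synthetic data standing in for transduced gesture features.
X = rng.normal(size=(256, N_FEATURES))
y_class = rng.integers(0, N_CLASSES, size=256)
y_param = X[:, 0:1] * 0.5 + 0.1   # a made-up continuous attribute

lr = 0.05
for step in range(500):
    h, probs, param = forward(X)
    # Combined loss: cross-entropy for the class, squared error for the parameter.
    onehot = np.eye(N_CLASSES)[y_class]
    d_logits = (probs - onehot) / len(X)
    d_param = 2 * (param - y_param) / len(X)
    # Backpropagate both heads into the shared hidden layer.
    dWc, dbc = h.T @ d_logits, d_logits.sum(0)
    dWr, dbr = h.T @ d_param, d_param.sum(0)
    dh = d_logits @ Wc.T + d_param @ Wr.T
    dpre = dh * (1 - h ** 2)
    dW1, db1 = X.T @ dpre, dpre.sum(0)
    for p, g in ((W1, dW1), (b1, db1), (Wc, dWc), (bc, dbc), (Wr, dWr), (br, dbr)):
        p -= lr * g

_, probs, param = forward(X[:1])
print("predicted class:", int(probs.argmax()), "estimated parameter:", float(param[0, 0]))
```

In a real-time setting like the one described, the trained forward pass would run on each incoming feature vector from the controller, emitting a gesture class and a continuous attribute in a single evaluation.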
© 1992 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Michael Lee, Adrian Freed, and David Wessel, "Neural networks for simultaneous classification and parameter estimation in musical instrument control", Proc. SPIE 1706, Adaptive and Learning Systems, (20 August 1992); https://doi.org/10.1117/12.139949