An Experimental Set of Hand Gestures for Expressive Control of Musical Parameters in Realtime

Paul Modler, Tony Myatt, and Michael Saup

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

This paper describes the implementation of Time Delay Neural Networks (TDNN) to recognize gestures from video images. Video sources are used because they are non-invasive and do not inhibit the performer's physical movement or require specialist devices to be attached to the performer, which experience has shown to be a significant problem that impacts musicians' performance and can focus musical rehearsals and performances upon technical rather than musical concerns (Myatt 2003). We describe a set of hand gestures learned by an artificial neural network to control musical parameters expressively in real time. The set is made up of different types of gestures in order to investigate:

- aspects of the recognition process
- expressive musical control
- schemes of parameter mapping
- generalization issues for an extended set for musical control

The learning procedure of the Neural Network is described, which is based on variations by affine transformations of image sequences of the hand gestures. The whole application, including the gesture capturing, is implemented in jMax to achieve real-time conditions and easy integration into a musical environment to realize different mappings and routings of the control stream. The system represents practice-based research using actual music models, such as compositions and processes of composition, which will follow the work described in the paper.

Citation:

Paul Modler, Tony Myatt, and Michael Saup. 2003. An Experimental Set of Hand Gestures for Expressive Control of Musical Parameters in Realtime. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1176533

BibTeX Entry:

@inproceedings{Modler2003,
 abstract = {This paper describes the implementation of Time Delay Neural Networks (TDNN) to recognize gestures from video images. Video sources are used because they are non-invasive and do not inhibit the performer's physical movement or require specialist devices to be attached to the performer, which experience has shown to be a significant problem that impacts musicians' performance and can focus musical rehearsals and performances upon technical rather than musical concerns (Myatt 2003). We describe a set of hand gestures learned by an artificial neural network to control musical parameters expressively in real time. The set is made up of different types of gestures in order to investigate: aspects of the recognition process, expressive musical control, schemes of parameter mapping, and generalization issues for an extended set for musical control. The learning procedure of the Neural Network is described, which is based on variations by affine transformations of image sequences of the hand gestures. The whole application, including the gesture capturing, is implemented in jMax to achieve real-time conditions and easy integration into a musical environment to realize different mappings and routings of the control stream. The system represents practice-based research using actual music models, such as compositions and processes of composition, which will follow the work described in the paper.},
 address = {Montreal, Canada},
 author = {Modler, Paul and Myatt, Tony and Saup, Michael},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 date = {22-24 May, 2003},
 doi = {10.5281/zenodo.1176533},
 issn = {2220-4806},
 keywords = {Gesture Recognition, Artificial Neural Network, Expressive Control, Real-time Interaction},
 pages = {146--150},
 title = {An Experimental Set of Hand Gestures for Expressive Control of Musical Parameters in Realtime},
 url = {http://www.nime.org/proceedings/2003/nime2003_146.pdf},
 year = {2003}
}