Designing Gestures for Continuous Sonic Interaction

Atau Tanaka, Balandino Di Donato, Michael Zbyszynski, and Geert Roks

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

This paper presents a system that allows users to quickly try different ways of training neural networks and temporal modeling techniques to associate arm gestures with time-varying sound. We created a software framework for this and designed three interactive sounds, which we presented to participants in a workshop-based study. Building on previous work in sound-tracing and mapping-by-demonstration, we asked the participants to design gestures with which to perform the given sounds, using a multimodal device combining inertial measurement (IMU) and muscle sensing (EMG). We presented users with four techniques for associating sensor input with synthesizer parameter output. Two were classical techniques from the literature, and two proposed different ways to capture dynamic gesture in a neural network. These four techniques were: (1) a static-position regression training procedure, (2) a temporal modeler based on hidden Markov models, (3) whole-gesture capture to a neural network, and (4) a windowed method that applies the position-based procedure on the fly during the performance of a dynamic gesture. Our results show trade-offs between accurate, predictable reproduction of the source sounds and exploration of the gesture-sound space. Several users were drawn to our new windowed method, which captures gesture anchor points on the fly as training data for neural-network-based regression. This paper will be of interest to musicians who want to go from sound design to gesture design, and it offers a workflow for quickly trying different mapping-by-demonstration techniques.
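To make the windowed technique concrete, the sketch below shows one way such on-the-fly anchor-point capture and regression training could look in code. This is a minimal, hypothetical sketch, not the paper's implementation: scikit-learn's MLPRegressor stands in for the regression model, capture_anchor_points is an invented helper, and the window size and feature dimensions are illustrative assumptions.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def capture_anchor_points(sensor_frames, param_frames, window=10):
    """Pair every `window`-th sensor frame with the synthesizer
    parameters sounding at that moment, yielding training data
    collected on the fly during a dynamic gesture."""
    X, y = [], []
    for i, (s, p) in enumerate(zip(sensor_frames, param_frames)):
        if i % window == 0:  # one anchor point per window
            X.append(s)
            y.append(p)
    return np.asarray(X), np.asarray(y)

# Stand-in data: 600 frames of a 12-D IMU+EMG feature vector, and the
# 4 synthesizer parameters playing while the gesture was performed.
frames = rng.normal(size=(600, 12))
params = rng.uniform(size=(600, 4))

X, y = capture_anchor_points(frames, params, window=10)

# Regress from sensor frame to synthesizer parameters (multi-output).
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X, y)

# At performance time, each incoming frame maps continuously to parameters.
live_frame = rng.normal(size=(1, 12))
print(model.predict(live_frame))  # -> 4 predicted parameter values

In the workflow the paper describes, the same pattern would run live: the performer moves through a gesture while the designed sound plays, anchor pairs are collected each window, and the retrained model is then immediately available for continuous performance.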

Citation:

Atau Tanaka, Balandino Di Donato, Michael Zbyszynski, and Geert Roks. 2019. Designing Gestures for Continuous Sonic Interaction. In Proceedings of the International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil, pp. 180–185. DOI: 10.5281/zenodo.3672916

BibTeX Entry:

@inproceedings{Tanaka2019,
 abstract = {This paper presents a system that allows users to quickly try different ways of training neural networks and temporal modeling techniques to associate arm gestures with time-varying sound. We created a software framework for this and designed three interactive sounds, which we presented to participants in a workshop-based study. Building on previous work in sound-tracing and mapping-by-demonstration, we asked the participants to design gestures with which to perform the given sounds, using a multimodal device combining inertial measurement (IMU) and muscle sensing (EMG). We presented users with four techniques for associating sensor input with synthesizer parameter output. Two were classical techniques from the literature, and two proposed different ways to capture dynamic gesture in a neural network. These four techniques were: (1) a static-position regression training procedure, (2) a temporal modeler based on hidden Markov models, (3) whole-gesture capture to a neural network, and (4) a windowed method that applies the position-based procedure on the fly during the performance of a dynamic gesture. Our results show trade-offs between accurate, predictable reproduction of the source sounds and exploration of the gesture-sound space. Several users were drawn to our new windowed method, which captures gesture anchor points on the fly as training data for neural-network-based regression. This paper will be of interest to musicians who want to go from sound design to gesture design, and it offers a workflow for quickly trying different mapping-by-demonstration techniques.},
 address = {Porto Alegre, Brazil},
 author = {Tanaka, Atau and Di Donato, Balandino and Zbyszynski, Michael and Roks, Geert},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.3672916},
 editor = {Queiroz, Marcelo and Xambó Sedó, Anna},
 issn = {2220-4806},
 month = {June},
 pages = {180--185},
 publisher = {UFRGS},
 title = {Designing Gestures for Continuous Sonic Interaction},
 url = {http://www.nime.org/proceedings/2019/nime2019_paper036.pdf},
 year = {2019}
}