EMMA: Enhancing Real-Time Musical Expression through Electromyographic Control
João Coimbra, Luís Aly, Henrique Portovedo, Sara Carvalho, and Tiago Bolaños
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2025
- Location: Canberra, Australia
- Track: Paper
- Pages: 250–254
- Article Number: 35
- DOI: 10.5281/zenodo.15698847
- PDF: http://nime.org/proceedings/2025/nime2025_35.pdf
- Video: https://youtu.be/Hh3rGWhMHI0
Abstract
This paper presents the Electromyographic Music Avatar (EMMA), a digital musical instrument (DMI) designed to enhance real-time sound-based composition through gestural control. Developed as part of a doctoral research project, EMMA combines electromyography (EMG) and motion sensors to capture nuanced finger, hand, and arm movements, treating each finger as an independent instrument. This approach bridges embodied performance with computational sound generation, enabling expressive and intuitive interaction. The system features a glove-based design with EMG sensors for each finger and motion detection for the wrist and arm, allowing seamless control of musical parameters. By addressing key challenges in DMI design, such as action-sound immediacy and performer-instrument dynamics, EMMA contributes to developing expressive and adaptable tools for contemporary music-making.
Citation
João Coimbra, Luís Aly, Henrique Portovedo, Sara Carvalho, and Tiago Bolaños. 2025. EMMA: Enhancing Real-Time Musical Expression through Electromyographic Control. In Proceedings of the International Conference on New Interfaces for Musical Expression, Canberra, Australia, 250–254. DOI: 10.5281/zenodo.15698847
BibTeX Entry
@inproceedings{nime2025_35,
  abstract = {This paper presents the Electromyographic Music Avatar (EMMA), a digital musical instrument (DMI) designed to enhance real-time sound-based composition through gestural control. Developed as part of a doctoral research project, EMMA combines electromyography (EMG) and motion sensors to capture nuanced finger, hand, and arm movements, treating each finger as an independent instrument. This approach bridges embodied performance with computational sound generation, enabling expressive and intuitive interaction. The system features a glove-based design with EMG sensors for each finger and motion detection for the wrist and arm, allowing seamless control of musical parameters. By addressing key challenges in DMI design, such as action-sound immediacy and performer-instrument dynamics, EMMA contributes to developing expressive and adaptable tools for contemporary music-making.},
  address = {Canberra, Australia},
  articleno = {35},
  author = {João Coimbra and Luís Aly and Henrique Portovedo and Sara Carvalho and Tiago Bolaños},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.15698847},
  editor = {Doga Cavdir and Florent Berthaut},
  issn = {2220-4806},
  month = {June},
  numpages = {5},
  pages = {250--254},
  presentation-video = {https://youtu.be/Hh3rGWhMHI0},
  title = {EMMA: Enhancing Real-Time Musical Expression through Electromyographic Control},
  track = {Paper},
  url = {http://nime.org/proceedings/2025/nime2025_35.pdf},
  year = {2025}
}