Listening to Your Brain: Implicit Interaction in Collaborative Music Performances

Sebastián Mealla, Aleksander Väljamäe, Mathieu Bosi, and Sergi Jordà

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

The use of physiological signals in Human-Computer Interaction (HCI) is becoming popular and widespread, mostly due to sensor miniaturization and advances in real-time processing. However, most studies using physiology-based interaction focus on single-user paradigms, and its use in collaborative scenarios is still in its infancy. In this paper we explore how interactive sonification of brain and heart signals, and their representation through physical objects (physiopucks) on a tabletop interface, may enhance the motivational and controlling aspects of music collaboration. A multimodal system is presented, based on an electrophysiology sensor system and the Reactable, a musical tabletop interface. Performance and motivation variables were assessed in an experiment involving a test "Physio" group (N=22) and a control "Placebo" group (N=10). Pairs of participants used two methods for sound creation: implicit interaction through physiological signals, and explicit interaction by means of gestural manipulation. The results showed that pairs in the Physio group reported less difficulty, higher confidence, and more symmetric control than the Placebo group, in which no real-time sonification was provided: subjects were unknowingly operating on pre-recorded physiological signals. These results support the feasibility of introducing physiology-based interaction into multimodal interfaces for collaborative music generation.

Citation:

Sebastián Mealla, Aleksander Väljamäe, Mathieu Bosi, and Sergi Jordà. 2011. Listening to Your Brain: Implicit Interaction in Collaborative Music Performances. Proceedings of the International Conference on New Interfaces for Musical Expression, Oslo, Norway, 149-154. DOI: 10.5281/zenodo.1178107

BibTeX Entry:

@inproceedings{Mealla2011,
 abstract = {The use of physiological signals in Human-Computer Interaction (HCI) is becoming popular and widespread, mostly due to sensor miniaturization and advances in real-time processing. However, most studies using physiology-based interaction focus on single-user paradigms, and its use in collaborative scenarios is still in its infancy. In this paper we explore how interactive sonification of brain and heart signals, and their representation through physical objects (physiopucks) on a tabletop interface, may enhance the motivational and controlling aspects of music collaboration. A multimodal system is presented, based on an electrophysiology sensor system and the Reactable, a musical tabletop interface. Performance and motivation variables were assessed in an experiment involving a test "Physio" group (N=22) and a control "Placebo" group (N=10). Pairs of participants used two methods for sound creation: implicit interaction through physiological signals, and explicit interaction by means of gestural manipulation. The results showed that pairs in the Physio group reported less difficulty, higher confidence, and more symmetric control than the Placebo group, in which no real-time sonification was provided: subjects were unknowingly operating on pre-recorded physiological signals. These results support the feasibility of introducing physiology-based interaction into multimodal interfaces for collaborative music generation.},
 address = {Oslo, Norway},
 author = {Mealla, Sebasti\'{a}n and V\"{a}ljam\"{a}e, Aleksander and Bosi, Mathieu and Jord\`{a}, Sergi},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1178107},
 issn = {2220-4806},
 keywords = {bci, collaboration, cscw, hci, multimodal interfaces, music, physiological computing, physiopucks, tabletops, universitat pompeu fabra},
 pages = {149--154},
 presentation-video = {https://vimeo.com/26806576/},
 title = {Listening to Your Brain: Implicit Interaction in Collaborative Music Performances},
 url = {http://www.nime.org/proceedings/2011/nime2011_149.pdf},
 year = {2011}
}