BRAAHMS: A Novel Adaptive Musical Interface Based on Users' Cognitive State

Beste Filiz Yuksel, Daniel Afergan, Evan Peck, Garth Griffin, Lane Harrison, Nick Chen, Remco Chang, and Robert Jacob

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

We present a novel brain-computer interface (BCI) integrated with a musical instrument that adapts implicitly (with no extra effort from the user) to users' changing cognitive state during musical improvisation. Most previous musical BCI systems either map brainwaves directly to audio signals or use explicit brain signals to control some aspect of the music. Such systems do not take advantage of higher-level, semantically meaningful brain data that could be used in adaptive systems without detracting from the user's attention. We present a new type of real-time BCI that assists users in musical improvisation by adapting implicitly to users' measured cognitive workload. Our system advances the state of the art in this area in three ways: 1) We demonstrate that cognitive workload can be classified in real time, using functional near-infrared spectroscopy, while users play the piano. 2) We build a real-time, implicit system that uses this brain signal to adapt musically to what users are playing. 3) We demonstrate that users prefer this novel musical instrument over other conditions and report that they feel more creative.

Citation:

Beste Filiz Yuksel, Daniel Afergan, Evan Peck, Garth Griffin, Lane Harrison, Nick Chen, Remco Chang, and Robert Jacob. 2015. BRAAHMS: A Novel Adaptive Musical Interface Based on Users' Cognitive State. In Proceedings of the International Conference on New Interfaces for Musical Expression, 136–139. Baton Rouge, Louisiana, USA: Louisiana State University. DOI: 10.5281/zenodo.1181418

BibTeX Entry:

@inproceedings{byuksel2015,
 abstract = {We present a novel brain-computer interface (BCI) integrated with a musical instrument that adapts implicitly (with no extra effort from the user) to users' changing cognitive state during musical improvisation. Most previous musical BCI systems either map brainwaves directly to audio signals or use explicit brain signals to control some aspect of the music. Such systems do not take advantage of higher-level, semantically meaningful brain data that could be used in adaptive systems without detracting from the user's attention. We present a new type of real-time BCI that assists users in musical improvisation by adapting implicitly to users' measured cognitive workload. Our system advances the state of the art in this area in three ways: 1) We demonstrate that cognitive workload can be classified in real time, using functional near-infrared spectroscopy, while users play the piano. 2) We build a real-time, implicit system that uses this brain signal to adapt musically to what users are playing. 3) We demonstrate that users prefer this novel musical instrument over other conditions and report that they feel more creative.},
 address = {Baton Rouge, Louisiana, USA},
 author = {{Beste Filiz} Yuksel and Daniel Afergan and Evan Peck and Garth Griffin and Lane Harrison and Nick Chen and Remco Chang and Robert Jacob},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1181418},
 editor = {Edgar Berdahl and Jesse Allison},
 issn = {2220-4806},
 month = {May},
 pages = {136--139},
 publisher = {Louisiana State University},
 title = {BRAAHMS: A Novel Adaptive Musical Interface Based on Users' Cognitive State},
 url = {http://www.nime.org/proceedings/2015/nime2015_243.pdf},
 urlsuppl1 = {http://www.nime.org/proceedings/2015/243/0243-file1.mp4},
 year = {2015}
}