Generating an Integrated Musical Expression with a Brain–Computer Interface

Takayuki Hamano, Tomasz Rutkowski, Hiroko Terasawa, Kazuo Okanoya, and Kiyoshi Furukawa

Proceedings of the International Conference on New Interfaces for Musical Expression

  • Year: 2013
  • Location: Daejeon, Republic of Korea
  • Pages: 49–54
  • Keywords: Brain-computer interface (BCI), qualitative and quantitative information, classification, sonification
  • DOI: 10.5281/zenodo.1178542
  • PDF: http://www.nime.org/proceedings/2013/nime2013_120.pdf

Abstract:

Electroencephalography (EEG) has been used to generate music for over 40 years, but the most recent developments in brain–computer interfaces (BCI) allow greater control and more flexible expression for using new musical instruments with EEG. We developed a real-time musical performance system using BCI technology and sonification techniques to generate imagined musical chords with organically fluctuating timbre. We aim to emulate the expressivity of traditional acoustic instruments. The BCI part of the system extracts patterns from the neural activity while a performer imagines a score of music. The sonification part of the system captures non-stationary changes in the brain waves and reflects them in the timbre by additive synthesis. In this paper, we discuss the conceptual design, system development, and the performance of this instrument.
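The abstract outlines two stages: a BCI stage that recognizes an imagined chord, and a sonification stage that maps non-stationary changes in the EEG onto timbre via additive synthesis. As a rough illustration only, the sketch below (not the authors' code; the chord, partial count, and control-signal model are all assumptions) renders a fixed chord whose partial amplitudes drift with a slowly varying control signal standing in for an EEG-derived feature.

# Minimal sketch of "additive synthesis with organically fluctuating timbre".
# This is NOT the system described in the paper; every constant and the
# control-signal model below are illustrative assumptions.
import numpy as np

SR = 44100           # sample rate (Hz)
DUR = 2.0            # seconds of audio to render
N_PARTIALS = 8       # harmonics per chord tone (assumed)
CHORD_HZ = [261.63, 329.63, 392.00]   # C major triad, stand-in for an "imagined" chord

def control_signal(n_samples, rate=4.0, seed=0):
    """Hypothetical stand-in for a non-stationary EEG feature: coarse random
    values interpolated over time, so the timbre drifts slowly."""
    rng = np.random.default_rng(seed)
    n_points = max(2, int(DUR * rate))
    coarse = rng.uniform(0.0, 1.0, n_points)
    t_coarse = np.linspace(0.0, DUR, n_points)
    t_fine = np.linspace(0.0, DUR, n_samples)
    return np.interp(t_fine, t_coarse, coarse)

def render_chord(freqs, n_samples):
    t = np.arange(n_samples) / SR
    ctrl = control_signal(n_samples)
    out = np.zeros(n_samples)
    for f0 in freqs:
        for k in range(1, N_PARTIALS + 1):
            # Base 1/k rolloff; higher partials fade in and out with the
            # control signal, giving a slowly fluctuating timbre.
            amp = (1.0 / k) * (0.3 + 0.7 * ctrl) ** (k - 1)
            out += amp * np.sin(2 * np.pi * k * f0 * t)
    return out / np.max(np.abs(out))

if __name__ == "__main__":
    audio = render_chord(CHORD_HZ, int(SR * DUR))
    print(f"rendered {audio.size} samples, peak {np.max(np.abs(audio)):.2f}")

In the actual system, the chord would come from the BCI classification of the performer's imagined score and the control signal from features of the measured brain waves, rather than from the fixed values assumed here.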

Citation:

Takayuki Hamano, Tomasz Rutkowski, Hiroko Terasawa, Kazuo Okanoya, and Kiyoshi Furukawa. 2013. Generating an Integrated Musical Expression with a Brain–Computer Interface. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178542

BibTeX Entry:

@inproceedings{Hamano2013,
  abstract = {Electroencephalography (EEG) has been used to generate music for over 40 years, but the most recent developments in brain--computer interfaces (BCI) allow greater control and more flexible expression for using new musical instruments with EEG. We developed a real-time musical performance system using BCI technology and sonification techniques to generate imagined musical chords with organically fluctuating timbre. We aim to emulate the expressivity of traditional acoustic instruments. The BCI part of the system extracts patterns from the neural activity while a performer imagines a score of music. The sonification part of the system captures non-stationary changes in the brain waves and reflects them in the timbre by additive synthesis. In this paper, we discuss the conceptual design, system development, and the performance of this instrument.},
  address = {Daejeon, Republic of Korea},
  author = {Takayuki Hamano and Tomasz Rutkowski and Hiroko Terasawa and Kazuo Okanoya and Kiyoshi Furukawa},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178542},
  issn = {2220-4806},
  keywords = {Brain-computer interface (BCI), qualitative and quantitative information, classification, sonification},
  month = {May},
  pages = {49--54},
  publisher = {Graduate School of Culture Technology, KAIST},
  title = {Generating an Integrated Musical Expression with a Brain--Computer Interface},
  url = {http://www.nime.org/proceedings/2013/nime2013_120.pdf},
  year = {2013}
}