Toward an Emotionally Intelligent Piano: Real-Time Emotion Detection and Performer Feedback via Kinesthetic Sensing in Piano Performance

Matan Ben-Asher and Colby Leider

Proceedings of the International Conference on New Interfaces for Musical Expression

  • Year: 2013
  • Location: Daejeon, Republic of Korea
  • Pages: 21–24
  • Keywords: Motion Sensors, IMUs, Expressive Piano Performance, Machine Learning, Computer Music, Music and Emotion
  • DOI: 10.5281/zenodo.1178474
  • PDF: http://www.nime.org/proceedings/2013/nime2013_48.pdf

Abstract:

A system is presented for detecting common gestures, musical intentions and emotions of pianists in real-time using only kinesthetic data retrieved by wireless motion sensors. The algorithm can detect common Western musical structures such as chords, arpeggios, scales, and trills as well as musically intended emotions: cheerful, mournful, vigorous, dreamy, lyrical, and humorous, completely and solely based on low-sample-rate motion sensor data. The algorithm can be trained per performer in real-time or can work based on previous training sets. The system maps the emotions to a color set and presents them as a flowing emotional spectrum on the background of a piano roll. This acts as a feedback mechanism for emotional expression as well as an interactive display of the music. The system was trained and tested on a number of pianists and it classified structures and emotions with promising results of up to 92% accuracy.
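
The abstract describes the pipeline only in outline. As a rough illustration of the approach it sketches, the following Python example (not the authors' code) trains a per-performer classifier on windowed features from a low-sample-rate accelerometer stream and maps each predicted emotion to a display color. The window size, the summary features, the k-nearest-neighbors classifier, and the RGB color set are all illustrative assumptions, and the training data here is synthetic.

  # Illustrative sketch only: features, classifier, and colors are assumptions.
  import numpy as np
  from sklearn.neighbors import KNeighborsClassifier

  EMOTIONS = ["cheerful", "mournful", "vigorous", "dreamy", "lyrical", "humorous"]

  # Hypothetical emotion-to-RGB mapping for the piano-roll background;
  # the paper's actual color set is not given in the abstract.
  EMOTION_COLORS = {
      "cheerful": (255, 220, 0),   "mournful": (40, 60, 160),
      "vigorous": (200, 30, 30),   "dreamy":   (150, 120, 220),
      "lyrical":  (60, 180, 140),  "humorous": (255, 140, 0),
  }

  def window_features(accel, win=32):
      """Summarize an (n, 3) accelerometer stream into per-window statistics
      (per-axis mean and std, peak magnitude) as stand-in kinesthetic features."""
      feats = []
      for start in range(0, len(accel) - win + 1, win):
          w = accel[start:start + win]
          mag = np.linalg.norm(w, axis=1)
          feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0), [mag.max()]]))
      return np.array(feats)

  # Synthetic calibration takes: one short recording per intended emotion,
  # standing in for the per-performer training the abstract mentions.
  rng = np.random.default_rng(0)
  X_train, y_train = [], []
  for i, label in enumerate(EMOTIONS):
      stream = rng.normal(scale=1.0 + 0.3 * i, size=(320, 3))
      feats = window_features(stream)
      X_train.append(feats)
      y_train += [label] * len(feats)
  X_train = np.vstack(X_train)

  clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

  # "Real-time" use: classify each incoming window and emit its display color.
  live = rng.normal(scale=1.5, size=(96, 3))
  for f in window_features(live):
      emotion = clf.predict(f.reshape(1, -1))[0]
      print(emotion, EMOTION_COLORS[emotion])

In an actual deployment the windows would arrive from the wireless motion sensors and the predicted colors would be blended into the piano-roll background rather than printed.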

Citation:

Matan Ben-Asher and Colby Leider. 2013. Toward an Emotionally Intelligent Piano: Real-Time Emotion Detection and Performer Feedback via Kinesthetic Sensing in Piano Performance. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178474

BibTeX Entry:

  @inproceedings{BenAsher2013,
    abstract = {A system is presented for detecting common gestures, musical intentions and emotions of pianists in real-time using only kinesthetic data retrieved by wireless motion sensors. The algorithm can detect common Western musical structures such as chords, arpeggios, scales, and trills as well as musically intended emotions: cheerful, mournful, vigorous, dreamy, lyrical, and humorous, completely and solely based on low-sample-rate motion sensor data. The algorithm can be trained per performer in real-time or can work based on previous training sets. The system maps the emotions to a color set and presents them as a flowing emotional spectrum on the background of a piano roll. This acts as a feedback mechanism for emotional expression as well as an interactive display of the music. The system was trained and tested on a number of pianists and it classified structures and emotions with promising results of up to 92\% accuracy.},
    address = {Daejeon, Republic of Korea},
    author = {Matan Ben-Asher and Colby Leider},
    booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
    doi = {10.5281/zenodo.1178474},
    issn = {2220-4806},
    keywords = {Motion Sensors, IMUs, Expressive Piano Performance, Machine Learning, Computer Music, Music and Emotion},
    month = {May},
    pages = {21--24},
    publisher = {Graduate School of Culture Technology, KAIST},
    title = {Toward an Emotionally Intelligent Piano: Real-Time Emotion Detection and Performer Feedback via Kinesthetic Sensing in Piano Performance},
    url = {http://www.nime.org/proceedings/2013/nime2013_48.pdf},
    year = {2013}
  }