PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following

Alejandro Van Zandt-Escobar, Baptiste Caramiaux, and Atau Tanaka

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

When performing a piece, a pianist's interpretation is communicated both through the sound produced and through body gestures. We present PiaF (Piano Follower), a prototype for augmenting piano performance by measuring gesture variations. We survey other augmented piano projects, several of which focus on gestural recognition, and present our prototype which uses machine learning techniques for gesture classification and estimation of gesture variations in real-time. Our implementation uses the Kinect depth sensor to track body motion in space, which is used as input data. During an initial learning phase, the system is taught a set of reference gestures, or templates. During performance, the live gesture is classified in real-time, and variations with respect to the recognized template are computed. These values can then be mapped to audio processing parameters, to control digital effects which are applied to the acoustic output of the piano in real-time. We discuss initial tests using PiaF with a pianist, as well as potential applications beyond live performance, including pedagogy and embodiment of recorded performance.
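
The pipeline described in the abstract (learn reference templates, classify the live gesture, estimate its variations, map those values to audio effects) can be illustrated with a minimal sketch. This is not the paper's implementation: the actual system uses the Gesture Variation Following technique named in the title, whereas the nearest-template classifier, the scale/speed features, the class and function names, and the reverb/delay mapping below are illustrative stand-ins.

    import numpy as np

    def resample(gesture, n=64):
        """Linearly resample a (frames, dims) trajectory to n frames."""
        t_old = np.linspace(0.0, 1.0, len(gesture))
        t_new = np.linspace(0.0, 1.0, n)
        return np.stack([np.interp(t_new, t_old, gesture[:, d])
                         for d in range(gesture.shape[1])], axis=1)

    def spatial_extent(gesture):
        """Diagonal of the trajectory's bounding box (a crude size feature)."""
        return float(np.linalg.norm(gesture.max(axis=0) - gesture.min(axis=0))) + 1e-9

    class TemplateFollower:
        def __init__(self):
            self.templates = {}  # name -> (resampled trajectory, original frame count)

        def learn(self, name, gesture):
            """Learning phase: store a reference gesture ('template')."""
            g = np.asarray(gesture, dtype=float)
            self.templates[name] = (resample(g), len(g))

        def follow(self, live_gesture):
            """Performance phase: classify the live gesture against the templates
            and estimate simple variations (spatial scale, speed) w.r.t. the match."""
            live = np.asarray(live_gesture, dtype=float)
            live_rs = resample(live)
            # Nearest template by mean frame-wise distance -- a stand-in for the
            # probabilistic, incremental tracking used in gesture variation following.
            name, (ref, ref_len) = min(
                self.templates.items(),
                key=lambda kv: np.mean(np.linalg.norm(live_rs - kv[1][0], axis=1)))
            variations = {
                "scale": spatial_extent(live) / spatial_extent(ref),
                "speed": ref_len / len(live),  # > 1 means faster than the template
            }
            return name, variations

    # Illustrative mapping of variations to effect parameters (not the paper's mapping):
    # larger gestures open the reverb, faster gestures shorten a delay line.
    follower = TemplateFollower()
    follower.learn("arpeggio_sweep", np.random.rand(120, 3))  # stand-in for Kinect joint data
    name, var = follower.follow(np.random.rand(90, 3) * 1.4)
    reverb_mix = float(np.clip(0.5 * var["scale"], 0.0, 1.0))
    delay_ms = float(np.clip(400.0 / var["speed"], 50.0, 800.0))
    print(name, var, reverb_mix, delay_ms)

In the system described by the paper, the input trajectories would come from Kinect skeleton tracking and the computed parameters would drive digital effects applied to the piano's acoustic output in real time.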

Citation:

Alejandro Van Zandt-Escobar, Baptiste Caramiaux, and Atau Tanaka. 2014. PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following. In Proceedings of the International Conference on New Interfaces for Musical Expression, 167–170. London, United Kingdom. DOI: 10.5281/zenodo.1178991

BibTeX Entry:

@inproceedings{avanzandt2014,
 abstract = {When performing a piece, a pianist's interpretation is communicated both through the sound produced and through body gestures. We present PiaF (Piano Follower), a prototype for augmenting piano performance by measuring gesture variations. We survey other augmented piano projects, several of which focus on gestural recognition, and present our prototype which uses machine learning techniques for gesture classification and estimation of gesture variations in real-time. Our implementation uses the Kinect depth sensor to track body motion in space, which is used as input data. During an initial learning phase, the system is taught a set of reference gestures, or templates. During performance, the live gesture is classified in real-time, and variations with respect to the recognized template are computed. These values can then be mapped to audio processing parameters, to control digital effects which are applied to the acoustic output of the piano in real-time. We discuss initial tests using PiaF with a pianist, as well as potential applications beyond live performance, including pedagogy and embodiment of recorded performance.},
 address = {London, United Kingdom},
 author = {Alejandro Van Zandt-Escobar and Baptiste Caramiaux and Atau Tanaka},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1178991},
 issn = {2220-4806},
 month = {June},
 pages = {167--170},
 publisher = {Goldsmiths, University of London},
 title = {PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following},
 url = {http://www.nime.org/proceedings/2014/nime2014_511.pdf},
 year = {2014}
}