Expressive Control of Indirect Augmented Reality During Live Music Performances

Lode Hoste and Beat Signer

Proceedings of the International Conference on New Interfaces for Musical Expression

  • Year: 2013
  • Location: Daejeon, Republic of Korea
  • Pages: 13–18
  • Keywords: Expressive control, augmented reality, live music performance, 3D gesture recognition, Kinect, declarative language
  • DOI: 10.5281/zenodo.1178558
  • PDF: http://www.nime.org/proceedings/2013/nime2013_32.pdf

Abstract:

Nowadays many music artists rely on visualisations and light shows to enhance and augment their live performances. However, the visualisation and triggering of lights is normally scripted in advance and synchronised with the concert, severely limiting the artist's freedom for improvisation, expression and ad-hoc adaptation of their show. These scripts result in performances where the technology forces the artist and their music to stay in synchronisation with the pre-programmed environment. We argue that these limitations can be overcome based on emerging non-invasive tracking technologies in combination with an advanced gesture recognition engine. We present a solution that uses explicit gestures and implicit dance moves to control the visual augmentation of a live music performance. We further illustrate how our framework overcomes existing limitations of gesture classification systems by delivering a precise recognition solution based on a single gesture sample in combination with expert knowledge. The presented solution enables a more dynamic and spontaneous performance and, when combined with indirect augmented reality, results in a more intense interaction between the artist and their audience.
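
Illustrative Sketch:

The abstract's "precise recognition solution based on a single gesture sample in combination with expert knowledge" suggests one-shot template matching with hand-tuned parameters. The paper itself builds on a declarative gesture language (note the "declarative language" keyword), which is not reproduced here; the following Python sketch instead shows a generic one-shot matcher based on dynamic time warping (DTW), a common baseline for this kind of task, where expert knowledge enters as a hand-picked acceptance threshold. All names, joints and values are hypothetical illustrations, not the authors' implementation.

  # Hypothetical sketch: one-shot 3D gesture matching with dynamic time warping.
  # This is NOT the paper's declarative recognition engine; it only illustrates
  # classifying a live joint trajectory (e.g. a Kinect hand joint) against a
  # single recorded template, with an expert-chosen acceptance threshold.

  import math

  Point = tuple[float, float, float]  # (x, y, z) joint position in metres

  def dist(a: Point, b: Point) -> float:
      """Euclidean distance between two 3D points."""
      return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

  def dtw_distance(template: list[Point], live: list[Point]) -> float:
      """Classic O(n*m) DTW alignment cost between two trajectories."""
      n, m = len(template), len(live)
      INF = float("inf")
      cost = [[INF] * (m + 1) for _ in range(n + 1)]
      cost[0][0] = 0.0
      for i in range(1, n + 1):
          for j in range(1, m + 1):
              d = dist(template[i - 1], live[j - 1])
              cost[i][j] = d + min(cost[i - 1][j],      # skip a live frame
                                   cost[i][j - 1],      # skip a template frame
                                   cost[i - 1][j - 1])  # match both frames
      return cost[n][m] / (n + m)  # length-normalised alignment cost

  def matches(template: list[Point], live: list[Point],
              threshold: float = 0.15) -> bool:
      """Expert knowledge enters as the hand-tuned threshold (metres)."""
      return dtw_distance(template, live) < threshold

  # Example: trigger a (hypothetical) visual cue when a recorded "sweep"
  # gesture matches a noisy window of live tracking frames.
  if __name__ == "__main__":
      sweep_template = [(0.1 * t, 0.0, 1.5) for t in range(10)]
      live_window = [(0.1 * t + 0.02, 0.01, 1.5) for t in range(12)]
      if matches(sweep_template, live_window):
          print("gesture recognised -> trigger visual augmentation")

A single template suffices here because DTW tolerates differences in speed and length between the template and the live trajectory; the threshold is where a designer's domain knowledge would be encoded in such a scheme.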

Citation:

Lode Hoste and Beat Signer. 2013. Expressive Control of Indirect Augmented Reality During Live Music Performances. In Proceedings of the International Conference on New Interfaces for Musical Expression, 13–18. Daejeon, Republic of Korea. DOI: 10.5281/zenodo.1178558

BibTeX Entry:

  @inproceedings{Hoste2013,
 abstract = {Nowadays many music artists rely on visualisations and light shows to enhance and augment their live performances. However, the visualisation and triggering of lights is normally scripted in advance and synchronised with the concert, severely limiting the artist's freedom for improvisation, expression and ad-hoc adaptation of their show. These scripts result in performances where the technology forces the artist and their music to stay in synchronisation with the pre-programmed environment. We argue that these limitations can be overcome based on emerging non-invasive tracking technologies in combination with an advanced gesture recognition engine. We present a solution that uses explicit gestures and implicit dance moves to control the visual augmentation of a live music performance. We further illustrate how our framework overcomes existing limitations of gesture classification systems by delivering a precise recognition solution based on a single gesture sample in combination with expert knowledge. The presented solution enables a more dynamic and spontaneous performance and, when combined with indirect augmented reality, results in a more intense interaction between the artist and their audience.},
 address = {Daejeon, Republic of Korea},
 author = {Lode Hoste and Beat Signer},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1178558},
 issn = {2220-4806},
 keywords = {Expressive control, augmented reality, live music performance, 3D gesture recognition, Kinect, declarative language},
 month = {May},
 pages = {13--18},
 publisher = {Graduate School of Culture Technology, KAIST},
 title = {Expressive Control of Indirect Augmented Reality During Live Music Performances},
 url = {http://www.nime.org/proceedings/2013/nime2013_32.pdf},
 year = {2013}
}