AirStream: A Collaborative Gestural Virtual Performance
Alon Ilsar and Matthew Hughes
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2021
- Location: Shanghai, China
- Track: Music
- Article Number: 14
- DOI: 10.21428/92fbeb44.84c23721
Abstract
For this collaborative performance, two musicians will improvise together over the internet using custom gestural controllers, the AirSticks. The AirSticks utilise off-the-shelf VR controllers and bespoke software to trigger and manipulate sound and graphics through hand movements. 3D point clouds, captured using commodity depth sensors, are streamed in real time into a shared virtual stage. The performers’ gestures create and manipulate the audio-visual environment, as the ‘VJ’ curates the audience’s porthole into the space. With the rise of online musical experiences over Zoom, this performance brings a new 3D flavour for both the musicians and the audience. Audio and graphics latency is reduced by sending MIDI and OSC data over the internet and rendering the sound on each end, while images streamed from the depth cameras utilise state-of-the-art compression algorithms. Future work will be dedicated to allowing the audience to enter the virtual space using VR or AR.
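The latency strategy described in the abstract, sending compact MIDI/OSC control data across the network rather than rendered audio, can be illustrated with a minimal OSC message encoder. This is a sketch of the OSC 1.0 wire format (null-padded strings, type tag string, big-endian float32 arguments) using only the Python standard library; the `/airsticks/hand` address and the float arguments are hypothetical examples, not taken from the actual AirSticks software.

```python
import struct

def osc_string(s: str) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((-len(b)) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Minimal OSC message: address pattern, type tag string (",f..."),
    # then each argument as a big-endian 32-bit float.
    tags = "," + "f" * len(floats)
    payload = b"".join(struct.pack(">f", v) for v in floats)
    return osc_string(address) + osc_string(tags) + payload

# Hypothetical gesture message; it could be sent over UDP with socket.sendto().
msg = osc_message("/airsticks/hand", 0.5, -0.25, 1.0)
```

A packet like this is a few dozen bytes, versus kilobytes per buffer for streamed audio, which is why rendering the sound locally on each end keeps the perceived latency low.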
Citation
Alon Ilsar and Matthew Hughes. 2021. AirStream: A Collaborative Gestural Virtual Performance. Proceedings of the International Conference on New Interfaces for Musical Expression. Shanghai, China. DOI: 10.21428/92fbeb44.84c23721
BibTeX Entry
@inproceedings{nime2021_music_14,
abstract = {For this collaborative performance, two musicians will improvise together over the internet using custom gestural controllers, the AirSticks. The AirSticks utilise off-the-shelf VR controllers and bespoke software to trigger and manipulate sound and graphics through hand movements. 3D point clouds, captured using commodity depth sensors, are streamed in real time into a shared virtual stage. The performers’ gestures create and manipulate the audio-visual environment, as the ‘VJ’ curates the audience’s porthole into the space. With the rise of online musical experiences over Zoom, this performance brings a new 3D flavour for both the musicians and the audience. Audio and graphics latency is reduced by sending MIDI and OSC data over the internet and rendering the sound on each end, while images streamed from the depth cameras utilise state-of-the-art compression algorithms. Future work will be dedicated to allowing the audience to enter the virtual space using VR or AR.},
address = {Shanghai, China},
articleno = {14},
author = {Alon Ilsar and Matthew Hughes},
booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
doi = {10.21428/92fbeb44.84c23721},
editor = {Eric Parren and Wei Chen},
issn = {2220-4806},
month = {June},
title = {AirStream: A Collaborative Gestural Virtual Performance},
track = {Music},
url = {https://doi.org/10.21428/92fbeb44.84c23721},
year = {2021}
}