Belly of the Beast
Jacob Hedges, Daniel Garrett, and Jaxon Sharp
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2025
- Location: Canberra, Australia
- Track: Music
- Pages: 60–62
- Article Number: 16
- DOI: 10.5281/zenodo.17801073
- PDF: http://nime.org/proceedings/2025/nime2025_music_16.pdf
- Supplementary File 1: nime2025_music_16_file01.mp4
Abstract
Belly of the Beast is an interactive music VR experience that allows users to dynamically manipulate a spatial musical composition in real time by moving sounds in 3D space, rearranging the structure of the composition, and playing virtual instruments along with the music. Leveraging the hand-tracking and head-tracking capabilities of the Meta Quest 3, this work harnesses XR technologies to reimagine the way we interact with music, allowing users to act as producer, performer, and audience at once. Built using Unity and Wwise, and drawing inspiration from adaptive game music, the project empowers participants to shape the composition in real time through their movements and gestures. The composition presented is a 5–10 minute, semi-linear experience in which the user’s interactions progress the unfolding of the composition. It is a one-headset-per-user experience, designed to be playable within a 4 m × 4 m square. The intended installation does not require extensive setup, as it operates with the Meta Quest 3 tethered to a Windows computer running the experience software.
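
The abstract's core interaction, repositioning a spatialised sound source by hand in a Unity + Wwise scene, can be pictured with a minimal, hypothetical C# sketch. This is not the authors' code: the handAnchor transform, the Play_StemLayer event name, and the distance-based grab logic are all assumptions standing in for the Quest's hand-tracking data and the project's actual Wwise events.

using UnityEngine;

// Minimal sketch (assumed, not the authors' implementation): a grabbable
// sound emitter that follows a tracked hand while "held", so the user can
// move one layer of the spatial mix around the room. Assumes the Wwise
// Unity integration (AkSoundEngine / AkGameObj) and that handAnchor is
// driven externally by the headset's hand tracking.
[RequireComponent(typeof(AkGameObj))]
public class GrabbableSoundEmitter : MonoBehaviour
{
    [SerializeField] private Transform handAnchor;                    // hypothetical: tracked-hand transform
    [SerializeField] private string playEventName = "Play_StemLayer"; // hypothetical Wwise event name
    [SerializeField] private float grabRadius = 0.15f;                // grab distance in metres (assumed)

    private void Start()
    {
        // Start the spatialised loop on this emitter; Wwise positions the
        // sound via the attached AkGameObj as the GameObject moves.
        AkSoundEngine.PostEvent(playEventName, gameObject);
    }

    private void Update()
    {
        if (handAnchor == null) return;

        // Naive grab logic for illustration: the emitter is "held" while the
        // hand is within reach. A real system would use pinch or gesture
        // state from the hand-tracking API instead of raw distance.
        bool isHeld = Vector3.Distance(handAnchor.position, transform.position) < grabRadius;

        if (isHeld)
        {
            // Follow the hand, so the sound source moves with the gesture.
            transform.position = handAnchor.position;
        }
    }
}

In the installation described above, the semi-linear progression, structural rearrangement, and virtual-instrument interactions would layer on top of a pattern like this; the sketch only illustrates the spatial-repositioning idea.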
Citation
Jacob Hedges, Daniel Garrett, and Jaxon Sharp. 2025. Belly of the Beast. Proceedings of the International Conference on New Interfaces for Musical Expression, Canberra, Australia, 60–62. DOI: 10.5281/zenodo.17801073
BibTeX Entry
@inproceedings{nime2025_music_16,
abstract = {Belly of the Beast is an interactive music VR experience that allows users to dynamically manipulate a spatial musical composition in real time by moving sounds in 3D space, rearranging the structure of the composition, and playing virtual instruments along with the music. Leveraging the hand-tracking and head-tracking capabilities of the Meta Quest 3, this work harnesses XR technologies to reimagine the way we interact with music, allowing users to act as producer, performer, and audience at once. Built using Unity and Wwise, and drawing inspiration from adaptive game music, the project empowers participants to shape the composition in real time through their movements and gestures. The composition presented is a 5–10 minute, semi-linear experience in which the user’s interactions progress the unfolding of the composition. It is a one-headset-per-user experience, designed to be playable within a 4 m × 4 m square. The intended installation does not require extensive setup, as it operates with the Meta Quest 3 tethered to a Windows computer running the experience software.},
address = {Canberra, Australia},
articleno = {16},
author = {Jacob Hedges and Daniel Garrett and Jaxon Sharp},
booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
doi = {10.5281/zenodo.17801073},
editor = {Sophie Rose and Jos Mulder and Nicole Carroll},
issn = {2220-4806},
month = {June},
note = {Installation},
numpages = {3},
pages = {60--62},
title = {Belly of the Beast},
track = {Music},
url = {http://nime.org/proceedings/2025/nime2025_music_16.pdf},
urlsuppl1 = {http://nime.org/proceedings/2025/nime2025_music_16_file01.mp4},
year = {2025}
}