Reflets: Combining and Revealing Spaces for Musical Performances

Florent Berthaut, Diego Martinez, Martin Hachet, and Sriram Subramanian

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

We present Reflets, a mixed-reality environment for musical performances that allows virtual content, such as 3D virtual musical interfaces or visual augmentations of instruments and performers, to be freely displayed on stage. It relies on spectators and performers revealing virtual objects by slicing through them with body parts or objects, and on planar, slightly reflective transparent panels that combine the stage and audience spaces. In this paper, we describe the approach and the implementation challenges of Reflets. We then demonstrate that it matches the requirements of musical performances: it allows virtual content to be placed anywhere on large stages, even overlapping with physical elements, and it provides a consistent rendering of this content for large numbers of spectators. It also preserves non-verbal communication between the audience and the performers, and it is inherently engaging for the spectators. Finally, we show that Reflets opens up musical performance opportunities such as augmented interaction between musicians and novel techniques for manipulating 3D sound shapes.
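
The revealing mechanism summarized above can be pictured with a minimal sketch. This is only an illustration, not the authors' implementation: it assumes the revealing body part or object is tracked as an axis-aligned box and that the virtual object is represented as a 3D point cloud, whereas the actual system relies on depth cameras and real-time rendering as described in the paper.

  # Illustrative sketch (Python/NumPy), not the Reflets implementation:
  # a virtual object is drawn only where a tracked "revealer" volume
  # (approximated here as an axis-aligned box) slices through it.
  import numpy as np

  def reveal_points(points, box_min, box_max):
      """Keep only the points of a virtual object inside the revealing box."""
      inside = np.all((points >= box_min) & (points <= box_max), axis=1)
      return points[inside]

  # Hypothetical example: a virtual object sampled as 10,000 points in a
  # unit cube, revealed by a hand-sized box swept through it.
  rng = np.random.default_rng(0)
  virtual_object = rng.uniform(-0.5, 0.5, size=(10000, 3))
  hand_min = np.array([-0.1, -0.1, -0.1])
  hand_max = np.array([0.1, 0.1, 0.1])
  visible = reveal_points(virtual_object, hand_min, hand_max)
  print(f"{len(visible)} of {len(virtual_object)} points revealed")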

Citation:

Florent Berthaut, Diego Martinez, Martin Hachet, and Sriram Subramanian. 2015. Reflets: Combining and Revealing Spaces for Musical Performances. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1179028

BibTeX Entry:

@inproceedings{fberthaut2015,
  abstract = {We present Reflets, a mixed-reality environment for musical performances that allows virtual content, such as 3D virtual musical interfaces or visual augmentations of instruments and performers, to be freely displayed on stage. It relies on spectators and performers revealing virtual objects by slicing through them with body parts or objects, and on planar, slightly reflective transparent panels that combine the stage and audience spaces. In this paper, we describe the approach and the implementation challenges of Reflets. We then demonstrate that it matches the requirements of musical performances: it allows virtual content to be placed anywhere on large stages, even overlapping with physical elements, and it provides a consistent rendering of this content for large numbers of spectators. It also preserves non-verbal communication between the audience and the performers, and it is inherently engaging for the spectators. Finally, we show that Reflets opens up musical performance opportunities such as augmented interaction between musicians and novel techniques for manipulating 3D sound shapes.},
  address = {Baton Rouge, Louisiana, USA},
  author = {Florent Berthaut and Diego Martinez and Martin Hachet and Sriram Subramanian},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1179028},
  editor = {Edgar Berdahl and Jesse Allison},
  issn = {2220-4806},
  month = {May},
  pages = {116--120},
  publisher = {Louisiana State University},
  title = {Reflets: Combining and Revealing Spaces for Musical Performances},
  url = {http://www.nime.org/proceedings/2015/nime2015_190.pdf},
  urlsuppl1 = {http://www.nime.org/proceedings/2015/190/0190-file1.mp4},
  year = {2015}
}