Spire Muse: A Virtual Musical Partner for Creative Brainstorming

Notto J. W. Thelle and Philippe Pasquier

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

We present Spire Muse, a co-creative musical agent that engages in different kinds of interactive behaviors. The software utilizes corpora of solo instrumental performances encoded as self-organized maps and outputs slices of the corpora as concatenated, remodeled audio sequences. Transitions between behaviors can be automated, and the interface enables the negotiation of these transitions through feedback buttons that signal approval, force reversions to previous behaviors, or request change. Musical responses are embedded in a pre-trained latent space, emergent in the interaction, and influenced through the weighting of rhythmic, spectral, harmonic, and melodic features. The training and run-time modules utilize a modified version of the MASOM agent architecture. Our model stimulates spontaneous creativity and reduces the need for the user to sustain analytical mind frames, thereby optimizing flow. The agent traverses a system autonomy axis ranging from reactive to proactive, which includes the behaviors of shadowing, mirroring, and coupling. A fourth behavior—negotiation—is emergent from the interface between agent and user. The synergy of corpora, interactive modes, and influences induces musical responses along a musical similarity axis from converging to diverging. We share preliminary observations from experiments with the agent and discuss design challenges and future prospects.
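The abstract's weighting of rhythmic, spectral, harmonic, and melodic features to steer the agent's responses can be pictured with a minimal sketch. The code below is a hypothetical illustration, not the authors' implementation: it assumes a corpus of pre-analyzed audio slices (in the actual system these are organized on self-organizing maps) and shows a weighted nearest-match lookup; the array shapes, function name, and weight values are invented for the example.

# Illustrative sketch only: not the Spire Muse / MASOM implementation.
# It mimics the idea of ranking pre-analyzed corpus slices against live input
# using a user-weighted mix of rhythmic, spectral, harmonic, and melodic
# features; all names and the feature layout are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: 64 audio slices, each summarized by four scalar descriptors
# standing in for rhythmic, spectral, harmonic, and melodic features.
corpus_features = rng.random((64, 4))

def best_matching_slice(live_frame, weights):
    """Index of the corpus slice closest to the live input, where the
    four feature groups are weighted by the user's influence settings."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                          # normalize the influence weights
    diff = corpus_features - live_frame   # offset from every stored slice
    distances = np.sqrt(((diff ** 2) * w).sum(axis=1))
    return int(np.argmin(distances))

# Example: a live feature frame with the harmonic influence emphasized.
live_frame = rng.random(4)
print("respond with corpus slice", best_matching_slice(live_frame, [0.2, 0.2, 0.5, 0.1]))

In the actual agent such a lookup would be only one ingredient; the behavior transitions and the negotiation interface described in the abstract are not modeled in this sketch.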

Citation:

Notto J. W. Thelle and Philippe Pasquier. 2021. Spire Muse: A Virtual Musical Partner for Creative Brainstorming. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.21428/92fbeb44.84c0b364

BibTeX Entry:

@inproceedings{NIME21_38,
 abstract = {We present Spire Muse, a co-creative musical agent that engages in different kinds of interactive behaviors. The software utilizes corpora of solo instrumental performances encoded as self-organized maps and outputs slices of the corpora as concatenated, remodeled audio sequences. Transitions between behaviors can be automated, and the interface enables the negotiation of these transitions through feedback buttons that signal approval, force reversions to previous behaviors, or request change. Musical responses are embedded in a pre-trained latent space, emergent in the interaction, and influenced through the weighting of rhythmic, spectral, harmonic, and melodic features. The training and run-time modules utilize a modified version of the MASOM agent architecture. Our model stimulates spontaneous creativity and reduces the need for the user to sustain analytical mind frames, thereby optimizing flow. The agent traverses a system autonomy axis ranging from reactive to proactive, which includes the behaviors of shadowing, mirroring, and coupling. A fourth behavior—negotiation—is emergent from the interface between agent and user. The synergy of corpora, interactive modes, and influences induces musical responses along a musical similarity axis from converging to diverging. We share preliminary observations from experiments with the agent and discuss design challenges and future prospects.},
 address = {Shanghai, China},
 articleno = {38},
 author = {Thelle, Notto J. W. and Pasquier, Philippe},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.21428/92fbeb44.84c0b364},
 issn = {2220-4806},
 month = {June},
 presentation-video = {https://youtu.be/4QMQNyoGfOs},
 title = {Spire Muse: A Virtual Musical Partner for Creative Brainstorming},
 url = {https://nime.pubpub.org/pub/wcj8sjee},
 year = {2021}
}