PlayTrainPlay
Mark Hanslip
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2024
- Location: Utrecht, Netherlands
- Track: Music
- Pages: 135–137
- Article Number: 38
- DOI: 10.5281/zenodo.15028095
Abstract
‘PlayTrainPlay’ is a structured solo human-computer improvisation in two parts. It combines instrumental improvisation with granular synthesis, machine listening and neural networks. Through its structure, PlayTrainPlay describes the process of collecting data, training a neural network and interacting with the resulting model through musical performance; its structure can also be said to represent the transition from a physical effects interface to a data-based one. These concepts are additionally communicated through the projection of live visuals consisting of code outputs, performance footage and abstract audio-reactive images. Section 1 employs a real-time granular effect whose parameters are manipulated via a foot controller. Over the course of this section, analysis of the saxophone's timbre and corresponding effect parameters are logged before the performer triggers training of the neural network. In Section 2, the neural network controls the effect, which is deployed in a loop that the performer interacts with. This work motivates further use of data model-based interfaces for instrumental effects by instrumentalists themselves. It proposes the use of data collection and modelling during performance as a means to create unique situation-based interactions. Lastly, it promotes the use of neural networks as a way of simplifying musicians' workflows.
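The two-section workflow the abstract describes (log timbre features alongside effect parameters, train a neural network, then let the model drive the effect) can be sketched in miniature. This is a hypothetical illustration only, not the paper's implementation: the feature set, parameter names, network size, and training loop are all assumptions, with a tiny NumPy MLP standing in for whatever model and machine-listening pipeline the piece actually uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Section 1 (assumed): log (timbre features, effect parameters) per frame.
# Here, 3 toy features (e.g. centroid, flatness, RMS) and 2 toy effect
# parameters (e.g. grain size, density), all normalised to [0, 1].
features = rng.random((200, 3))
params = np.clip(features @ np.array([[0.5, 0.2],
                                      [0.3, 0.6],
                                      [0.2, 0.2]]), 0, 1)  # stand-in mapping

# --- Training (assumed): one hidden layer, plain gradient descent on MSE.
W1 = rng.standard_normal((3, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 2)) * 0.5
b2 = np.zeros(2)

for _ in range(3000):
    h = np.tanh(features @ W1 + b1)          # hidden activations
    out = h @ W2 + b2                        # predicted effect parameters
    err = out - params
    gW2 = h.T @ err / len(features)          # backprop through output layer
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)         # backprop through tanh
    gW1 = features.T @ dh / len(features)
    gb1 = dh.mean(axis=0)
    for p, g in ((W2, gW2), (b2, gb2), (W1, gW1), (b1, gb1)):
        p -= 0.3 * g                         # gradient step

# --- Section 2 (assumed): the model, not the foot controller, sets the effect.
def predict(frame_features):
    h = np.tanh(frame_features @ W1 + b1)
    return np.clip(h @ W2 + b2, 0, 1)        # effect parameters in [0, 1]

print(np.abs(predict(features) - params).mean())  # training error
```

In a live setting the logged pairs would come from real-time analysis and the foot controller, and `predict` would run per analysis frame to parameterise the granular effect in the loop the performer plays against.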
Citation
Mark Hanslip. 2024. PlayTrainPlay. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.15028095
BibTeX Entry
@article{nime2024_music_38,
  abstract = {‘PlayTrainPlay’ is a structured solo human-computer improvisation in two parts. It combines instrumental improvisation with granular synthesis, machine listening and neural networks. Through its structure, PlayTrainPlay describes the process of collecting data, training a neural network and interacting with the resulting model through musical performance; its structure can also be said to represent the transition from a physical effects interface to a data-based one. These concepts are additionally communicated through the projection of live visuals consisting of code outputs, performance footage and abstract audio-reactive images. Section 1 employs a real-time granular effect whose parameters are manipulated via a foot controller. Over the course of this section, analysis of the saxophone's timbre and corresponding effect parameters are logged before the performer triggers training of the neural network. In Section 2, the neural network controls the effect, which is deployed in a loop that the performer interacts with. This work motivates further use of data model-based interfaces for instrumental effects by instrumentalists themselves. It proposes the use of data collection and modelling during performance as a means to create unique situation-based interactions. Lastly, it promotes the use of neural networks as a way of simplifying musicians' workflows.},
  address = {Utrecht, Netherlands},
  articleno = {38},
  author = {Mark Hanslip},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.15028095},
  editor = {Laurel Smith Pardue and Palle Dahlstedt},
  issn = {2220-4806},
  month = {September},
  numpages = {3},
  pages = {135--137},
  presentation-video = {},
  title = {PlayTrainPlay},
  track = {Music},
  url = {http://nime.org/proceedings/2024/nime2024_music_38.pdf},
  year = {2024}
}