Learning Machine Knit Minput Intertextiles
Victor Shepardson and Sophie Skach
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2024
- Location: Utrecht, Netherlands
- Track: Music
- Pages: 130–134
- Article Number: 37
- DOI: 10.5281/zenodo.15028093 (Link to paper and supplementary files)
- PDF: http://nime.org/proceedings/2024/nime2024_music_37.pdf
- Video: https://youtu.be/VT8Ht0lf_F4
Abstract
What lies among these loops of copper, wool and steel? Skin, yarn and wire form a circuit, machine precision meets somatic sense, and tactility elaborates algorithmic entanglements. In this performance, e-textile pieces are patched into a no-input mixer, while the player uses interactive machine learning tools to embroider sonic layers onto a repeated gesture. Analog audio signals from a no-input mixing board are routed through a piezoresistive mat which has been knitted from wool and steel yarns. The performer first develops a patch on the no-input mixing board to sound a two-handed pressing gesture on the mat. Then, they alternate between performing the gesture and refining mappings between a machine listening algorithm and additional layers of sound using the anguilla interactive machine learning package.
Citation
Victor Shepardson, and Sophie Skach. 2024. Learning Machine Knit Minput Intertextiles. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.15028093
BibTeX Entry
@inproceedings{nime2024_music_37,
  abstract = {What lies among these loops of copper, wool and steel? Skin, yarn and wire form a circuit, machine precision meets somatic sense, and tactility elaborates algorithmic entanglements. In this performance, e-textile pieces are patched into a no-input mixer, while the player uses interactive machine learning tools to embroider sonic layers onto a repeated gesture. Analog audio signals from a no-input mixing board are routed through a piezoresistive mat which has been knitted from wool and steel yarns. The performer first develops a patch on the no-input mixing board to sound a two-handed pressing gesture on the mat. Then, they alternate between performing the gesture and refining mappings between a machine listening algorithm and additional layers of sound using the anguilla interactive machine learning package.},
  address = {Utrecht, Netherlands},
  articleno = {37},
  author = {Victor Shepardson and Sophie Skach},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.15028093},
  editor = {Laurel Smith Pardue and Palle Dahlstedt},
  issn = {2220-4806},
  month = {September},
  numpages = {5},
  pages = {130--134},
  presentation-video = {https://youtu.be/VT8Ht0lf_F4},
  title = {Learning Machine Knit Minput Intertextiles},
  track = {Music},
  url = {http://nime.org/proceedings/2024/nime2024_music_37.pdf},
  year = {2024}
}