Sonicolour: Exploring Colour Control of Sound Synthesis with Interactive Machine Learning
Tug F. O'Flaherty, Luigi Marino, Charalampos Saitis, and Anna Xambó Sedó
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2025
- Location: Canberra, Australia
- Track: Paper
- Pages: 462–467
- Article Number: 67
- DOI: 10.5281/zenodo.15698928
- PDF: http://nime.org/proceedings/2025/nime2025_67.pdf
Abstract
This paper explores crossmodal mappings of colour to sound. The instrument presented analyses the colour of physical objects via a colour light-to-frequency sensor and maps the resulting red, green, and blue values to parameters of a synthesiser. Interactive machine learning is used to facilitate the discovery of new and unexpected relationships between the visual features of the objects and the sound synthesis. The instrument is evaluated on its ability to offer the user a playful interaction that couples the visual and tactile exploration of coloured objects with the generation of synthetic sounds. We conclude by outlining the potential of this approach for musical interaction design and music performance.
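The abstract describes the signal path only at a high level: an RGB reading from the light-to-frequency sensor is fed to a model trained interactively from user-supplied examples, which outputs synthesiser parameters. As a rough sketch of that workflow, and not the authors' actual implementation, the following Python snippet records a few colour/parameter example pairs, trains a small neural-network regression with scikit-learn, and then maps new readings; the ColourToSynthMapper class, the parameter layout (frequency, filter cutoff, amplitude), and all example values are hypothetical assumptions.

# Hypothetical sketch of an interactive-machine-learning colour-to-sound
# mapping: record (colour, synth-parameter) examples, train, then map live
# readings. Illustrative only; not the Sonicolour implementation.
import numpy as np
from sklearn.neural_network import MLPRegressor

class ColourToSynthMapper:
    """Regression from (r, g, b) sensor readings to synthesiser parameters."""

    def __init__(self):
        # Small multilayer perceptron; handles multi-output regression.
        self.model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                  random_state=0)
        self.colours = []  # recorded RGB readings, normalised to [0, 1]
        self.params = []   # matching synth settings chosen by ear

    def record(self, rgb, synth_params):
        """Pair the current sensor reading with hand-tuned synth settings."""
        self.colours.append(np.asarray(rgb, dtype=float) / 255.0)
        self.params.append(np.asarray(synth_params, dtype=float))

    def train(self):
        self.model.fit(np.stack(self.colours), np.stack(self.params))

    def map(self, rgb):
        """Map a new colour reading to synthesiser parameters."""
        x = np.asarray(rgb, dtype=float).reshape(1, -1) / 255.0
        return self.model.predict(x)[0]

# Example session; assumed parameter layout is (frequency Hz, cutoff, amp).
mapper = ColourToSynthMapper()
mapper.record((250, 30, 20), (440.0, 0.9, 0.8))   # red object: bright, loud
mapper.record((20, 40, 240), (110.0, 0.2, 0.5))   # blue object: low, dark
mapper.record((30, 220, 40), (220.0, 0.5, 0.6))   # green object: mid-range
mapper.train()
print(mapper.map((200, 60, 60)))  # reddish object: interpolated settings

In a live system the predicted values would drive a synthesiser rather than be printed; the appeal of this kind of mapping is that the performer, not the model, supplies the examples, so retraining remains fast enough for in-performance use.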
Citation
Tug F. O'Flaherty, Luigi Marino, Charalampos Saitis, and Anna Xambó Sedó. 2025. Sonicolour: Exploring Colour Control of Sound Synthesis with Interactive Machine Learning. In Proceedings of the International Conference on New Interfaces for Musical Expression, Canberra, Australia, 462–467. DOI: 10.5281/zenodo.15698928
BibTeX Entry
@inproceedings{nime2025_67,
  abstract = {This paper explores crossmodal mappings of colour to sound. The instrument presented analyses the colour of physical objects via a colour light-to-frequency sensor and maps the corresponding red, green, and blue data values to parameters of a synthesiser. Interactive machine learning is used to facilitate the discovery of new relationships between sound and colour. The role of interactive machine learning is to find unexpected relationships between the visual features of the objects and the sound synthesis. The performance is evaluated by its ability to provide the user with a playful interaction between the visual and tactile exploration of coloured objects, and the generation of synthetic sounds. We conclude by outlining the potential of this approach for musical interaction design and music performance.},
  address = {Canberra, Australia},
  articleno = {67},
  author = {Tug F. O'Flaherty and Luigi Marino and Charalampos Saitis and Anna Xambó Sedó},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.15698928},
  editor = {Doga Cavdir and Florent Berthaut},
  issn = {2220-4806},
  month = {June},
  numpages = {6},
  pages = {462--467},
  title = {Sonicolour: Exploring Colour Control of Sound Synthesis with Interactive Machine Learning},
  track = {Paper},
  url = {http://nime.org/proceedings/2025/nime2025_67.pdf},
  year = {2025}
}