SketchSynth: a browser-based sketching interface for sound control

Sebastian Löbbers and George Fazekas

Proceedings of the International Conference on New Interfaces for Musical Expression

  • Year: 2023
  • Location: Mexico City, Mexico
  • Track: Demos
  • Pages: 637–641
  • Article Number: 95
  • PDF: http://nime.org/proceedings/2023/nime2023_95.pdf

Abstract:

SketchSynth is an interface that allows users to create mappings between synthesised sound and a graphical sketch input based on human cross-modal perception. The project is rooted in the authors' research which collected 2692 sound-sketches from 178 participants representing their associations with various sounds. The interface extracts sketch features in real-time that were shown to correlate with sound characteristics and can be mapped to synthesis and audio effect parameters via Open Sound Control (OSC). This modular approach allows for an easy integration into an existing workflow and can be tailored to individual preferences. The interface can be accessed online through a web-browser on a computer, laptop, smartphone or tablet and does not require specialised hard- or software. We demonstrate SketchSynth with an iPad for sketch input to control synthesis and audio effect parameters in the Ableton Live digital audio workstation (DAW). A MIDI controller is used to play notes and trigger pre-recorded accompaniment. This work serves as an example of how perceptual research can help create strong, meaningful gesture-to-sound mappings.
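The abstract describes extracting sketch features in real time and sending them to synthesis parameters via Open Sound Control (OSC). The following minimal sketch, assuming two illustrative features (mean stroke speed and path length) and hypothetical OSC addresses like `/sketch/speed`, shows what such a feature-to-OSC pipeline could look like; it is not the paper's actual feature set or mapping, and the OSC encoding here covers only the single-float message case from the OSC 1.0 specification.

```python
import math
import struct

def sketch_features(points):
    """Compute two simple features from a stroke given as (x, y, t) samples.
    Mean speed and total path length are illustrative stand-ins for the
    perceptually validated features used by SketchSynth."""
    length = 0.0
    for (x0, y0, _), (x1, y1, _) in zip(points, points[1:]):
        length += math.hypot(x1 - x0, y1 - y0)
    duration = points[-1][2] - points[0][2]
    speed = length / duration if duration > 0 else 0.0
    return {"/sketch/speed": speed, "/sketch/length": length}

def osc_message(address, value):
    """Encode a single-float OSC 1.0 message: address string, then the
    type-tag string ",f", then a big-endian float32. Strings are
    NUL-terminated and padded to a 4-byte boundary."""
    def pad(s):
        b = s.encode() + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad(",f") + struct.pack(">f", value)

# A three-sample stroke: two segments of length 5 drawn over one second.
points = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.5), (6.0, 8.0, 1.0)]
features = sketch_features(points)
packets = [osc_message(addr, v) for addr, v in features.items()]
```

In practice each packet would be sent over UDP to the synthesis host (e.g. a Max for Live device in Ableton Live), where the receiver maps each address to a synthesis or effect parameter; keeping the mapping on the receiving side is what makes the modular, workflow-agnostic integration described above possible.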

Citation:

Sebastian Löbbers and George Fazekas. 2023. SketchSynth: a browser-based sketching interface for sound control. In Proceedings of the International Conference on New Interfaces for Musical Expression, Mexico City, Mexico, 637–641.

BibTeX Entry:

  @inproceedings{nime2023_95,
 abstract = {SketchSynth is an interface that allows users to create mappings between synthesised sound and a graphical sketch input based on human cross-modal perception. The project is rooted in the authors' research which collected 2692 sound-sketches from 178 participants representing their associations with various sounds. The interface extracts sketch features in real-time that were shown to correlate with sound characteristics and can be mapped to synthesis and audio effect parameters via Open Sound Control (OSC). This modular approach allows for an easy integration into an existing workflow and can be tailored to individual preferences. The interface can be accessed online through a web-browser on a computer, laptop, smartphone or tablet and does not require specialised hard- or software. We demonstrate SketchSynth with an iPad for sketch input to control synthesis and audio effect parameters in the Ableton Live digital audio workstation (DAW). A MIDI controller is used to play notes and trigger pre-recorded accompaniment. This work serves as an example of how perceptual research can help create strong, meaningful gesture-to-sound mappings.},
 address = {Mexico City, Mexico},
 articleno = {95},
 author = {Sebastian L{\"o}bbers and George Fazekas},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 editor = {Miguel Ortiz and Adnan Marquez-Borbon},
 issn = {2220-4806},
 month = {May},
 numpages = {5},
 pages = {637--641},
 title = {SketchSynth: a browser-based sketching interface for sound control},
 track = {Demos},
 url = {http://nime.org/proceedings/2023/nime2023_95.pdf},
 year = {2023}
}