colligation

James Dooley

Music Proceedings of the International Conference on New Interfaces for Musical Expression

  • Year: 2019
  • Location: Porto Alegre, Brazil
  • Pages: 15-16
  • PDF: http://www.nime.org/proceedings/2019/nime2019_music003.pdf

Abstract:

colligation (to bring or tie together) is a physical performance work for one performer that explores the idea of sculpting sound through gesture. Treating sound as if it were a tangible object capable of being fashioned into new sonic forms, "pieces" of sound are captured, shaped and sculpted by the performer's hand and arm gestures, appearing pliable as they are thrown around and transformed into new sonic material. colligation uses two Thalmic Labs Myo armbands, one placed on the left arm and the other on the right arm. The Myo Mapper [1] software transmits scaled data from the armbands to Pure Data via OSC. Positional (yaw, pitch and roll) and electromyographic (EMG) data from the devices are mapped to parameters controlling a hybrid synth created in Pure Data. The synth combines Phase Aligned Formant (PAF) synthesis [2] and Frequency Modulation (FM) synthesis [3], allowing a range of complex audio spectra to be explored. Pitch, yaw and roll data from the left Myo are mapped respectively to the PAF synth's carrier frequency (ranging from 8.175 to 12543.9 Hz), bandwidth and relative centre frequency. Pitch, yaw and roll data from the right Myo are mapped respectively to FM modulation frequency (0.01 to 10 times the PAF carrier frequency), modulation depth (0.01 to 10 times the PAF carrier frequency), and modulation wave shape (crossfading between sine, triangle, square, rising sawtooth and impulse). Data from the left and right Myos' EMG sensors are mapped respectively to amplitude control of the left and right audio channels, giving the performer control over the level and panning of the audio within the stereo field. By employing both positional and biosignal data, an embodied relationship between action and response is created; the gesture and the resulting sonic transformation become inextricably entwined.
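As a concrete illustration of the mapping described above, the following Python sketch receives a Myo Mapper OSC stream and derives the synthesis parameters in the way the abstract outlines, using the python-osc package in place of the original Pure Data patch. The OSC address names, the listening port, the per-axis message layout and the exponential pitch-to-frequency curve are illustrative assumptions, not details taken from the work.

# A minimal sketch, assuming Myo Mapper sends each scaled value (0-1) on its own
# OSC address; addresses, port and scaling curves here are illustrative
# assumptions rather than details from the piece.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

params = {"paf_carrier_hz": 440.0}  # stands in for the Pure Data synth's controls

def lin(x, lo, hi):
    """Clamp a 0-1 control value and map it linearly onto [lo, hi]."""
    return lo + max(0.0, min(1.0, x)) * (hi - lo)

# Left Myo -> PAF synth
def left_pitch(addr, x):
    # 8.175-12543.9 Hz spans MIDI notes 0-127; an exponential curve is assumed.
    params["paf_carrier_hz"] = 440.0 * 2 ** ((lin(x, 0.0, 127.0) - 69.0) / 12.0)

def left_yaw(addr, x):
    params["paf_bandwidth"] = x            # PAF bandwidth

def left_roll(addr, x):
    params["paf_centre_ratio"] = x         # relative centre frequency

# Right Myo -> FM modulator
def right_pitch(addr, x):
    params["fm_freq_ratio"] = lin(x, 0.01, 10.0)   # mod freq = ratio * carrier

def right_yaw(addr, x):
    params["fm_depth_ratio"] = lin(x, 0.01, 10.0)  # mod depth = ratio * carrier

def right_roll(addr, x):
    # Crossfade position across sine, triangle, square, rising sawtooth, impulse.
    params["fm_waveshape"] = lin(x, 0.0, 4.0)

# EMG -> per-channel amplitude (level and stereo placement)
def left_emg(addr, x):
    params["amp_left"] = x

def right_emg(addr, x):
    params["amp_right"] = x

dispatcher = Dispatcher()
for address, handler in {
    "/myo_l/pitch": left_pitch, "/myo_l/yaw": left_yaw, "/myo_l/roll": left_roll,
    "/myo_r/pitch": right_pitch, "/myo_r/yaw": right_yaw, "/myo_r/roll": right_roll,
    "/myo_l/emg": left_emg, "/myo_r/emg": right_emg,
}.items():
    dispatcher.map(address, handler)

if __name__ == "__main__":
    BlockingOSCUDPServer(("127.0.0.1", 7400), dispatcher).serve_forever()

In the work itself these parameters drive the PAF/FM synth directly inside Pure Data; the dictionary above simply stands in for that audio engine.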

Citation:

James Dooley. 2019. colligation. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Porto Alegre, Brazil, 15-16.

BibTeX Entry:

@inproceedings{nime19-music-Dooley,
 abstract = {colligation (to bring or tie together) is a physical performance work for one performer that explores the idea of sculpting sound through gesture. Treating sound as if it were a tangible object capable of being fashioned into new sonic forms, "pieces" of sound are captured, shaped and sculpted by the performer's hand and arm gestures, appearing pliable as they are thrown around and transformed into new sonic material. colligation uses two Thalmic Labs Myo armbands, one placed on the left arm and the other on the right arm. The Myo Mapper [1] software is used to transmit scaled data via OSC from the armbands to Pure Data. Positional (yaw, pitch and roll) and electromyographic data (EMG) from the devices are mapped to parameters controlling a hybrid synth created in Pure Data. The synth utilises a combination of Phase Aligned Formant synthesis [2] and Frequency Modulation synthesis [3] to allow a range of complex audio spectra to be explored. Pitch, yaw and roll data from the left Myo are respectively mapped to the PAF synth's carrier frequency (ranging from 8.175-12543.9Hz), bandwidth and relative centre frequency. Pitch, yaw and roll data from the right Myo are respectively mapped to FM modulation frequency (relative to and ranging from 0.01-10 times the PAF carrier frequency), modulation depth (relative to and ranging from 0.01-10 times the PAF carrier frequency), and modulation wave shape (crossfading between sine, triangle, square, rising sawtooth and impulse). Data from the left and right Myo's EMG sensors are mapped respectively to amplitude control of the left and right audio channels, giving the performer control over the level and panning of the audio within the stereo field. By employing both positional and bio data, an embodied relationship between action and response is created; the gesture and the resulting sonic transformation become inextricably entwined.},
 address = {Porto Alegre, Brazil},
 author = {James Dooley},
 booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression},
 editor = {Federico Visi},
 month = {June},
 pages = {15--16},
 publisher = {UFRGS},
 title = {colligation},
 url = {http://www.nime.org/proceedings/2019/nime2019_music003.pdf},
 year = {2019}
}