Sonified Motion Flow Fields as a Means of Musical Expression

Jean-Marc Pelletier

Proceedings of the International Conference on New Interfaces for Musical Expression

  • Year: 2008
  • Location: Genoa, Italy
  • Pages: 158–163
  • Keywords: Computer vision, control field, image analysis, imaging, mapping, microsound, motion flow, sonification, synthesis
  • DOI: 10.5281/zenodo.1179611
  • PDF: http://www.nime.org/proceedings/2008/nime2008_158.pdf

Abstract:

This paper describes a generalized motion-based framework for the generation of large musical control fields from imaging data. The framework is general in the sense that it does not depend on a particular source of sensing data. Real-time images of stage performers, pre-recorded and live video, as well as more exotic data from imaging systems such as thermography, pressure sensor arrays, etc. can be used as a source of control. Feature points are extracted from the candidate images, from which motion vector fields are calculated. After some processing, these motion vectors are mapped individually to sound synthesis parameters. Suitable synthesis techniques include granular and microsonic algorithms, additive synthesis and micro-polyphonic orchestration. Implementation details of this framework are discussed, as well as suitable creative and artistic uses and approaches.
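
The abstract outlines a three-stage pipeline: feature-point extraction, motion flow field estimation, and per-vector mapping to synthesis parameters. The following minimal sketch (not taken from the paper) illustrates one possible realization in Python, assuming OpenCV's Shi-Tomasi detector and pyramidal Lucas-Kanade tracker for the flow field and a purely hypothetical mapping from each motion vector to granular synthesis parameters; the paper does not prescribe these particular algorithms or mappings.

    # Hypothetical sketch: build a motion flow field from two frames and map
    # each vector to per-grain synthesis parameters. Requires opencv-python, numpy.
    import cv2
    import numpy as np

    def motion_flow_field(prev_gray, curr_gray, max_points=256):
        """Track feature points between two grayscale frames; return (origins, vectors)."""
        # Detect feature points in the previous frame (Shi-Tomasi corners).
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_points,
                                      qualityLevel=0.01, minDistance=7)
        if pts is None:
            return np.empty((0, 2)), np.empty((0, 2))
        # Track the points into the current frame with pyramidal Lucas-Kanade.
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
        good = status.ravel() == 1
        origins = pts.reshape(-1, 2)[good]
        vectors = new_pts.reshape(-1, 2)[good] - origins
        return origins, vectors

    def vectors_to_grain_params(origins, vectors, frame_width, frame_height):
        """Map each motion vector to one grain's parameters (illustrative mapping only)."""
        params = []
        for (x, y), (dx, dy) in zip(origins, vectors):
            speed = float(np.hypot(dx, dy))
            angle = float(np.arctan2(dy, dx))
            params.append({
                "amplitude": min(1.0, speed / 50.0),                 # faster motion -> louder grain
                "pan": x / frame_width,                              # horizontal position -> stereo pan
                "pitch": 200.0 + 800.0 * (1.0 - y / frame_height),   # vertical position -> pitch (Hz)
                "duration": 0.02 + 0.1 * abs(np.sin(angle)),         # direction -> grain length (s)
            })
        return params

In such a scheme, each entry in the returned list would drive one grain (or, analogously, one additive partial), so that denser or faster motion in the image produces a correspondingly denser and louder sonic texture.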

Citation:

Jean-Marc Pelletier. 2008. Sonified Motion Flow Fields as a Means of Musical Expression. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1179611

BibTeX Entry:

  @inproceedings{Pelletier2008,
 abstract = {This paper describes a generalized motion-based framework for the generation of large musical control fields from imaging data. The framework is general in the sense that it does not depend on a particular source of sensing data. Real-time images of stage performers, pre-recorded and live video, as well as more exotic data from imaging systems such as thermography, pressure sensor arrays, etc. can be used as a source of control. Feature points are extracted from the candidate images, from which motion vector fields are calculated. After some processing, these motion vectors are mapped individually to sound synthesis parameters. Suitable synthesis techniques include granular and microsonic algorithms, additive synthesis and micro-polyphonic orchestration. Implementation details of this framework are discussed, as well as suitable creative and artistic uses and approaches.},
 address = {Genoa, Italy},
 author = {Pelletier, Jean-Marc},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1179611},
 issn = {2220-4806},
 keywords = {Computer vision, control field, image analysis, imaging, mapping, microsound, motion flow, sonification, synthesis},
 pages = {158--163},
 title = {Sonified Motion Flow Fields as a Means of Musical Expression},
 url = {http://www.nime.org/proceedings/2008/nime2008_158.pdf},
 year = {2008}
}