Managing Gesture and Timbre for Analysis and Instrument Control in an Interactive Environment

William Hsu

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract

This paper describes recent enhancements to an interactive system designed to improvise with saxophonist John Butcher [1]. In addition to musical parameters such as pitch and loudness, our system is able to analyze timbral characteristics of the saxophone tone in real time, and to use timbral information to guide the generation of response material. We capture each saxophone gesture on the fly, extract a set of gestural and timbral contours, and store them in a repository. Improvising agents can consult the repository when generating responses. The gestural or timbral progression of a saxophone phrase can be remapped or transformed; this enables a variety of response material that also references audible contours of the original saxophone gestures. A single, simple framework is used both to manage gestural and timbral information extracted from analysis and to support expressive control of virtual instruments in a free improvisation context.
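
As a rough illustration of the kind of gesture repository the abstract describes, the sketch below stores per-gesture pitch, loudness, and timbre contours and lets an improvising agent remap a stored contour onto a synthesis parameter. It is a minimal Python sketch under assumed names (GestureRecord, GestureRepository, remap_contour); these are invented for illustration and are not taken from the paper or its implementation.

import time
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical sketch of a gesture/contour repository; names and structure
# are assumptions for illustration, not the author's implementation.

@dataclass
class GestureRecord:
    """One captured saxophone gesture with its extracted contours."""
    start_time: float
    duration: float
    pitch_contour: List[float]      # e.g. estimated pitch per analysis frame
    loudness_contour: List[float]   # e.g. loudness estimate per frame
    timbre_contour: List[float]     # e.g. a noisiness or brightness measure per frame

class GestureRepository:
    """Stores recent gestures so improvising agents can consult them."""
    def __init__(self, capacity: int = 32):
        self.capacity = capacity
        self.records: List[GestureRecord] = []

    def add(self, record: GestureRecord) -> None:
        self.records.append(record)
        if len(self.records) > self.capacity:
            self.records.pop(0)     # drop the oldest stored gesture

    def latest(self) -> GestureRecord:
        return self.records[-1]

def remap_contour(contour: List[float],
                  transform: Callable[[float], float]) -> List[float]:
    """Apply a pointwise transformation (e.g. scaling, inversion) to a contour."""
    return [transform(x) for x in contour]

# Example: an agent takes the timbral contour of the most recent gesture and
# remaps it onto a control range for a virtual-instrument parameter.
if __name__ == "__main__":
    repo = GestureRepository()
    repo.add(GestureRecord(start_time=time.time(), duration=2.5,
                           pitch_contour=[220.0, 233.1, 246.9],
                           loudness_contour=[0.3, 0.6, 0.4],
                           timbre_contour=[0.1, 0.8, 0.5]))
    brightness = remap_contour(repo.latest().timbre_contour,
                               lambda x: 0.2 + 0.8 * x)  # map [0,1] to [0.2,1.0]
    print(brightness)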

Citation

William Hsu. 2006. Managing Gesture and Timbre for Analysis and Instrument Control in an Interactive Environment. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1176927

BibTeX Entry

@inproceedings{Hsu2006,
 abstract = {This paper describes recent enhancements to an interactive system designed to improvise with saxophonist John Butcher [1]. In addition to musical parameters such as pitch and loudness, our system is able to analyze timbral characteristics of the saxophone tone in real time, and to use timbral information to guide the generation of response material. We capture each saxophone gesture on the fly, extract a set of gestural and timbral contours, and store them in a repository. Improvising agents can consult the repository when generating responses. The gestural or timbral progression of a saxophone phrase can be remapped or transformed; this enables a variety of response material that also references audible contours of the original saxophone gestures. A single, simple framework is used both to manage gestural and timbral information extracted from analysis and to support expressive control of virtual instruments in a free improvisation context.},
 address = {Paris, France},
 author = {Hsu, William},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1176927},
 issn = {2220-4806},
 keywords = {Interactive music systems, timbre analysis, instrument control},
 pages = {376--379},
 title = {Managing Gesture and Timbre for Analysis and Instrument Control in an Interactive Environment},
 url = {http://www.nime.org/proceedings/2006/nime2006_376.pdf},
 year = {2006}
}