A Framework for Coordination and Synchronization of Media

Dawen Liang, Guangyu Xia, and Roger B. Dannenberg

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

Computer music systems that coordinate or interact with human musicians exist in many forms. Often, coordination is at the level of gestures and phrases without synchronization at the beat level (or perhaps the notion of "beat" does not even exist). In music with beats, fine-grain synchronization can be achieved by having humans adapt to the computer (e.g. following a click track), or by computer accompaniment in which the computer follows a predetermined score. We consider an alternative scenario in which improvisation prevents traditional score following, but where synchronization is achieved at the level of beats, measures, and cues. To explore this new type of human-computer interaction, we have created new software abstractions for synchronization and coordination of music and interfaces in different modalities. We describe these new software structures, present examples, and introduce the idea of music notation as an interactive musical interface rather than a static document.
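
To make the beat-level coordination described in the abstract concrete, here is a minimal, hypothetical sketch in Python (not the paper's actual software abstractions or API): media actions are registered against beat positions and dispatched when an external beat source, such as a human tap, foot pedal, or beat tracker, reports the current beat.

    class BeatScheduler:
        """Hypothetical illustration of beat-level coordination: callbacks
        are queued against beat numbers and fired as beats are reported."""

        def __init__(self):
            self._pending = []  # list of (beat, callback) pairs

        def at_beat(self, beat, callback):
            # Schedule callback for the first reported beat >= beat.
            self._pending.append((beat, callback))

        def report_beat(self, current_beat):
            # Called by the beat source (e.g. a tapped pedal or beat tracker).
            due = [e for e in self._pending if e[0] <= current_beat]
            self._pending = [e for e in self._pending if e[0] > current_beat]
            for _, callback in sorted(due, key=lambda e: e[0]):
                callback()

    # Example: cue an accompaniment part and advance a notation display.
    scheduler = BeatScheduler()
    scheduler.at_beat(16, lambda: print("cue: start accompaniment"))
    scheduler.at_beat(17, lambda: print("advance notation display"))
    for beat in range(1, 20):  # stand-in for live beat input
        scheduler.report_beat(beat)

This sketch only illustrates the idea of synchronizing actions to beats and cues reported by a human performer; the framework described in the paper covers coordination across multiple media and interface modalities.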

Citation:

Dawen Liang, Guangyu Xia, and Roger B. Dannenberg. 2011. A Framework for Coordination and Synchronization of Media. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178091

BibTeX Entry:

@inproceedings{Liang2011,
  abstract = {Computer music systems that coordinate or interact with human musicians exist in many forms. Often, coordination is at the level of gestures and phrases without synchronization at the beat level (or perhaps the notion of "beat" does not even exist). In music with beats, fine-grain synchronization can be achieved by having humans adapt to the computer (e.g. following a click track), or by computer accompaniment in which the computer follows a predetermined score. We consider an alternative scenario in which improvisation prevents traditional score following, but where synchronization is achieved at the level of beats, measures, and cues. To explore this new type of human-computer interaction, we have created new software abstractions for synchronization and coordination of music and interfaces in different modalities. We describe these new software structures, present examples, and introduce the idea of music notation as an interactive musical interface rather than a static document.},
  address = {Oslo, Norway},
  author = {Liang, Dawen and Xia, Guangyu and Dannenberg, Roger B.},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178091},
  issn = {2220-4806},
  keywords = {automatic accompaniment,interactive,music display,popular music,real-time,synchronization},
  pages = {167--172},
  presentation-video = {https://vimeo.com/26832515/},
  title = {A Framework for Coordination and Synchronization of Media},
  url = {http://www.nime.org/proceedings/2011/nime2011_167.pdf},
  year = {2011}
}