conga: A Framework for Adaptive Conducting Gesture Analysis

Eric Lee, Ingo Grüll, Henning Keil, and Jan Borchers

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

Designing a conducting gesture analysis system for public spaces poses unique challenges. We present conga, a software framework that enables automatic recognition and interpretation of conducting gestures. conga is able to recognize multiple types of gestures with varying levels of difficulty for the user to perform, from a standard four-beat pattern, to simplified up-down conducting movements, to no pattern at all. conga provides an extendable library of feature detectors linked together into a directed acyclic graph; these graphs represent the various conducting patterns as gesture profiles. At run-time, conga searches for the best profile to match a user's gestures in real-time, and uses a beat prediction algorithm to provide results at the sub-beat level, in addition to output values such as tempo, gesture size, and the gesture's geometric center. Unlike some previous approaches, conga does not need to be trained with sample data before use. Our preliminary user tests show that conga has a beat recognition rate of over 90%. conga is deployed as the gesture recognition system for Maestro!, an interactive conducting exhibit that opened in the Betty Brinn Children's Museum in Milwaukee, USA in March 2006.
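
The following sketch is not from the paper; it is a hypothetical illustration (in Python, with invented names such as Profile, Node, downbeat, and best_profile) of how conducting patterns could be modeled as gesture profiles, each a directed acyclic graph of feature detectors, with a simple best-match search over profiles at run-time:

 # Hypothetical sketch, not the authors' implementation.
 from dataclasses import dataclass, field
 from typing import Callable, Dict, List, Tuple

 # A feature detector consumes the 2-D baton trajectory plus the outputs of
 # upstream detectors, and returns a confidence value in [0, 1].
 Detector = Callable[[List[Tuple[float, float]], Dict[str, float]], float]

 @dataclass
 class Node:
     name: str
     detector: Detector
     inputs: List[str] = field(default_factory=list)  # names of upstream nodes

 @dataclass
 class Profile:
     """One conducting pattern (e.g. four-beat, up-down, free) as a detector DAG."""
     name: str
     nodes: List[Node]  # assumed to be listed in topological order

     def score(self, trajectory: List[Tuple[float, float]]) -> float:
         """Evaluate all detectors in order and return an overall match confidence."""
         outputs: Dict[str, float] = {}
         for node in self.nodes:
             outputs[node.name] = node.detector(trajectory, outputs)
         # Toy aggregation: overall confidence is the mean of all detector outputs.
         return sum(outputs.values()) / len(outputs)

 def downbeat(traj, upstream):
     """Toy detector: confidence that the most recent movement is downward."""
     if len(traj) < 2:
         return 0.0
     return 1.0 if traj[-1][1] < traj[-2][1] else 0.0

 def upbeat(traj, upstream):
     """Toy detector: confidence that the most recent movement is upward."""
     if len(traj) < 2:
         return 0.0
     return 1.0 if traj[-1][1] > traj[-2][1] else 0.0

 # Two deliberately simplified profiles; a real system would use much richer
 # detectors (turning points, velocity zero-crossings, beat phase, ...).
 PROFILES = [
     Profile("up-down", [Node("down", downbeat), Node("up", upbeat, ["down"])]),
     Profile("free", [Node("motion", lambda t, u: 1.0 if len(t) > 1 else 0.0)]),
 ]

 def best_profile(trajectory):
     """Pick the profile whose detector DAG best matches the observed gesture."""
     return max(PROFILES, key=lambda p: p.score(trajectory))

 if __name__ == "__main__":
     sample = [(0.0, 1.0), (0.1, 0.5), (0.2, 0.9)]  # fake baton positions (x, y)
     print(best_profile(sample).name)

In this sketch the profile search simply re-scores every profile on the current trajectory; the paper's system additionally predicts beats at the sub-beat level and reports tempo, gesture size, and geometric center, which are omitted here.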

Citation:

Eric Lee, Ingo Grüll, Henning Keil, and Jan Borchers. 2006. conga: A Framework for Adaptive Conducting Gesture Analysis. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1176957

BibTeX Entry:

@inproceedings{Lee2006,
 abstract = {Designing a conducting gesture analysis system for public spaces poses unique challenges. We present conga, a software framework that enables automatic recognition and interpretation of conducting gestures. conga is able to recognize multiple types of gestures with varying levels of difficulty for the user to perform, from a standard four-beat pattern, to simplified up-down conducting movements, to no pattern at all. conga provides an extendable library of feature detectors linked together into a directed acyclic graph; these graphs represent the various conducting patterns as gesture profiles. At run-time, conga searches for the best profile to match a user's gestures in real-time, and uses a beat prediction algorithm to provide results at the sub-beat level, in addition to output values such as tempo, gesture size, and the gesture's geometric center. Unlike some previous approaches, conga does not need to be trained with sample data before use. Our preliminary user tests show that conga has a beat recognition rate of over 90%. conga is deployed as the gesture recognition system for Maestro!, an interactive conducting exhibit that opened in the Betty Brinn Children's Museum in Milwaukee, USA in March 2006.},
 address = {Paris, France},
 author = {Lee, Eric and Gr\"{u}ll, Ingo and Keil, Henning and Borchers, Jan},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1176957},
 issn = {2220-4806},
 keywords = {gesture recognition, conducting, software gesture frameworks},
 pages = {260--265},
 title = {conga: A Framework for Adaptive Conducting Gesture Analysis},
 url = {http://www.nime.org/proceedings/2006/nime2006_260.pdf},
 year = {2006}
}