A Machine Learning Toolbox For Musician Computer Interaction
Nicholas Gillian, Benjamin Knapp, and Sile O'Modhrain
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2011
- Location: Oslo, Norway
- Pages: 343–348
- Keywords: Machine learning, gesture recognition, musician-computer interaction, SEC
- DOI: 10.5281/zenodo.1178031
- PDF: http://www.nime.org/proceedings/2011/nime2011_343.pdf
- Presentation video: https://vimeo.com/26872843/
Abstract
This paper presents the SARC EyesWeb Catalog (SEC), a machine learning toolbox that has been specifically developed for musician-computer interaction. The SEC features a large number of machine learning algorithms that can be used in real-time to recognise static postures, perform regression and classify multivariate temporal gestures. The algorithms within the toolbox have been designed to work with any N-dimensional signal and can be quickly trained with a small number of training examples. We also provide the motivation for designing the recognition algorithms for musical gestures to achieve a low intra-personal generalisation error, as opposed to the inter-personal generalisation error that is more commonly targeted in other areas of human-computer interaction.
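To make the workflow described above concrete, here is a minimal, hypothetical sketch of training a static-posture classifier from a handful of one performer's examples (the intra-personal setting the abstract emphasises) and classifying incoming N-dimensional sensor frames. This is not the SEC's API (the SEC is a catalog of EyesWeb blocks); the nearest-centroid classifier and all names below are illustrative stand-ins only.

```python
# Illustrative only: a generic few-example posture classifier, NOT the SEC API.
import numpy as np

class PostureClassifier:
    """Nearest-centroid classifier over N-dimensional sensor frames."""

    def __init__(self):
        self.centroids = {}  # posture label -> mean feature vector

    def train(self, examples):
        """examples: dict mapping posture label -> list of N-dim frames."""
        for label, frames in examples.items():
            self.centroids[label] = np.mean(np.asarray(frames, dtype=float), axis=0)

    def classify(self, frame):
        """Return the label whose centroid is closest to the incoming frame."""
        frame = np.asarray(frame, dtype=float)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(frame - self.centroids[label]))

# Intra-personal use: a few examples recorded from a single performer.
examples = {
    "open_hand":   [[0.9, 0.8, 0.9], [1.0, 0.9, 0.8]],
    "closed_fist": [[0.1, 0.2, 0.1], [0.2, 0.1, 0.2]],
}
clf = PostureClassifier()
clf.train(examples)
print(clf.classify([0.85, 0.9, 0.95]))  # -> "open_hand"
```

Training on one performer's own examples keeps the required training set small, which is the trade-off the abstract motivates: low intra-personal generalisation error rather than generalisation across performers.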
Citation
Nicholas Gillian, Benjamin Knapp, and Sile O'Modhrain. 2011. A Machine Learning Toolbox For Musician Computer Interaction. In Proceedings of the International Conference on New Interfaces for Musical Expression, Oslo, Norway, 343–348. DOI: 10.5281/zenodo.1178031
BibTeX Entry
@inproceedings{Gillian2011a,
  abstract = {This paper presents the SARC EyesWeb Catalog (SEC), a machine learning toolbox that has been specifically developed for musician-computer interaction. The SEC features a large number of machine learning algorithms that can be used in real-time to recognise static postures, perform regression and classify multivariate temporal gestures. The algorithms within the toolbox have been designed to work with any N-dimensional signal and can be quickly trained with a small number of training examples. We also provide the motivation for designing the recognition algorithms for musical gestures to achieve a low intra-personal generalisation error, as opposed to the inter-personal generalisation error that is more commonly targeted in other areas of human-computer interaction.},
  address = {Oslo, Norway},
  author = {Gillian, Nicholas and Knapp, Benjamin and O'Modhrain, Sile},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178031},
  issn = {2220-4806},
  keywords = {Machine learning, gesture recognition, musician-computer interaction, SEC},
  pages = {343--348},
  presentation-video = {https://vimeo.com/26872843/},
  title = {A Machine Learning Toolbox For Musician Computer Interaction},
  url = {http://www.nime.org/proceedings/2011/nime2011_343.pdf},
  year = {2011}
}