Creating Musical Expression using Kinect

Min-Joon Yoo, Jin-Wook Beak, and In-Kwon Lee

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

Recently, Microsoft introduced a game interface called Kinect for the Xbox 360 video game platform. This interface enables users to control and interact with the game console without touching a controller, greatly increasing their freedom to express emotion. In this paper, we first describe the system we developed to use this interface for sound generation and for controlling musical expression. Skeleton data are extracted from the users' motions and translated into pre-defined MIDI data, which we then use to control several applications. To perform this translation, we implemented a simple Kinect-to-MIDI data converter, which is introduced in this paper. We describe two applications that make music with Kinect: first generating sound with Max/MSP, and then controlling ad-lib generation with our own ad-lib generating system through the users' body movements.
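The paper does not reproduce the exact mapping its Kinect-to-MIDI converter uses, but the core idea it describes (a skeleton joint position translated into a pre-defined MIDI message) can be sketched as a small pure function. The function name, the joint's coordinate range, and the linear mapping below are all illustrative assumptions, not the authors' implementation:

```python
def joint_to_midi_cc(y, y_min=-1.0, y_max=1.0):
    """Map a skeleton joint's vertical position (camera-space metres,
    an assumed range) to a 7-bit MIDI control-change value in [0, 127].

    This is a hypothetical sketch of the Kinect-to-MIDI translation
    step, not the converter described in the paper.
    """
    # Clamp to the expected tracking range so out-of-range joints
    # still produce a valid MIDI value.
    y = max(y_min, min(y_max, y))
    # Scale linearly into the 7-bit MIDI value range.
    return round((y - y_min) / (y_max - y_min) * 127)

# Example: a hand raised to the top of the range reaches full scale.
print(joint_to_midi_cc(1.0))   # full scale
print(joint_to_midi_cc(-1.0))  # bottom of range
```

In practice, a value like this would be wrapped in a MIDI control-change message and sent to a receiver such as Max/MSP, which then maps it to a synthesis parameter.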

Citation:

Min-Joon Yoo, Jin-Wook Beak, and In-Kwon Lee. 2011. Creating Musical Expression using Kinect. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178193

BibTeX Entry:

@inproceedings{Yoo2011,
  abstract = {Recently, Microsoft introduced a game interface called Kinect for the Xbox 360 video game platform. This interface enables users to control and interact with the game console without touching a controller, greatly increasing their freedom to express emotion. In this paper, we first describe the system we developed to use this interface for sound generation and for controlling musical expression. Skeleton data are extracted from the users' motions and translated into pre-defined MIDI data, which we then use to control several applications. To perform this translation, we implemented a simple Kinect-to-MIDI data converter, which is introduced in this paper. We describe two applications that make music with Kinect: first generating sound with Max/MSP, and then controlling ad-lib generation with our own ad-lib generating system through the users' body movements.},
  address = {Oslo, Norway},
  author = {Yoo, Min-Joon and Beak, Jin-Wook and Lee, In-Kwon},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178193},
  issn = {2220-4806},
  keywords = {Kinect, gaming interface, sound generation, adlib generation},
  pages = {324--325},
  title = {Creating Musical Expression using Kinect},
  url = {http://www.nime.org/proceedings/2011/nime2011_324.pdf},
  year = {2011}
}