EyeMusic: Performing Live Music and Multimedia Compositions with Eye Movements
Anthony J. Hornof, Troy Rogers, and Tim Halverson
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2007
- Location: New York City, NY, United States
- Pages: 299–300
- Keywords: H.5.2 [Information Interfaces and Presentation] User Interfaces --- input devices and strategies, interaction styles. J.5 [Arts and Humanities] Fine arts, performing arts.
- DOI: 10.5281/zenodo.1177121
- PDF: http://www.nime.org/proceedings/2007/nime2007_299.pdf
Abstract:
In this project, eye tracking researchers and computer music composers collaborate to create musical compositions that are played with the eyes. A commercial eye tracker (LC Technologies Eyegaze) is connected to a music and multimedia authoring environment (Max/MSP/Jitter). The project addresses issues of both noise and control: How will the performance benefit from the noise inherent in eye trackers and eye movements, and to what extent should the composition encourage the performer to try to control a specific musical outcome? Providing one set of answers to these two questions, the authors create an eye-controlled composition, EyeMusic v1.0, which was selected by juries for live performance at computer music conferences.
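
The paper does not include implementation code, but as a rough illustration of the data path the abstract describes (eye-tracker samples feeding a Max/MSP/Jitter patch), the sketch below forwards gaze coordinates to Max over OSC from Python. The transport, port, OSC address, sampling rate, and smoothing factor are assumptions for illustration only; the paper does not specify how the Eyegaze data reaches Max/MSP/Jitter.

```python
# Hypothetical sketch: stream noisy gaze samples to a Max/MSP patch via OSC.
# Everything tracker-specific here is a stand-in, not the authors' actual setup.
import random
import time
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

MAX_HOST, MAX_PORT = "127.0.0.1", 7400   # assumed [udpreceive 7400] object in the Max patch
client = SimpleUDPClient(MAX_HOST, MAX_PORT)

def read_gaze_sample():
    """Stand-in for a real eye-tracker read; returns a noisy (x, y) in normalized screen coords."""
    return 0.5 + random.gauss(0, 0.02), 0.5 + random.gauss(0, 0.02)

alpha = 0.2          # exponential-smoothing factor: lower = smoother but laggier gaze
sx, sy = 0.5, 0.5    # smoothed gaze state

while True:
    x, y = read_gaze_sample()
    sx += alpha * (x - sx)                    # tame some tracker noise, keep the rest expressive
    sy += alpha * (y - sy)
    client.send_message("/gaze", [sx, sy])    # the Max patch would map these to sound parameters
    time.sleep(1 / 60)                        # eye trackers commonly sample at ~60 Hz
```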
Citation:
Anthony J. Hornof, Troy Rogers, and Tim Halverson. 2007. EyeMusic: Performing Live Music and Multimedia Compositions with Eye Movements. Proceedings of the International Conference on New Interfaces for Musical Expression, New York City, NY, United States, 299–300. DOI: 10.5281/zenodo.1177121

BibTeX Entry:
@inproceedings{Hornof2007,
  abstract = {In this project, eye tracking researchers and computer music composers collaborate to create musical compositions that are played with the eyes. A commercial eye tracker (LC Technologies Eyegaze) is connected to a music and multimedia authoring environment (Max/MSP/Jitter). The project addresses issues of both noise and control: How will the performance benefit from the noise inherent in eye trackers and eye movements, and to what extent should the composition encourage the performer to try to control a specific musical outcome? Providing one set of answers to these two questions, the authors create an eye-controlled composition, EyeMusic v1.0, which was selected by juries for live performance at computer music conferences.},
  address = {New York City, NY, United States},
  author = {Hornof, Anthony J. and Rogers, Troy and Halverson, Tim},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1177121},
  issn = {2220-4806},
  keywords = {H.5.2 [Information Interfaces and Presentation] User Interfaces --- input devices and strategies, interaction styles. J.5 [Arts and Humanities] Fine arts, performing arts.},
  pages = {299--300},
  title = {EyeMusic: Performing Live Music and Multimedia Compositions with Eye Movements},
  url = {http://www.nime.org/proceedings/2007/nime2007_299.pdf},
  year = {2007}
}