Hand and Finger Motion-Controlled Audio Mixing Interface

Jarrod Ratcliffe

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

This paper presents a control surface interface for music mixing using real-time computer vision. Two input sensors are considered: the Leap Motion and the Microsoft Kinect. The author presents significant design considerations, including improving the user's sense of depth and panorama, maintaining broad accessibility by integrating the system with Digital Audio Workstation (DAW) software, and implementing a system that is portable and affordable. To provide the user with a heightened sense of sound spatialization over the traditional channel strip, the concept of depth is addressed directly using the stage metaphor. Sound sources are represented as colored spheres in a graphical user interface to provide the user with visual feedback. Moving sources forward and backward controls volume, while moving them left and right controls panning. To provide broader accessibility, the interface is configured to control mixing within the Ableton Live DAW. The author also discusses future plans to expand functionality and evaluate the system.
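The stage metaphor described above maps a source's position on a virtual stage to mix parameters: depth controls volume, horizontal position controls panning. A minimal sketch of such a mapping is shown below; the function name, coordinate ranges, and linear scaling are illustrative assumptions, not taken from the paper.

```python
def stage_to_mix(x, z, stage_width=1.0, stage_depth=1.0):
    """Hypothetical stage-metaphor mapping (not the paper's implementation).

    x: horizontal position in [0, stage_width], 0 = stage left.
    z: depth in [0, stage_depth], 0 = front of stage.
    Returns (volume, pan): volume in [0.0, 1.0], pan in [-1.0, 1.0].
    """
    # Sources nearer the front of the stage are louder.
    volume = 1.0 - (z / stage_depth)
    # The centre of the stage maps to centre pan (0.0).
    pan = 2.0 * (x / stage_width) - 1.0
    return volume, pan
```

In practice these values would be sent to the DAW (e.g. via MIDI or OSC) as the tracked hand moves a source's sphere around the virtual stage.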

Citation:

Jarrod Ratcliffe. 2014. Hand and Finger Motion-Controlled Audio Mixing Interface. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178911

BibTeX Entry:

@inproceedings{jratcliffe2014,
 abstract = {This paper presents a control surface interface for music mixing using real-time computer vision. Two input sensors are considered: the Leap Motion and the Microsoft Kinect. The author presents significant design considerations, including improving the user's sense of depth and panorama, maintaining broad accessibility by integrating the system with Digital Audio Workstation (DAW) software, and implementing a system that is portable and affordable. To provide the user with a heightened sense of sound spatialization over the traditional channel strip, the concept of depth is addressed directly using the stage metaphor. Sound sources are represented as colored spheres in a graphical user interface to provide the user with visual feedback. Moving sources forward and backward controls volume, while moving them left and right controls panning. To provide broader accessibility, the interface is configured to control mixing within the Ableton Live DAW. The author also discusses future plans to expand functionality and evaluate the system.},
 address = {London, United Kingdom},
 author = {Jarrod Ratcliffe},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1178911},
 issn = {2220-4806},
 month = {June},
 pages = {136--139},
 publisher = {Goldsmiths, University of London},
 title = {Hand and Finger Motion-Controlled Audio Mixing Interface},
 url = {http://www.nime.org/proceedings/2014/nime2014_518.pdf},
 year = {2014}
}