eMersion | Sensor-controlled Electronic Music Modules & Digital Data Workstation

Chet Udell and James Paul Sain

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

In our current era, where smartphones are commonplace and buzzwords like "the internet of things," "wearable tech," and "augmented reality" are ubiquitous, translating performance gestures into data and intuitively mapping it to control musical/visual parameters in the realm of computing should be trivial; but it isn't. Technical barriers still persist that limit this activity to exclusive groups capable of learning skillsets far removed from one's musical craft. These skills include programming, soldering, microprocessors, wireless protocols, and circuit design. Those of us whose creative activity is centered in NIME have to become polyglots of many disciplines to achieve our work. In the NIME community, it's unclear that we should even draw distinctions between 'artist' and 'technician', because these skillsets have become integral to our creative practice. However, what about the vast communities of musicians, composers, and artists who want to leverage sensing to take their craft into new territory with no background in circuits, soldering, embedded programming, and sensor function? eMersion, a plug-and-play, modular, wireless alternative solution for creating NIMEs, will be presented. It enables one to bypass the technical hurdles listed above in favor of immediate experimentation with musical practice and wireless sensing. A unique software architecture, the Digital Data Workstation, will also be unveiled that enables one to quickly and intuitively process and map unpredictable numbers and types of wireless data streams.

Citation:

Chet Udell and James Paul Sain. 2014. eMersion | Sensor-controlled Electronic Music Modules & Digital Data Workstation. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178971

BibTeX Entry:

@inproceedings{cudell2014,
 abstract = {In our current era, where smartphones are commonplace and buzzwords like ``the internet of things,'' ``wearable tech,'' and ``augmented reality'' are ubiquitous, translating performance gestures into data and intuitively mapping it to control musical/visual parameters in the realm of computing should be trivial; but it isn't. Technical barriers still persist that limit this activity to exclusive groups capable of learning skillsets far removed from one's musical craft. These skills include programming, soldering, microprocessors, wireless protocols, and circuit design. Those of us whose creative activity is centered in NIME have to become polyglots of many disciplines to achieve our work. In the NIME community, it's unclear that we should even draw distinctions between 'artist' and 'technician', because these skillsets have become integral to our creative practice. However, what about the vast communities of musicians, composers, and artists who want to leverage sensing to take their craft into new territory with no background in circuits, soldering, embedded programming, and sensor function? eMersion, a plug-and-play, modular, wireless alternative solution for creating NIMEs will be presented. It enables one to bypass the technical hurdles listed above in favor of immediate experimentation with musical practice and wireless sensing. A unique software architecture will also be unveiled that enables one to quickly and intuitively process and map unpredictable numbers and types of wireless data streams, the Digital Data Workstation.},
 address = {London, United Kingdom},
 author = {Chet Udell and James Paul Sain},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1178971},
 issn = {2220-4806},
 month = {June},
 pages = {130--133},
 publisher = {Goldsmiths, University of London},
 title = {eMersion | Sensor-controlled Electronic Music Modules \& Digital Data Workstation},
 url = {http://www.nime.org/proceedings/2014/nime2014_272.pdf},
 year = {2014}
}