Comparative Latency Analysis of Optical and Inertial Motion Capture Systems for Gestural Analysis and Musical Performance
Geise Santos, Johnty Wang, Carolina Brum, Marcelo M. Wanderley, Tiago Tavares, and Anderson Rocha
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2021
- Location: Shanghai, China
- Article Number: 51
- DOI: 10.21428/92fbeb44.51b1c3a1
- Paper URL: https://nime.pubpub.org/pub/wmcqkvw1
- Presentation Video: https://youtu.be/a1TVvr9F7hE
Abstract:
Wireless sensor-based technologies are becoming increasingly accessible and widely explored in interactive musical performance due to their ubiquity and low cost, which brings the necessity of understanding the capabilities and limitations of these sensors. This is usually approached by using a reference system, such as an optical motion capture system, to assess the signals' properties. However, this process raises the issue of synchronizing the signal and the reference data streams, as each sensor is subject to different latencies, time drift, reference clocks and initialization timings. This paper presents an empirical quantification of the latency of the communication stages in a setup consisting of a Qualisys optical motion capture (mocap) system and a wireless microcontroller-based sensor device. We performed event-to-end tests on the critical components of the hybrid setup to determine the synchronization suitability. Overall, further synchronization is viable because of the similar individual average latencies of around 25 ms for both the mocap system and the wireless sensor interface.
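The synchronization problem the abstract describes, aligning two data streams that observe the same event but arrive with different latencies, is commonly attacked by estimating the relative lag between the streams. The sketch below is not from the paper; it is a minimal illustration, assuming both streams are resampled to a common rate, of estimating that lag with a cross-correlation peak (all function and variable names are hypothetical):

```python
import numpy as np

def estimate_lag(reference, signal, rate_hz):
    """Estimate the delay (in seconds) of `signal` relative to
    `reference` by locating the peak of their full cross-correlation."""
    ref = reference - np.mean(reference)
    sig = signal - np.mean(signal)
    corr = np.correlate(sig, ref, mode="full")
    # Re-center the peak index so that 0 corresponds to aligned streams.
    lag_samples = np.argmax(corr) - (len(ref) - 1)
    return lag_samples / rate_hz

# Synthetic demo: the same event (a Gaussian pulse) seen by a reference
# stream and by a second stream delayed by 25 samples at 1 kHz,
# i.e. a 25 ms latency difference like the averages reported above.
rate = 1000
t = np.arange(1000)
mocap = np.exp(-0.5 * ((t - 400) / 10.0) ** 2)
wireless = np.exp(-0.5 * ((t - 425) / 10.0) ** 2)
print(estimate_lag(mocap, wireless, rate))  # → 0.025
```

In a real hybrid setup the estimated offset would only be a first step, since the abstract also mentions clock drift, which a single constant lag does not capture.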
Citation:
Geise Santos, Johnty Wang, Carolina Brum, Marcelo M. Wanderley, Tiago Tavares, and Anderson Rocha. 2021. Comparative Latency Analysis of Optical and Inertial Motion Capture Systems for Gestural Analysis and Musical Performance. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.21428/92fbeb44.51b1c3a1
BibTeX Entry:
@inproceedings{NIME21_51,
  abstract = {Wireless sensor-based technologies are becoming increasingly accessible and widely explored in interactive musical performance due to their ubiquity and low-cost, which brings the necessity of understanding the capabilities and limitations of these sensors. This is usually approached by using a reference system, such as an optical motion capture system, to assess the signals' properties. However, this process raises the issue of synchronizing the signal and the reference data streams, as each sensor is subject to different latency, time drift, reference clocks and initialization timings. This paper presents an empirical quantification of the latency communication stages in a setup consisting of a Qualisys optical motion capture (mocap) system and a wireless microcontroller-based sensor device. We performed event-to-end tests on the critical components of the hybrid setup to determine the synchronization suitability. Overall, further synchronization is viable because of the near individual average latencies of around 25ms for both the mocap system and the wireless sensor interface.},
  address = {Shanghai, China},
  articleno = {51},
  author = {Santos, Geise and Wang, Johnty and Brum, Carolina and Wanderley, Marcelo M. and Tavares, Tiago and Rocha, Anderson},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.21428/92fbeb44.51b1c3a1},
  issn = {2220-4806},
  month = {June},
  presentation-video = {https://youtu.be/a1TVvr9F7hE},
  title = {Comparative Latency Analysis of Optical and Inertial Motion Capture Systems for Gestural Analysis and Musical Performance},
  url = {https://nime.pubpub.org/pub/wmcqkvw1},
  year = {2021}
}