Adaptive Multimodal Music Learning via Interactive Haptic Instrument

Yian Zhang, Yinmiao Li, Daniel Chin, and Gus Xia

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

Haptic interfaces have tapped into the sense of touch to assist multimodal music learning. Recent interface designs have improved tactile feedback and force guidance, aiming to make instrument learning more effective. However, most interfaces remain static: they cannot yet sense a student’s learning progress and adjust the tutoring strategy accordingly. To solve this problem, we contribute an adaptive haptic interface based on the latest haptic flute design. We first adopted a clutch mechanism that lets the interface turn haptic control on and off flexibly in real time. The interactive tutor can then follow a human performance and apply the “teacher force” only when instructed by the software. Finally, we combined the adaptive interface with a step-by-step dynamic learning strategy. Experimental results showed that dynamic learning dramatically outperforms static learning, boosting the learning rate by 45.3% and reducing the chance of forgetting by 86%.

Citation:

Yian Zhang, Yinmiao Li, Daniel Chin, and Gus Xia. 2019. Adaptive Multimodal Music Learning via Interactive Haptic Instrument. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.3672900

BibTeX Entry:

@inproceedings{Zhang2019,
 abstract = {Haptic interfaces have tapped into the sense of touch to assist multimodal music learning. Recent interface designs have improved tactile feedback and force guidance, aiming to make instrument learning more effective. However, most interfaces remain static: they cannot yet sense a student’s learning progress and adjust the tutoring strategy accordingly. To solve this problem, we contribute an adaptive haptic interface based on the latest haptic flute design. We first adopted a clutch mechanism that lets the interface turn haptic control on and off flexibly in real time. The interactive tutor can then follow a human performance and apply the “teacher force” only when instructed by the software. Finally, we combined the adaptive interface with a step-by-step dynamic learning strategy. Experimental results showed that dynamic learning dramatically outperforms static learning, boosting the learning rate by 45.3% and reducing the chance of forgetting by 86%.},
 address = {Porto Alegre, Brazil},
 author = {Yian Zhang and Yinmiao Li and Daniel Chin and Gus Xia},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.3672900},
 editor = {Marcelo Queiroz and Anna Xambó Sedó},
 issn = {2220-4806},
 month = {June},
 pages = {140--145},
 publisher = {UFRGS},
 title = {Adaptive Multimodal Music Learning via Interactive Haptic Instrument},
 url = {http://www.nime.org/proceedings/2019/nime2019_paper028.pdf},
 year = {2019}
}