Audio-Visual Feedback for Self-monitoring Posture in Ballet Training

Esben W. Knudsen, Malte L. Hølledig, Mads Juel Nielsen, Rikke K. Petersen, Sebastian Bach-Nielsen, Bogdan-Constantin Zanescu, Daniel Overholt, Hendrik Purwins, and Kim Helweg

Proceedings of the International Conference on New Interfaces for Musical Expression

Abstract:

An application for ballet training is presented that monitors, in real time, how far the posture (straightness of the spine and rotation of the pelvis) deviates from an ideal position. Human skeletal data are acquired with a Microsoft Kinect v2. The student's movement is mirrored by an abstract skeletal figure, and instructions are provided by a virtual teacher. Posture deviation is measured as follows: torso misalignment is calculated by comparing the hip-center, shoulder-center, and neck joint positions with an ideal posture captured in an initial calibration procedure, and pelvis deviation is expressed as the xz-rotation of the hip-center joint. The posture deviation is sonified by varying the cut-off frequency of a high-pass filter applied to a flowing-water sound, and visualized as a curve and as a rigged skeleton in which the misaligned torso parts are color-coded. In an experiment with 9- to 12-year-old dance students from a ballet school, the audio-visual feedback modality led to an increase in posture accuracy compared with no feedback (p < 0.001, Cohen's d = 1.047). Reaction-card feedback and expert interviews indicate that the feedback is considered fun and useful for training independently of the teacher.
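
As a concrete illustration of the processing chain described above, the Python sketch below shows one plausible way to compute the deviation measures and map them to a filter cut-off: torso misalignment as the distance of the hip-center, shoulder-center, and neck joints from their calibrated positions, pelvis deviation as the hip-center xz-rotation, and an exponential mapping from the deviation to the high-pass cut-off frequency. The joint names, distance measure, deviation range, frequency range, and mapping direction are illustrative assumptions, not taken from the paper.

import numpy as np

# Torso joints compared against the calibrated ideal posture
# (Kinect v2 joint names approximated; assumption, not from the paper).
TORSO_JOINTS = ("hip_center", "shoulder_center", "neck")

def calibrate(reference_frames):
    # Average each torso joint position over frames recorded while the
    # student holds the ideal posture (the initial calibration procedure).
    return {j: np.mean([np.asarray(f[j]) for f in reference_frames], axis=0)
            for j in TORSO_JOINTS}

def torso_misalignment(frame, ideal):
    # Sum of Euclidean distances between current and calibrated joint positions.
    return sum(float(np.linalg.norm(np.asarray(frame[j]) - ideal[j]))
               for j in TORSO_JOINTS)

def pelvis_deviation(hip_xz_rotation, ideal_xz_rotation=0.0):
    # Absolute xz-rotation of the hip-center joint relative to calibration.
    return abs(hip_xz_rotation - ideal_xz_rotation)

def deviation_to_cutoff(deviation, max_deviation=0.3, f_min=100.0, f_max=4000.0):
    # Map the normalised deviation to the cut-off frequency of the high-pass
    # filter on the water sound; larger deviation -> higher cut-off
    # (assumed direction and range).
    d = min(deviation / max_deviation, 1.0)
    return f_min * (f_max / f_min) ** d  # exponential mapping for perceptual spread

In a running system, functions like these would be evaluated once per Kinect frame, with the resulting cut-off frequency sent to the audio engine and the per-joint distances driving the color-coded rigged skeleton.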

Citation:

Esben W. Knudsen, Malte L. Hølledig, Mads Juel Nielsen, Rikke K. Petersen, Sebastian Bach-Nielsen, Bogdan-Constantin Zanescu, Daniel Overholt, Hendrik Purwins, and Kim Helweg. 2017. Audio-Visual Feedback for Self-monitoring Posture in Ballet Training. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1181422

BibTeX Entry:

@inproceedings{eknudsen2017,
 abstract = {An application for ballet training is presented that monitors, in real time, how far the posture (straightness of the spine and rotation of the pelvis) deviates from an ideal position. Human skeletal data are acquired with a Microsoft Kinect v2. The student's movement is mirrored by an abstract skeletal figure, and instructions are provided by a virtual teacher. Posture deviation is measured as follows: torso misalignment is calculated by comparing the hip-center, shoulder-center, and neck joint positions with an ideal posture captured in an initial calibration procedure, and pelvis deviation is expressed as the xz-rotation of the hip-center joint. The posture deviation is sonified by varying the cut-off frequency of a high-pass filter applied to a flowing-water sound, and visualized as a curve and as a rigged skeleton in which the misaligned torso parts are color-coded. In an experiment with 9- to 12-year-old dance students from a ballet school, the audio-visual feedback modality led to an increase in posture accuracy compared with no feedback (p < 0.001, Cohen's d = 1.047). Reaction-card feedback and expert interviews indicate that the feedback is considered fun and useful for training independently of the teacher.},
 address = {Copenhagen, Denmark},
 author = {Esben W. Knudsen and Malte L. Hølledig and Mads Juel Nielsen and Rikke K. Petersen and Sebastian Bach-Nielsen and Bogdan-Constantin Zanescu and Daniel Overholt and Hendrik Purwins and Kim Helweg},
 booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
 doi = {10.5281/zenodo.1181422},
 issn = {2220-4806},
 pages = {71--76},
 publisher = {Aalborg University Copenhagen},
 title = {Audio-Visual Feedback for Self-monitoring Posture in Ballet Training},
 url = {http://www.nime.org/proceedings/2017/nime2017_paper0015.pdf},
 year = {2017}
}