28 February 2014 Interactive projection for aerial dance using depth sensing camera
This paper describes an interactive performance system for floor and aerial dance that controls visual and sonic aspects of the presentation via a depth-sensing camera (MS Kinect). In order to detect, measure, and track free movement in space, three-degree-of-freedom (3-DOF) tracking in space (on the ground and in the air) is performed using IR markers. Gesture tracking and recognition is performed using a simplified HMM model that allows robust mapping of the actor's actions to graphics and sound. Additional visual effects are achieved by segmentation of the actor's body based on depth information, allowing projection of separate imagery on the performer and the backdrop. Artistic use of augmented reality performance relative to more traditional concepts of stage design and dramaturgy is discussed.
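The depth-based segmentation mentioned above can be illustrated with a minimal sketch: pixels whose depth falls inside a near/far band are treated as the performer, and everything else as the backdrop, yielding two masks onto which separate imagery can be projected. The function name, band limits, and synthetic frame below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def segment_performer(depth_mm, near_mm=500, far_mm=3000):
    """Split a depth frame into performer and backdrop masks.

    depth_mm: 2-D array of per-pixel depth values in millimetres,
    as a Kinect-style depth camera would provide. Pixels within the
    [near_mm, far_mm] band are labelled performer; the rest backdrop.
    """
    performer = (depth_mm >= near_mm) & (depth_mm <= far_mm)
    backdrop = ~performer
    return performer, backdrop

# Synthetic 4x4 depth frame: a "performer" at ~1.5 m standing in
# front of a backdrop at ~4 m.
depth = np.full((4, 4), 4000)
depth[1:3, 1:3] = 1500

performer_mask, backdrop_mask = segment_performer(depth)
print(performer_mask.sum())  # 4 pixels belong to the performer
print(backdrop_mask.sum())   # 12 pixels belong to the backdrop
```

In a real pipeline these masks would be refined (morphological smoothing, tracking over time) before being used to route different video content to the performer's body and the backdrop.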
© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Tammuz Dubnov, Zachary Seldess, Shlomo Dubnov, "Interactive projection for aerial dance using depth sensing camera", Proc. SPIE 9012, The Engineering Reality of Virtual Reality 2014, 901202 (28 February 2014); https://doi.org/10.1117/12.2041905