ProprioLoop is an interactive motion visualizer designed to train a mover's attention to consciously consider non-visual sensory input in the movement decision-making process. Its visuals amplify and highlight physically felt qualities of the movement, such as sudden or sustained changes in momentum, giving the audience access to the mover's more intimate, subjective experience.
We aimed to explore decision making in the setting of movement improvisation through motion capture and visualization. We hoped to make the dancer's proprioceptive and equilibrioceptive experiences (the physical sensation of the movement) visible to an observer, offering more intimate access to the dance and the cognitive processing involved.
Our team recorded motion capture data for improvisations built around three decision-making models. In the first improvisation, the mover chose to follow a continuous line of momentum for as long as possible. In the second, the mover chose to transform the momentum into new, unexpected directions. In the third, the mover chose to move away from any developing momentum, which required a constant input of energy into the system. We used these choice-making protocols because they inherently align physical facts about the movement with non-visual sensory input for the mover: the mover had to notice the sensation of momentum, an observable physical property, and choose to follow, defy, or transform it. Each choice dictated a specific kind of acceleration, which the mover also experienced kinesthetically, closing a non-visual feedback loop.
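To make that feedback loop concrete, one simple way to read the three choices off capture data (a minimal sketch, not our actual pipeline) is to compare the direction of the mover's center-of-mass acceleration to the direction of its velocity: acceleration roughly aligned with velocity follows the momentum, opposed acceleration defies it, and roughly perpendicular acceleration transforms it. The function name and the angle thresholds below are illustrative assumptions, not calibrated values.

```python
import numpy as np

def classify_choice(velocity, acceleration, eps=1e-6):
    """Classify a movement instant as 'follow', 'transform', or 'defy'
    by the angle between center-of-mass velocity and acceleration.
    The 60/120 degree thresholds are illustrative, not calibrated."""
    v = np.asarray(velocity, dtype=float)
    a = np.asarray(acceleration, dtype=float)
    nv, na = np.linalg.norm(v), np.linalg.norm(a)
    if nv < eps or na < eps:
        return "follow"  # negligible motion or force: coasting with the momentum
    cos_theta = np.dot(v, a) / (nv * na)
    angle = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
    if angle < 60.0:
        return "follow"     # pushing along the existing line of momentum
    elif angle > 120.0:
        return "defy"       # energy input directed against the momentum
    else:
        return "transform"  # redirecting momentum into a new direction
```

For example, `classify_choice([1.0, 0.0, 0.0], [0.9, 0.1, 0.0])` returns `"follow"`, while swapping in an acceleration of `[-0.9, 0.1, 0.0]` returns `"defy"`.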
As we learned more about moving in the motion capture system and working with the data, we decided it could be useful to create an app that visualizes physical parameters of human movement, such as momentum, center of mass, and acceleration, for dancers to use as an improvisational training tool. Dancers could use the visualization to gauge how accurately and honestly they adhered to choices based on these parameters.
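The physical parameters such an app visualizes can be estimated directly from the capture stream. Below is a minimal sketch, assuming joint positions in meters, per-segment mass fractions, and a fixed frame rate; the mass fractions, frame rate, and body mass defaults are illustrative assumptions rather than values from our system.

```python
import numpy as np

def center_of_mass(joints, masses):
    """Weighted average of joint positions.
    joints: (num_frames, num_joints, 3) array of mocap positions in meters.
    masses: (num_joints,) segment mass fractions (normalized internally)."""
    w = np.asarray(masses, dtype=float)
    w = w / w.sum()
    return np.einsum('fjd,j->fd', joints, w)

def momentum_and_acceleration(joints, masses, fps=120.0, body_mass=60.0):
    """Finite-difference momentum and acceleration of the center of mass.
    fps and body_mass are illustrative defaults, not system values."""
    com = center_of_mass(joints, masses)
    vel = np.gradient(com, 1.0 / fps, axis=0)   # m/s per coordinate
    acc = np.gradient(vel, 1.0 / fps, axis=0)   # m/s^2
    return body_mass * vel, acc
```

In practice, raw joint trajectories would need smoothing (e.g., a low-pass filter) before double differencing, since finite differences amplify capture noise.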
We will continue to develop meaningful, rich audio-visual representations of the movement, along with a user interface that helps dancers put them to use in training.
Choreographic Coding Lab | New York 2015