This is a rendering of motion capture data for Pygmalion, the play that the puppets are supposed to act out. One can view this data as the "reference trajectory" that the puppets are supposed to track.
Credits: Elliot Johnson, Elizabeth Jochum, Peter Kingston, Magnus Egerstedt, Todd D. Murphey
This video shows a side-by-side comparison between a dynamic simulation of a marionette and the same motions played on an experimental setup. The simulation was generated by animating the robot paths in Blender and then feeding the paths into our custom dynamic simulation software "trep". In the experiment, a Kinect is used for feedback, and the loop is closed around the robots' kinematic paths. No measurements of the marionette's state are incorporated into the feedback.
Credits: Elliot Johnson, Alex Ansari, Elizabeth Jochum, Jarvis Schultz, Todd D. Murphey
A rudimentary, hand-programmed demonstration of the current robotic puppet system actuating a full-size skeleton. The robotic "puppeteers" have magnetic wheels powered by DC motors. A thin sheet of plastic is stretched tightly from all edges to produce the "stage." An unpowered magnetic idler is placed on the top surface of the plastic for the robot to cling to.
Credits: Jarvis Schultz, Lanny Smoot, Todd D. Murphey
An offline, infinite-dimensional optimization procedure is used to generate system inputs (as a function of time) that cause the simulated trajectory of a suspended mass to follow a reference trajectory. These inputs are then executed on an experimental system, and the results are compared to the simulation.
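The idea of computing open-loop inputs so a simulated system tracks a reference can be illustrated with a finite-dimensional toy version of the problem. The sketch below is an assumption for illustration only: it replaces the suspended mass and the infinite-dimensional method (and the "trep" simulator) with a discretized double integrator, a linear input-to-output map, and a least-squares solve; none of these modeling choices come from the project itself.

```python
import numpy as np

# Toy stand-in for the suspension point: a double integrator x'' = u,
# discretized with time step dt. State is [position, velocity].
dt, N = 0.1, 50
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])  # we track position only

# Build the linear map G from the input sequence u to the output
# positions y, i.e. y = G u, with y_k = sum_j C A^(k-j) B u_j.
G = np.zeros((N, N))
for k in range(N):
    Ak = np.eye(2)
    for j in range(k, -1, -1):
        G[k, j] = (C @ Ak @ B)[0, 0]
        Ak = Ak @ A

# Reference trajectory for the position (arbitrary example).
ref = np.sin(np.linspace(0.0, 2.0 * np.pi, N))

# Solve for the open-loop input sequence that reproduces the reference.
u = np.linalg.solve(G, ref)

# Verify by forward simulation from rest.
x = np.zeros(2)
y = []
for k in range(N):
    x = A @ x + B.flatten() * u[k]
    y.append(x[0])
max_err = float(np.max(np.abs(np.array(y) - ref)))
```

In the actual project the optimization is over a continuous-time, nonlinear marionette model, so the inputs are found iteratively rather than by a single linear solve; the structure of the problem (inputs in, simulated trajectory out, minimize deviation from the reference) is the same.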
A collection of videos related to the Robotic Marionette project at Northwestern University. This is work from Todd Murphey's research group which is part of the Neuroscience and Robotics Lab. More about the project can be found at http://nxr.northwestern.edu/research/scalable-algorithms-physical-systems/marionettes