Homunculus is a video self-portrait that explores facial expressions and physical performance. In it, I use the position of my body to puppet a 3D model of my own head. Each limb is mapped to a part of the face that plays a key role in emotional expression: my hands control my brows, my knees control the corners of my mouth, and so on.
The result is that the small facial movements that distinguish different emotional expressions — a raised eyebrow, a curled lip, a furrowed brow — get amplified into large-scale movements of my whole body. To achieve particular expressions such as surprise, contentment, or anguish, I'm forced to contort my body into absurd positions that bear little expressive relationship to the emotion being expressed by the puppet.
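The limb-to-face mapping described above can be sketched as a simple function from joint positions to face control values. This is an illustrative reconstruction, not the project's actual code: the joint names follow OSCeleton's conventions, but the face-parameter names and the calibration ranges are hypothetical.

```python
# A minimal sketch of the limb-to-face mapping: each limb's height drives
# one face control value. The ranges (0.2-0.8, etc.) are made-up
# calibration values, not the project's real ones.

def scale(value, lo, hi):
    """Linearly map value from [lo, hi] to [0, 1], clamped."""
    t = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

def face_params(joints):
    """joints: dict mapping joint name -> (x, y, z) position.
    Returns face control values in [0, 1], keyed by hypothetical names."""
    return {
        # Raising a hand raises the corresponding brow.
        "brow_left":   scale(joints["l_hand"][1], 0.2, 0.8),
        "brow_right":  scale(joints["r_hand"][1], 0.2, 0.8),
        # Lifting a knee lifts the corresponding mouth corner.
        "mouth_left":  scale(joints["l_knee"][1], 0.1, 0.5),
        "mouth_right": scale(joints["r_knee"][1], 0.1, 0.5),
    }
```

The interesting design work, as the text notes, is in choosing these ranges and pairings so that each small facial movement has a full-body counterpart.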
The process of designing the interface — configuring the precise mapping between skeleton joints and areas of the 3D model — required close attention to which parts of my face move when I make each facial expression. Likewise, hand-building the 3D model demanded careful study of my face's construction.
Technically, the application accesses the skeleton data via OSCeleton (github.com/Sensebloom/OSCeleton) and loads the 3D model (created in Cinema 4D) as an OBJ file. The code is available on GitHub: github.com/atduskgreg/Head-Puppet. Here is a good tutorial for getting up and running with OSCeleton on OS X: tohmjudson.com/?p=30
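For context on what the application receives: OSCeleton broadcasts each tracked joint as an OSC message of the form `/joint <name> <user_id> <x> <y> <z>` (commonly on UDP port 7110). A minimal sketch of a handler that collects those messages into a pose follows; in a real app it would be registered with an OSC library such as python-osc, and the variable names here are my own, not the project's.

```python
# Sketch of collecting OSCeleton skeleton data. Each incoming message
# carries one joint: /joint <name> <user_id> <x> <y> <z>. In a live setup
# this handler would be registered with an OSC library; here we just call
# it directly with a simulated message.

skeleton = {}  # joint name -> (x, y, z), for the user being tracked

def on_joint(address, name, user_id, x, y, z):
    """Handle one /joint message, keeping only the first user's pose."""
    if user_id == 1:
        skeleton[name] = (x, y, z)

# Simulated incoming message, shaped like OSCeleton's output:
on_joint("/joint", "l_hand", 1, 0.42, 0.71, 1.9)
```

Each animation frame can then read the latest positions out of `skeleton` and drive the corresponding regions of the OBJ head model.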