A framework for bringing 3D mesh data from Rhino (via Grasshopper) into Processing.
This demo uses FaceOSC to control a 3D puppet: eyebrow and mouth data modify the ears and jaw. The sketch maps the movements of your head, mouth, and eyebrows onto the expressions of a tessellated three-dimensional model. The model was first developed in Rhino3D, translated into the Processing programming environment through Grasshopper, and then connected to Kyle McDonald's FaceOSC app through oscP5. PeasyCam is used for camera navigation.
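The core of the FaceOSC-to-puppet connection is remapping incoming face values to rotation angles. The sketch below is a minimal illustration in plain Java (so it runs outside Processing); the value ranges are assumptions, not FaceOSC's documented output, and `jawAngle`/`earAngle` are hypothetical helper names standing in for the sketch's actual mapping code:

```java
// Hypothetical sketch of the FaceOSC-to-puppet mapping described above.
// ASSUMPTIONS: FaceOSC's mouth-height value is treated as roughly 0-10 and
// eyebrow height as roughly 5-10; the real ranges may differ.
public class PuppetMapping {
    // Linear remap, equivalent to Processing's map() function.
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    // Jaw swings open up to 45 degrees as the mouth opens (assumed range).
    static float jawAngle(float mouthHeight) {
        return map(mouthHeight, 0f, 10f, 0f, (float) (Math.PI / 4));
    }

    // Ears rotate up to 30 degrees as the eyebrows raise (assumed range).
    static float earAngle(float eyebrowHeight) {
        return map(eyebrowHeight, 5f, 10f, 0f, (float) (Math.PI / 6));
    }

    public static void main(String[] args) {
        // In the real sketch these values would arrive via an oscP5 event
        // handler and drive rotations about the pivot points from Grasshopper.
        System.out.println("jaw: " + jawAngle(10f) + " rad, ear: " + earAngle(5f) + " rad");
    }
}
```

In the actual sketch, an oscP5 event callback would receive these face values each frame and apply the resulting angles as rotations about the exported pivot points.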
The mesh from Rhino is exported as .txt files that contain the model's UNIQUE VERTICES, as well as the ORDERED VERTICES needed to construct each face. These .txt files are then streamed from the Rhino/Grasshopper file into the sketch's data folder. Pivot points, as well as the points to be rotated, are specified in Grasshopper and then exported to Processing. While the current puppet is highly tessellated, one can begin to create additional faces (and unique vertices) to give the model higher detail.
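A parser for those exported files might look like the following. This is a sketch under assumed formats, not the project's actual loader: it assumes the unique-vertices file holds one "x y z" triple per line, and the faces file holds one face per line as space-separated indices into that vertex list (the file and class names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical loader for the two .txt exports described above.
// ASSUMED format: vertices file = one "x y z" per line (unique vertices);
// faces file = one face per line, as indices into the vertex list
// (the ordered vertices for that face).
public class MeshLoader {
    // Parse "x y z" lines into float triples.
    static List<float[]> parseVertices(String[] lines) {
        List<float[]> verts = new ArrayList<>();
        for (String line : lines) {
            String[] t = line.trim().split("\\s+");
            verts.add(new float[] {
                Float.parseFloat(t[0]),
                Float.parseFloat(t[1]),
                Float.parseFloat(t[2]) });
        }
        return verts;
    }

    // Parse each line of index lists into one face (any vertex count per face).
    static List<int[]> parseFaces(String[] lines) {
        List<int[]> faces = new ArrayList<>();
        for (String line : lines) {
            String[] t = line.trim().split("\\s+");
            int[] face = new int[t.length];
            for (int i = 0; i < t.length; i++) face[i] = Integer.parseInt(t[i]);
            faces.add(face);
        }
        return faces;
    }

    public static void main(String[] args) {
        // A single quad over four unique vertices, standing in for the
        // streamed Rhino/Grasshopper export (in Processing, the lines
        // would come from loadStrings() on the data-folder files).
        String[] vertexLines = { "0 0 0", "1 0 0", "1 1 0", "0 1 0" };
        String[] faceLines = { "0 1 2 3" };
        List<float[]> verts = parseVertices(vertexLines);
        List<int[]> faces = parseFaces(faceLines);
        System.out.println(verts.size() + " vertices, " + faces.size() + " face(s)");
    }
}
```

Keeping the vertices unique and indexing into them per face means each new face only adds a short index line, which is what makes refining the tessellation cheap.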
*Inspired by the work of Karolina Sobecka
Download the project file here: http://golancourses.net/2012spring/01/23/madelinegannon-3d-wolf-puppet-with-faceosc/