We've taken this exciting opportunity to port our popular DaVinci experience to the Kinect platform. Gestures are used to create objects and control the physics of the environment. Your hands appear in the interface, letting you literally grab objects out of thin air and move them around the environment. Additional gestures allow you to affect gravity, magnetism, and attraction.
A quick installation prototype Emily and I put together with the libfreenect Kinect drivers and ofxKinect. The system does skeleton tracking on the arm, determining where the shoulder, elbow, and wrist are, and uses those positions to control the movement and posture of the giant funky bird!
Speed project - made in a day using openFrameworks and libfreenect.
Concept and Production by Design I/O
Emily Gobeille - Theo Watson design-io.com
3D depth camera for arm tracking, courtesy of Microsoft and the open source / diy community :)
I was asked to make a couple of interactive toys for the children's section of the Flatpack Film Festival. One uses the Kinect camera to do some simple extruded shapes and some hand tracking (I took the feet tracking out, as I suddenly thought about potential injuries!). The other is a simple floor projection using two projectors with soft-edge blending.
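The hand tracking mentioned above was built as a vvvv patch, but the common trick behind this kind of simple depth-camera hand tracking is easy to sketch in code: treat the closest valid pixel to the camera as the hand. A minimal, hypothetical version (assuming a millimetre depth buffer where 0 means "no reading"):

```cpp
#include <cstdint>

// Closest-point "hand" candidate in a depth frame.
struct Hand { int x; int y; uint16_t depthMm; bool found; };

Hand findClosestPoint(const uint16_t* depth, int width, int height) {
    Hand best{0, 0, 0, false};
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            uint16_t d = depth[y * width + x];
            if (d == 0) continue;                 // invalid reading, skip
            if (!best.found || d < best.depthMm) {
                best = {x, y, d, true};           // new nearest point
            }
        }
    }
    return best;
}
```

In practice you would smooth the result over a few frames and ignore points beyond a play-area depth threshold, so a child walking past in the background doesn't steal the cursor.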
All made with vvvv
Systems Engineering course project: design an exhibit for the Pittsburgh Children's Museum. This system tracks plastic ball collisions with a Microsoft Kinect and simulates the resulting paint splatter on the wall using a projector and speakers. The wall "resets" every 60 seconds, saving the images to a website so kids can retrieve their art at a later date. The website is located here: http://sites.google.com/site/pitdigitalgraffiti/
ORF #1 is, at first glance, an interactive space in which music is produced according to the relative positions of the users.
A grid of letters (A,T,C,G) is projected on the floor and sequences are activated when the users enter the space. These sequences of letters correspond to sequences of sounds and the users are permanently redefining the beginning and the end of the messages, and hence the length, timbre, rhythm, pitch and tempo of musical phrases.
The positions of the users define the beginning and the end of each sequence. So if there are two users in the space, a sequence is generated between them; if there are three users, two sequences are generated; and so on. If there is only one user, the sequence starts at the user's position and ends at the edge of the grid. A color code identifies each sequence.
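The rule above can be sketched as a small function. This is a simplified, hypothetical 1D version (users reduced to grid columns) just to illustrate the pairing logic: sorted positions define segment endpoints, a lone user's sequence runs to the end of the grid, and N users yield N-1 sequences:

```cpp
#include <vector>
#include <algorithm>
#include <utility>

// Given the grid columns occupied by users, return the (start, end)
// column pairs of the active letter sequences.
std::vector<std::pair<int, int>> activeSequences(std::vector<int> cols,
                                                 int gridWidth) {
    std::vector<std::pair<int, int>> segs;
    if (cols.empty()) return segs;
    std::sort(cols.begin(), cols.end());
    if (cols.size() == 1) {
        // One user: sequence runs from the user to the end of the grid.
        segs.emplace_back(cols[0], gridWidth - 1);
        return segs;
    }
    // N users: one sequence between each adjacent pair (N-1 total).
    for (std::size_t i = 0; i + 1 < cols.size(); ++i)
        segs.emplace_back(cols[i], cols[i + 1]);
    return segs;
}
```

Each returned segment would then be read out as a loop of letters (A, T, C, G) mapped to sounds, so moving a step across the floor rewrites the length and rhythm of the phrase.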
The environment invites the user to explore the space but above all it creates a playful atmosphere where the interaction between users produces musical results and enhances non-verbal communication.
On a metaphorical level, ORF #1 proposes a voyage to the "roots of happiness". The letters projected on the floor are the sequence of 5-HTT, the so-called "happiness gene", a gene that encodes a protein involved in the transport of serotonin, a feel-good chemical in the brain. The users create new messages using the original gene sequence as a matrix, redefining them according to the way they relate to each other. The project therefore proposes a poetic reconciliation of nature and nurture, raising awareness of ourselves through the merging of science and art.
November 23-24, 2011, audiovisual studio, DeCA, UA, Portugal
by Paulo Maria Rodrigues in collaboration with Rodrigo Carvalho