I was asked by a colleague, Jay Morgan at Amnesia Razorfish, to build a Kinect particle cloud Processing project for the TEDx Sydney event. I decided to make the project open source and release the code. This is the page where it will be available; however, the code will not be released until after the event.

This video is one of a number of bumper videos selected for TEDx Sydney 2012. It was created by recording the crowd and turning those recordings into particle cloud interactions. The video used an Xbox Kinect and was rendered live. The concept was that during an event like TED there is the potential to make strong connections between the people there; this work visualises those connections from real crowd interactions.

The Kinect work is based on these examples & libraries (a rough illustrative sketch follows below):
shiffman.net/p5/kinect/
ubaa.net/shared/processing/opencv/
Requires an Xbox Kinect.
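
The released sketch is linked further down on OpenProcessing. Purely to illustrate the general approach (this is not the released code), here is a minimal Processing sketch that turns Kinect depth readings into a drifting particle cloud. It assumes Daniel Shiffman's Open Kinect for Processing library; the call names (initDepth, getRawDepth), the 640x480 depth frame, and the raw-depth threshold are assumptions and may differ between library versions.

import org.openkinect.processing.*;

Kinect kinect;
ArrayList<PVector> particles = new ArrayList<PVector>();

int skip = 8;              // sample every 8th depth pixel to keep the cloud light
int nearThreshold = 700;   // raw-depth cutoff for "someone is close"; units are library/hardware specific, tune as needed
int maxParticles = 20000;  // cap the cloud size

void setup() {
  size(640, 480, P3D);
  kinect = new Kinect(this);
  kinect.initDepth();      // assumption: current openkinect-processing API; 2012-era versions used start()/enableDepth()
}

void draw() {
  background(0);
  int[] depth = kinect.getRawDepth();   // Kinect v1 depth frame assumed to be 640x480

  // Spawn particles wherever the depth image reports a near reading (a person in front of the sensor)
  for (int y = 0; y < 480; y += skip) {
    for (int x = 0; x < 640; x += skip) {
      int d = depth[x + y * 640];
      if (d > 0 && d < nearThreshold) {
        particles.add(new PVector(x, y));
      }
    }
  }

  // Drift every particle a little and draw it, so past crowd positions smear into a cloud
  stroke(255);
  for (PVector p : particles) {
    p.x += random(-1, 1);
    p.y += random(-1, 1);
    point(p.x, p.y);
  }

  // Drop the oldest particles once the cap is hit
  while (particles.size() > maxParticles) {
    particles.remove(0);
  }
}

Keeping every spawned particle and letting it drift is what makes separate silhouettes blur into one shared cloud, which is the "connections between people" idea described above.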

For more information on the event, see:
tedxsydney.com

I have released the code on OpenProcessing:
openprocessing.org/sketch/63288

Credits
Visual Code by Rhys Turner
Creative Direction by Jay Morgan
Produced by Meghan Petersen
Sound Design & Editing by James Ashbolt
