Here the Kinect is used in a different, rather simple way. After calibrating the static background, it detects when a new object (a paper ball) enters a defined corridor between the device and the wall. It also detects when the object leaves this corridor (shortly before hitting the wall) and then triggers an event for the visualization and audification.

This experiment was created using Processing, vvvv and PureData. It relies only on the depth data from the age-old CLNUI platform and on projection mapping; there is no skeleton tracking or other high-level processing involved.
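
The corridor check itself boils down to comparing each depth frame against the calibrated background. The following Python/numpy sketch illustrates the idea under stated assumptions; all names, thresholds and the frame source are hypothetical and not taken from the original vvvv/Processing patch.

```python
import numpy as np

# Assumed corridor geometry and noise threshold (illustrative values only).
CORRIDOR_NEAR_MM = 800     # closest depth still counted as "in the corridor"
CORRIDOR_FAR_MM = 2400     # depth just in front of the wall
MIN_BLOB_PIXELS = 150      # ignore blobs smaller than this (sensor noise)

def calibrate_background(frames):
    """Average several depth frames of the empty scene into a static background."""
    return np.mean(np.stack(frames), axis=0)

def object_in_corridor(depth_frame, background):
    """True if enough pixels are clearly nearer than the calibrated background
    and lie inside the corridor between the sensor and the wall."""
    closer = (background - depth_frame) > 100.0   # pixel moved toward the sensor
    in_band = (depth_frame > CORRIDOR_NEAR_MM) & (depth_frame < CORRIDOR_FAR_MM)
    return np.count_nonzero(closer & in_band) > MIN_BLOB_PIXELS

def watch(depth_frames, background, on_impact):
    """Fire on_impact once per throw: when the ball leaves the corridor,
    i.e. shortly before it hits the wall."""
    was_inside = False
    for frame in depth_frames:
        inside = object_in_corridor(frame, background)
        if was_inside and not inside:
            on_impact()                           # -> visualization / audification
        was_inside = inside

# Tiny synthetic demo: empty scene, one frame with a "ball" mid-corridor, empty again.
if __name__ == "__main__":
    empty = np.full((480, 640), 2600.0)           # wall roughly 2.6 m from the sensor
    background = calibrate_background([empty] * 10)
    ball = empty.copy()
    ball[200:220, 300:320] = 1500.0               # 20x20 px blob inside the corridor
    watch([empty, ball, empty], background, lambda: print("impact event"))
```

In the installation, the on_impact callback would hand the trigger on to the visualization and audification (vvvv/PureData) instead of printing; that wiring is omitted here.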

Kudos to gilbi for the vvvv gDrip plugin; the ink blot splatter brushes are various ones found on the web.
