This is a demo showing how the web version of Angry Birds can be played with a Kinect by combining Skeltrack and OpenCV.
Skeltrack is used to track the user's hands, and their position moves the mouse pointer. OpenCV then extracts the convexity defects from the hands' image, which gives the number of fingers; this tells us whether the palm is open or closed, and a mouse press is simulated according to that state.
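The demo itself uses OpenCV's convexity-defects computation; as a rough illustration of the idea, here is a pure-Python sketch of the same geometry on a toy hand-like contour. The contour, hull indices, and depth threshold below are all illustrative values, not taken from the actual demo.

```python
import math

def defect_depths(contour, hull_idx):
    """For each convex-hull edge, return the depth of the farthest contour
    point lying between that edge's endpoints (the idea behind OpenCV's
    convexity defects). Deep defects are the valleys between fingers."""
    depths = []
    n = len(hull_idx)
    for k in range(n):
        i, j = hull_idx[k], hull_idx[(k + 1) % n]
        ax, ay = contour[i]
        bx, by = contour[j]
        edge_len = math.hypot(bx - ax, by - ay)
        deepest = 0.0
        idx = i + 1
        while idx % len(contour) != j:
            px, py = contour[idx % len(contour)]
            # perpendicular distance from the contour point to the hull edge
            d = abs((bx - ax) * (py - ay) - (by - ay) * (px - ax)) / edge_len
            deepest = max(deepest, d)
            idx += 1
        if deepest > 0:
            depths.append(deepest)
    return depths

# Toy contour of a 4-finger "hand" (image coordinates, y grows downwards):
# fingertips are hull vertices, the valleys between them are defects.
contour = [(0, 10), (10, 50), (20, 0), (30, 50), (40, 0),
           (50, 50), (60, 10), (60, 110), (0, 110)]
hull_idx = [0, 2, 4, 6, 7, 8]   # hull vertices as contour indices

depths = defect_depths(contour, hull_idx)
DEPTH_THRESHOLD = 20            # illustrative: ignore shallow contour noise
fingers = sum(d > DEPTH_THRESHOLD for d in depths) + 1
palm_open = fingers >= 4        # open palm -> release; closed fist -> press
```

Counting only defects deeper than a threshold makes the open/closed decision robust to small contour noise, which is what lets the hand state drive the simulated mouse button.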
The end result is not yet polished, but it gives a good idea of what can be done with 100% Free Software.
This video shows the Skeltrack Kinect example and has two parts: the first shows how the regular skeleton joint tracking produces some jitter; the second shows how this jitter is corrected by enabling the new "smoothing" feature with a factor of 0.25.
The smoothing algorithm implemented is Holt's Double Exponential Smoothing. For more info about this feature and Skeltrack's 0.1.4 version, check out:
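Skeltrack implements the smoothing in the library itself; as a rough illustration of Holt's double exponential smoothing, here is a pure-Python sketch. It tracks both a level and a trend, so a smoothed joint keeps following a moving hand instead of lagging behind it. Using the same factor for both the level and trend terms is an assumption made here for simplicity, not necessarily how Skeltrack maps its single smoothing factor.

```python
def holt_smooth(samples, factor=0.25):
    """Holt's double exponential smoothing over a 1-D series of
    joint coordinates. Assumes (for illustration) one factor for
    both the level and the trend updates."""
    level, trend = samples[0], 0.0
    out = [level]
    for x in samples[1:]:
        prev = level
        # level: blend the raw sample with the previous prediction
        level = factor * x + (1 - factor) * (level + trend)
        # trend: blend the observed change with the previous trend
        trend = factor * (level - prev) + (1 - factor) * trend
        out.append(level)
    return out

# A jittery joint coordinate: the smoothed series is visibly damped.
smoothed = holt_smooth([0.0, 1.0, 0.0, 1.0, 0.0, 1.0], factor=0.25)
```

A plain exponential average would also damp jitter but would trail a fast-moving hand; the trend term is what makes Holt's variant suitable for tracking.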
This is a sketch of the N9 app Butaca running on a tablet. The main goals were to test the horizontal navigation and especially to mature a method for quickly creating interactive sketches.
This one was done with just ~160 lines of QML and a bunch of screenshots of the N9 application. The method (you can think of it as an interactive collage) would work just as well with nice ad-hoc mockups, but those take more time and the point was to keep it quick.
This is an interactive aquarium that runs directly in the browser. It uses Processing.js for drawing graphics on an HTML5 canvas. Users interact with it by touching close to the projected image on the wall, creating water ripples that attract the fish to the point of the disturbance.
On the server side, a Kinect sensor managed by GFreenect tracks actions occurring within a threshold distance of the wall, and OpenCV determines the contact point. This point is then sent to the browser over a WebSocket managed by EventDance.
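The actual pipeline uses GFreenect and OpenCV; as a minimal sketch of just the thresholding step, the server can treat any depth pixel that is sufficiently closer than the wall as a touch, and report the centroid of those pixels. The wall distance, touch band, frame, and JSON payload shape below are all illustrative assumptions.

```python
import json

WALL_DEPTH_MM = 2000   # measured sensor-to-wall distance (assumed value)
TOUCH_BAND_MM = 150    # pixels this much closer than the wall count as a touch

def contact_point(depth_frame):
    """Centroid of all pixels inside the touch band in front of the wall;
    None when nothing is close enough. depth_frame is a 2-D list of
    per-pixel distances in millimetres (0 = no reading)."""
    xs, ys = [], []
    for y, row in enumerate(depth_frame):
        for x, d in enumerate(row):
            if 0 < d < WALL_DEPTH_MM and WALL_DEPTH_MM - d <= TOUCH_BAND_MM:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Toy 4x4 depth frame: a hand touching near the top-right corner.
frame = [
    [2000, 2000, 1900, 1880],
    [2000, 2000, 1900, 2000],
    [2000, 2000, 2000, 2000],
    [2000, 2000, 2000, 2000],
]
point = contact_point(frame)
# The point would then travel to the browser over the WebSocket, e.g. as JSON:
payload = json.dumps({"x": point[0], "y": point[1]})
```

Using a band rather than an exact wall distance tolerates depth noise and the thickness of the hand, so a light touch near the projection still registers.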
The water splash sounds are played with a set of HTML5 audio elements.