1. This is the making-of video of the interactive installation that the Igalia Interactivity team did last November.

    It shows the early testing application we developed to recognize the gestures, the process of installing it at the Museum für Kommunikation in Berlin, and the final result.

    vimeo.com/55846845
  2. This is a demo showing how the web version of Angry Birds can be played with a Kinect by combining Skeltrack and OpenCV.

    Skeltrack is used to track the position of the user's hands, which is then used to move the mouse pointer. OpenCV computes the convexity defects of each hand's contour, which gives the number of extended fingers; this tells us whether the palm is open or closed, and a mouse press is simulated accordingly.

    The end result is not yet polished but it gives a good idea of what can be done with 100% Free Software.

    vimeo.com/52250215
  3. This video shows the Skeltrack Kinect example in two parts: the first shows how the regular skeleton joint tracking produces some jitter; the second shows how this jitter is corrected by enabling the new "smoothing" feature with a factor of 0.25.

    The smoothing algorithm implemented is Holt's Double Exponential Smoothing. For more info about this feature and Skeltrack's 0.1.4 version, check out:

    vimeo.com/44739007
  4. This is a sketch of the N9 app Butaca running on a tablet. The main goals were to test the horizontal navigation and, especially, to mature a method for quickly creating interactive sketches.

    This one was done with just ~160 lines of QML and a bunch of screenshots of the N9 application. The method (think of it as an interactive collage) would work just as well with nice ad-hoc mockups, but those take more time and the point was to keep it quick.

    vimeo.com/40730591
  5. This is an interactive aquarium that runs directly in the browser. It uses Processing.js to draw graphics on an HTML5 canvas. Users interact with it by touching close to the image projected on the wall, creating water ripples that attract the fish to the location of the disturbance.

    On the server side, a Kinect sensor managed by GFreenect tracks actions occurring within a certain distance threshold of the wall, and the contact point is determined using OpenCV. This point is then sent to the browser over a WebSocket managed by EventDance.

    The water splash sounds are played with a set of HTML5 audio elements.

    Source code: http://people.igalia.com/elima/gfreenect-experiments/

    The fish are a modified version of Ricardo Sánchez's (@nardove) GPL fish pond.

    vimeo.com/39689650
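
The open-or-closed palm detection described in the Angry Birds demo (2) can be sketched in a few lines. This is a hypothetical Python illustration, not the demo's actual OpenCV code: the function names and the depth and finger thresholds are made up, and the convexity-defect depths are passed in as plain numbers rather than extracted from a contour.

```python
# Hypothetical sketch: deciding open vs. closed palm from convexity defects.
# All names and threshold values here are illustrative assumptions.

def count_fingers(defect_depths, min_depth=20.0):
    """Each sufficiently deep convexity defect is a gap between two
    extended fingers, so n deep defects suggest n + 1 fingers."""
    deep = [d for d in defect_depths if d > min_depth]
    return len(deep) + 1 if deep else 0

def palm_is_open(defect_depths, finger_threshold=3):
    """Treat the hand as open when enough fingers are extended."""
    return count_fingers(defect_depths) >= finger_threshold

# An open hand produces several deep gaps between fingers,
# while a fist produces only shallow contour noise.
print(count_fingers([35.0, 40.0, 32.0, 38.0]))  # 5
print(palm_is_open([5.0, 8.0, 3.0]))            # False
```

Mapping this open/closed state to mouse press and release is then a matter of watching for transitions between the two states.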
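
The joint smoothing mentioned for video (3) is Holt's double exponential smoothing, which can be sketched as below. This is a generic Python rendering of the textbook algorithm, not Skeltrack's code; applying the 0.25 factor to both the level and the trend parameters is an assumption made here for illustration.

```python
def holt_smooth(values, alpha=0.25, beta=0.25):
    """Holt's double exponential smoothing over a sequence of samples
    (e.g. one coordinate of a skeleton joint). It tracks a smoothed
    level plus a trend, so it lags less than simple exponential
    smoothing while still damping frame-to-frame jitter."""
    if not values:
        return []
    level, trend = values[0], 0.0
    out = [level]
    for x in values[1:]:
        prev_level = level
        # Blend the new sample with the previous forecast (level + trend).
        level = alpha * x + (1 - alpha) * (level + trend)
        # Update the trend estimate from the change in level.
        trend = beta * (level - prev_level) + (1 - beta) * trend
        out.append(level)
    return out
```

A constant signal passes through unchanged, while an alternating (jittery) one is damped; the trend term is what lets the filter follow steady joint motion without the heavy lag a plain low-pass filter would add.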
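
The server-side contact detection for the aquarium (5) can be sketched as follows. This is a simplified Python illustration under assumed conditions: the real installation uses GFreenect and OpenCV, whereas here the depth frame is a plain list of rows in millimetres, and the wall distance and threshold values are invented.

```python
# Hypothetical sketch of the contact detection: keep only depth pixels
# that fall within a small threshold of the known wall distance, and
# report their centroid as the contact point.

def find_contact_point(depth, wall_mm=2000, threshold_mm=50):
    """depth is a list of rows of depth readings in millimetres.
    Returns the (x, y) centroid of near-wall pixels, or None when
    nothing is touching the interaction zone."""
    hits = [(x, y)
            for y, row in enumerate(depth)
            for x, d in enumerate(row)
            if wall_mm - threshold_mm <= d < wall_mm]
    if not hits:
        return None
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

# A frame with a small "hand" blob just in front of the wall:
frame = [[2500] * 4 for _ in range(4)]
frame[1][2] = 1970
frame[2][2] = 1980
print(find_contact_point(frame))  # (2.0, 1.5)
```

In the installation, the resulting point would then be pushed to the browser over the WebSocket so the ripple can be drawn where the touch happened.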


Joaquim Rocha

Igalia stuff!
