UPDATE: there is a slight issue with the audio tracking in this video. We have fixed it and posted the improved version here: vimeo.com/18020629
On 11/4, Microsoft released the Kinect, which lets gamers use their bodies as controllers by combining an IR depth camera with an RGB camera. Five days and some bounties, hardware, and drama later, the first set of open source drivers was released. Since then, the world has seen everything from point clouds to puppets to nipple tracking. This talk covers the technology that makes the camera great, the innovation that makes the community great, and what we can expect from this new hardware in the future.
Kyle Machulis is an engineer working on projects ranging from haptics to driver reverse engineering to audio research to teledildonics (a fancier word for haptics that people will actually pay attention to). He is currently part of the group leading the OpenKinect community in creating open source, cross-platform drivers for the Microsoft Kinect camera.
Links in the presentation: