This video demonstrates an experimental projection mapping application that is part of the Projector Camera Toolkit developed during Guest Research Project v.1 at YCAM Interlab.
First, the scene (the skull) is scanned with Gray code structured light, and the scan data is decoded into a 3D point cloud. This is possible because the camera and projector are calibrated, so the position and orientation of each is known in advance. At 0:33 there is a brief demo of editing a GLSL shader in real time, with a small change that alters the width of the projected lines. Then the skull itself is shown again with the projection-mapped visuals.
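The decoding step pairs each camera pixel with a projector column (or row) by reading off the bit pattern that pixel observed across the projected Gray code frames. A minimal sketch of that encode/decode logic, in Python for illustration only (not the toolkit's actual implementation):

```python
def int_to_gray(n: int) -> int:
    """Convert a plain binary index to its reflected Gray code."""
    return n ^ (n >> 1)

def gray_to_int(g: int) -> int:
    """Invert a reflected Gray code back to a plain binary index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def decode_bits(bits) -> int:
    """Recover a projector column index from the bits a camera pixel
    observed, one bit per projected pattern, most significant first."""
    g = 0
    for b in bits:
        g = (g << 1) | b
    return gray_to_int(g)

# Example: projector column 5 encoded over 4 patterns is Gray 0b0111,
# and a pixel that observed [0, 1, 1, 1] decodes back to column 5.
```

Gray codes are used instead of plain binary because adjacent columns differ by only one bit, so a thresholding error at a stripe boundary shifts the decoded index by at most one column rather than jumping arbitrarily far.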
Music: "Hoofbeat" by Dustmotes soundcloud.com/dustmotes/hoofbeat-disquiet0002-duet