Here I am viewing the thesis project using a simple projection. Movement of the image is driven by the movement of my face in relation to the screen. Details of the tracking can be seen in other videos.
This is a video documenting the intended path a viewer might follow when experiencing the Thesis Project. It is a real-time capture from Unity 3D, using fully baked lighting from the Beast lighting engine. Although the camera starts out moving to the right, the viewer can explore the narrative in either direction at any time. All modeling, lighting, and texturing were completed by me.
This is the face tracking system, which uses a basic web camera to drive the interaction of the digital camera within the game engine. The pFaceDetect algorithm for Processing by Bryan Chung (built on OpenCV) was altered to manage multiple viewers. The algorithm recognizes faces and draws a simple box around each one. In this example, the red boxes mark all available faces in the screen area. The green box marks the "active" or "dominant" viewer, determined by the size of the face: viewers closer to the capture device cover a larger pixel area. Here, as the viewers trade positions, the tracker switches to the new, closer participant and hands over control.
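The dominant-viewer rule described above can be sketched in plain Java (the class and method names here are hypothetical, not from the altered pFaceDetect code): each detected face is a bounding box, and the face covering the largest pixel area wins control.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of the "dominant viewer" selection, assuming
// the face detector reports each face as a bounding box (x, y, w, h).
public class DominantFace {
    static class Face {
        final int x, y, w, h;
        Face(int x, int y, int w, int h) { this.x = x; this.y = y; this.w = w; this.h = h; }
        int area() { return w * h; } // larger pixel area ~ closer to the camera
    }

    // Return the index of the "active" viewer: the face covering the most pixels.
    // Returns -1 when no faces are detected.
    static int dominantIndex(List<Face> faces) {
        int best = -1, bestArea = -1;
        for (int i = 0; i < faces.size(); i++) {
            if (faces.get(i).area() > bestArea) {
                bestArea = faces.get(i).area();
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Face> faces = Arrays.asList(
            new Face(40, 60, 80, 80),    // distant viewer (would get a red box)
            new Face(200, 50, 140, 140)  // closer viewer (would get the green box)
        );
        System.out.println(dominantIndex(faces)); // prints 1: the larger face takes control
    }
}
```

In the installation, this comparison would run every frame, so control passes automatically as soon as a different viewer steps closer to the camera.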
The green zones at the left and right mark the threshold areas that trigger the animation. When the center of the face crosses into a threshold zone, the camera within the Unity scene moves along a pre-defined path; the farther the face center penetrates the zone, the faster the camera translates along the path.
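That mapping from face position to camera speed can be sketched as a small function (a hypothetical reconstruction; the screen width, zone width, and maximum speed below are assumed values, not taken from the project):

```java
// Hypothetical reconstruction of the threshold-zone trigger described above.
public class EdgeThreshold {
    static final float SCREEN_W  = 640f;  // capture width in pixels (assumed)
    static final float ZONE_W    = 120f;  // width of each green threshold zone (assumed)
    static final float MAX_SPEED = 2.0f;  // top camera speed along the path (assumed)

    // Map the face-center x position to a signed camera speed.
    // In the middle region the camera holds still; penetration into either
    // green zone scales speed linearly. Negative = left, positive = right.
    static float cameraSpeed(float faceCenterX) {
        if (faceCenterX < ZONE_W) {
            float depth = (ZONE_W - faceCenterX) / ZONE_W;          // 0..1 into left zone
            return -depth * MAX_SPEED;
        } else if (faceCenterX > SCREEN_W - ZONE_W) {
            float depth = (faceCenterX - (SCREEN_W - ZONE_W)) / ZONE_W; // 0..1 into right zone
            return depth * MAX_SPEED;
        }
        return 0f; // center region: camera does not move
    }

    public static void main(String[] args) {
        System.out.println(cameraSpeed(320f)); // screen center: 0.0
        System.out.println(cameraSpeed(580f)); // halfway into the right zone: 1.0
        System.out.println(cameraSpeed(0f));   // fully into the left zone: -2.0
    }
}
```

Each frame, the returned speed would be applied to the Unity camera's position along its pre-defined path, so leaning further toward a screen edge accelerates the traversal.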