(00:00) The ShadowEngine: Cinematic and Object Control
(00:11) Unity editor customised as a play and performance space.
(00:19) Remote: Cinematic screen: Iris, Camera controls, FX controls
(00:27) Remote: TouchOSC - Puppet controls. Multi-touch area. Position, rotation and scale. Resting position (hang)
(00:34) Remote: Camera controller - position and zoom. Other custom FX controls (doors, smoke, flame)
(00:52) Scene View: Puppets can be arranged and moved in the scene view.
(00:58) Custom UI: performs quick selections and actions.
(01:10) Scene view panning and zooming does not affect the projected view.
(01:21) Physics Configuration: the bird figure (in this iteration) has high drag/angular drag settings. Movement should dampen to stillness quickly.
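A minimal sketch of that physics setup, assuming a Rigidbody2D-based rig and the pre-Unity-6 drag/angularDrag API; the class name and values here are placeholders rather than the project's actual ones:

```csharp
using UnityEngine;

// Applies heavy damping to every rigid-body part of a figure so that
// dragged movement settles to stillness quickly once a touch is released.
public class HeavyDampingSetup : MonoBehaviour
{
    [SerializeField] float drag = 8f;          // placeholder value, tuned per figure
    [SerializeField] float angularDrag = 10f;  // placeholder value

    void Awake()
    {
        foreach (var body in GetComponentsInChildren<Rigidbody2D>())
        {
            body.drag = drag;
            body.angularDrag = angularDrag;
        }
    }
}
```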
(01:31) Remote: An iPad running TouchOSC sends signals to the camera fading script.
(01:38) Remote control: some state changes are automated. Here the fade in/out happens on a toggle.
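A sketch of how such a fade script might look, assuming the projected view is covered by a full-screen UI Image; the class and method names are invented for illustration, and the OSC plumbing that calls SetFadedOut is omitted:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Eases a full-screen black overlay towards a target alpha whenever the
// remote toggle flips between faded-out (1) and faded-in (0).
public class CameraFader : MonoBehaviour
{
    [SerializeField] Image blackOverlay;     // full-screen Image over the projected view
    [SerializeField] float fadeSpeed = 1.5f; // placeholder: alpha units per second

    float targetAlpha;

    // Wired to the incoming TouchOSC toggle message.
    public void SetFadedOut(bool fadedOut) => targetAlpha = fadedOut ? 1f : 0f;

    void Update()
    {
        var colour = blackOverlay.color;
        colour.a = Mathf.MoveTowards(colour.a, targetAlpha, fadeSpeed * Time.deltaTime);
        blackOverlay.color = colour;
    }
}
```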
(01:46) Remote Control: the Iris controller. Touch control adds a nuance to tempo and pacing.
(02:00) Remote Control: the iris wipe centre can be positioned by a remote camera operator/director.
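One possible shape for that iris control, assuming a full-screen material with centre and radius parameters; the shader property names and smoothing value are assumptions, not the project's actual ones:

```csharp
using UnityEngine;

// Forwards a remotely-set centre (0..1, 0..1) and radius to an iris-wipe
// material, easing towards the targets so the wipe keeps a sense of tempo.
public class IrisWipeController : MonoBehaviour
{
    [SerializeField] Material irisMaterial;  // full-screen iris material
    [SerializeField] float smoothing = 8f;   // placeholder easing factor

    Vector2 targetCentre = new Vector2(0.5f, 0.5f);
    Vector2 centre = new Vector2(0.5f, 0.5f);
    float targetRadius = 1f;
    float radius = 1f;

    // Called by the OSC handler with normalised TouchOSC values.
    public void SetCentre(float x, float y) => targetCentre = new Vector2(x, y);
    public void SetRadius(float r) => targetRadius = r;

    void Update()
    {
        float t = smoothing * Time.deltaTime;
        centre = Vector2.Lerp(centre, targetCentre, t);
        radius = Mathf.Lerp(radius, targetRadius, t);
        irisMaterial.SetVector("_IrisCentre", centre); // assumed property name
        irisMaterial.SetFloat("_IrisRadius", radius);  // assumed property name
    }
}
```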
(02:11) Rigging: the bird is an early example of a 'spring network', where control points drag particular parts of the figure.
(02:19) Multi-touch: TouchOSC was the first solution I had for multi-user, multi-touch control. There is a complexity to that solution. Direct mouse control can also be used.
(02:25) Remote: Cinematic control can add blur, motion blur, and an experimental depth of field effect.
(02:34) The blur (when used dynamically) can simulate a number of visual anomalies of the surface/shadow interaction.
(02:39) Exploration: techniques that blend physics-based animation, pose-to-pose animation and interactive control are a major focus of the current study.
(02:46) Motion blur.
(02:47) Good physics stability.
(02:54) VFX: Colour, Greyscale and Monochrome modes
(03:42) VFX - Inversion
(03:53) VFX: Noise and movement effects
(04:13) VFX: Colour effects: Tints and tones
(04:33) VFX: Light Shafts offer a neat simulation of shadow projection artefacts.
(04:53) VFX: The light can be set to follow a specific object, creating an illusion of a second moving spotlight.
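A sketch of the follow behaviour described here, with invented names and a placeholder smoothing value; it simply tracks a chosen puppet with an offset so the light shaft reads as a second moving spotlight:

```csharp
using UnityEngine;

// Keeps the light-shaft source trailing a chosen puppet object.
public class FollowTarget : MonoBehaviour
{
    [SerializeField] Transform target;        // puppet object to follow
    [SerializeField] Vector3 offset;          // keeps the source off the screen plane
    [SerializeField] float followSpeed = 4f;  // placeholder smoothing

    void LateUpdate()
    {
        if (target == null) return;
        transform.position = Vector3.Lerp(
            transform.position, target.position + offset, followSpeed * Time.deltaTime);
    }
}
```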
(05:07) Movement and Control: character rotation has proved tricky to solve. Here, scaling the character by -1 on the X-axis creates the illusion of a turn, but it lacks grace and visual flow.
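The flip itself is a one-liner, which is also why it reads as abrupt: the mirror happens in a single frame rather than as a continuous turn. A minimal sketch (names invented):

```csharp
using UnityEngine;

// Mirrors the figure by negating its X scale; an instant flip, not a turn.
public class FlipCharacter : MonoBehaviour
{
    public void Flip()
    {
        var scale = transform.localScale;
        scale.x *= -1f;
        transform.localScale = scale;
    }
}
```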
(05:23) Depth and Perspective: The Orthographic to Perspective toggle.
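A minimal sketch of that toggle, assuming a single projecting camera; switching Camera.orthographic moves between the flat, screen-like view and one with perspective depth cues:

```csharp
using UnityEngine;

// Toggles the projecting camera between orthographic and perspective modes.
public class ProjectionToggle : MonoBehaviour
{
    [SerializeField] Camera projectedCamera;

    public void Toggle()
    {
        projectedCamera.orthographic = !projectedCamera.orthographic;
    }
}
```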
(05:44) Remote Control: The camera controller. Dollying, zooming, panning. These are filmic conventions often emulated in shadow theatre, although there is normally no concept of a camera, only of the screen.
(05:54) VFX and Control: Scaling objects, combined with blur, could simulate the dynamic qualities of objects, light source and distance from the screen. However, this operates in 'global' screen space.
(05:59) Coding: The interpolation of the camera movement between touches is neat and an effect required elsewhere in the system.
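A sketch of that interpolation, assuming the remote sends absolute position and zoom targets; rather than snapping to each new value, the camera eases towards the latest target between touches (names and smoothing times are placeholders):

```csharp
using UnityEngine;

// Smooths incoming remote camera targets so movement between touches flows.
public class SmoothedCameraController : MonoBehaviour
{
    [SerializeField] Camera cam;
    [SerializeField] float positionSmoothTime = 0.25f; // placeholder
    [SerializeField] float zoomSmoothTime = 0.25f;     // placeholder

    Vector3 targetPosition;
    float targetSize;
    Vector3 positionVelocity;
    float sizeVelocity;

    void Start()
    {
        targetPosition = cam.transform.position;
        targetSize = cam.orthographicSize;
    }

    // Called by the OSC handler with the latest remote values.
    public void SetTarget(Vector3 position, float orthographicSize)
    {
        targetPosition = position;
        targetSize = orthographicSize;
    }

    void Update()
    {
        cam.transform.position = Vector3.SmoothDamp(
            cam.transform.position, targetPosition, ref positionVelocity, positionSmoothTime);
        cam.orthographicSize = Mathf.SmoothDamp(
            cam.orthographicSize, targetSize, ref sizeVelocity, zoomSmoothTime);
    }
}
```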
(06:04) Issue: a big problem with TouchOSC: you have to guess how the blank control space maps to the screen space. The solution explored in later iterations was to recreate the remote-control interface in a separate Unity project, where mirrors of the puppet objects themselves are present on the control surface.
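The guesswork amounts to treating the pad's normalised (0..1) output as viewport coordinates on the projecting camera; a sketch of that assumed mapping:

```csharp
using UnityEngine;

// Maps a normalised TouchOSC pad coordinate to a world position by assuming
// the pad's edges line up with the camera's viewport edges.
public class OscToWorldMapper : MonoBehaviour
{
    [SerializeField] Camera projectedCamera;
    [SerializeField] float planeDistance = 10f; // assumed distance of the puppet plane

    public Vector3 Map(float oscX, float oscY)
    {
        // The alignment is a guess: the blank pad gives no visual reference
        // for where its edges fall on the projected screen.
        return projectedCamera.ViewportToWorldPoint(new Vector3(oscX, oscY, planeDistance));
    }
}
```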
(06:22) Smoke and fire.
(06:39) Remote control: Custom controllers can be quickly created and assigned to control effects. Here, smoke and flame parameters can be changed.
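A sketch of how a remote fader might drive those parameters, assuming the smoke and flame are ParticleSystems; the ranges and the SetIntensity entry point are placeholders, not the project's actual controller:

```csharp
using UnityEngine;

// Maps a 0..1 remote fader value onto emission rate and particle size.
public class FxParameterControl : MonoBehaviour
{
    [SerializeField] ParticleSystem effect;   // smoke or flame system

    public void SetIntensity(float value)
    {
        var emission = effect.emission;
        emission.rateOverTime = Mathf.Lerp(0f, 50f, value); // placeholder range

        var main = effect.main;
        main.startSize = Mathf.Lerp(0.2f, 1.5f, value);     // placeholder range
    }
}
```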
(06:50) Barrel distortion. Chromatic aberration. These neat effects were chosen as they simulate optical phenomena, although their application is far more dynamic and controllable in digital space.
(07:00) The offset effect is reminiscent of space between object and surface and variations in the position of the light source. A creative source of play in analogue shadow theatre.
(07:29) There is a quasi-anaglyphic quality to this. There are digital image processing effects that would render anaglyphic edges to the objects. Attention would need to be paid to object placement, offsets and the use of a perspective camera.
(07:39) Image production: the ability to construct complex, interesting images is an unforeseen outcome of the system.
(07:53) Remote Control: Reset buttons. Return to sane starting values. Each animatable object requires a 'resettable' state.
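A sketch of what a 'resettable' state could look like per object: capture the sane starting values on load, then restore them (and clear any residual physics motion) when the remote reset button fires. Names are invented for illustration:

```csharp
using UnityEngine;

// Stores an object's starting transform and restores it on demand.
public class ResettableObject : MonoBehaviour
{
    Vector3 startPosition;
    Quaternion startRotation;
    Vector3 startScale;

    void Awake()
    {
        startPosition = transform.position;
        startRotation = transform.rotation;
        startScale = transform.localScale;
    }

    public void ResetToStart()
    {
        transform.position = startPosition;
        transform.rotation = startRotation;
        transform.localScale = startScale;

        // If the object is physics-driven, also clear its momentum.
        var body = GetComponent<Rigidbody2D>();
        if (body != null)
        {
            body.velocity = Vector2.zero;
            body.angularVelocity = 0f;
        }
    }
}
```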
(07:59) Movement controllers: Different layouts exploring different ways of separating and assigning first, second, third, fourth (and more) touches.
(08:08) Issue: critique of this particular implementation of multi-touch.
(08:20) Configuring Touch: each touch is mapped to a control point. The control is the anchor of a spring joint attached with a strong attraction to a rigid-body puppet part.
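A sketch of that touch-to-spring mapping, assuming 2D physics: a kinematic control body follows the touch, and a stiff SpringJoint2D pulls the puppet part towards it. The frequency and damping values are placeholders standing in for 'strong attraction':

```csharp
using UnityEngine;

// One control point per touch: the touch moves a kinematic body, and a
// stiff spring joint drags the attached rigid-body puppet part after it.
public class TouchControlPoint : MonoBehaviour
{
    [SerializeField] Rigidbody2D puppetPart;  // the part this touch drags
    [SerializeField] float frequency = 6f;    // placeholder: higher = stronger pull
    [SerializeField] float dampingRatio = 1f; // placeholder: critically damped

    Rigidbody2D controlBody;

    void Awake()
    {
        controlBody = gameObject.AddComponent<Rigidbody2D>();
        controlBody.bodyType = RigidbodyType2D.Kinematic;

        var spring = puppetPart.gameObject.AddComponent<SpringJoint2D>();
        spring.connectedBody = controlBody;
        spring.autoConfigureDistance = false;
        spring.distance = 0f;
        spring.frequency = frequency;
        spring.dampingRatio = dampingRatio;
    }

    // Called with the touch position already mapped into world space.
    public void MoveTo(Vector2 worldPosition)
    {
        controlBody.MovePosition(worldPosition);
    }
}
```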
(08:25) Scaling: transforming body parts.
(08:45) Issues with scaling.