Courtney and I recently finished our new projection-mapped live show. We decided to take footage from our debut shows and make this little demo video to share with y'all on the interweb.
All audio and video clips are triggered and manipulated in sync from our MIDI controllers. Our silhouettes are masked and tracked in real time using a Kinect and the software chain listed below. So much fun to play live!
For all of my fellow A/V geeks out there, here's a quick list of what I use to create and run the video side of this show:
/// Hardware ///
- Hitachi CP-A100 short throw projector
- Microsoft Kinect for silhouette masking and joint tracking
- MacBook Pro 2.4 GHz i7
- MIDI controllers: 2 SPD-S drum pads, a joystick controller, and a SoftStep foot controller
/// Software ///
- Resolume Arena 4 (MIDI synced to Ableton Live)
- Adobe After Effects & Photoshop
- Quartz Composer (custom compositions using Syphon and Synapse patches to feed silhouette tracking info in/out of Resolume)
- IR Mapio
- Syphon QC & FFGL
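For anyone curious how the Kinect tracking data moves between apps: Synapse publishes joint positions as OSC messages over UDP, which the Quartz Composer patches can read and relay. Here is a minimal, standard-library-only sketch of encoding and decoding float-argument OSC messages of that shape. The address `/righthand` and the Synapse ports mentioned in the comments are assumptions based on Synapse's documented defaults, not something from this post:

```python
import struct

def _osc_string(s: str) -> bytes:
    """OSC string: ASCII bytes null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Build an OSC message whose arguments are all 32-bit big-endian floats."""
    msg = _osc_string(address) + _osc_string("," + "f" * len(args))
    for a in args:
        msg += struct.pack(">f", a)
    return msg

def osc_parse(data: bytes):
    """Decode a float-only OSC message back into (address, [args])."""
    end = data.index(b"\x00")
    address = data[:end].decode("ascii")
    offset = (end // 4 + 1) * 4          # skip address padding
    tend = data.index(b"\x00", offset)
    typetags = data[offset:tend].decode("ascii")
    offset = (tend // 4 + 1) * 4         # skip typetag padding
    args = []
    for t in typetags[1:]:               # typetags start with ','
        if t == "f":
            args.append(struct.unpack(">f", data[offset:offset + 4])[0])
            offset += 4
    return address, args

# A joint-position message like Synapse might emit (x, y, z in its
# body-relative coordinates -- values here are made up for illustration):
packet = osc_message("/righthand", 0.5, -0.25, 1.0)

# To actually talk to Synapse you would sendto/recvfrom a UDP socket on
# its default ports (12345/12346 by its docs); omitted here to keep the
# sketch self-contained.
```

In our rig the equivalent plumbing lives inside Quartz Composer patches, but the wire format is the same, so a sketch like this is handy for debugging what Synapse is actually sending.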
Next stop: 3D motion graphics & hopefully some touring. Wish us luck!
Special thanks go to:
All the wonderful people who donated to us on Kickstarter & made this possible.
Justin Bergonzoni, Todd Hailstone, and Mike Thompson for the footage.
Lucas Bang for helping me research projection mapping and Kinect masking.