Traditionally, it has been difficult to play synthesizers expressively. Using the Kinect's body tracking, it's possible to forgo a physical interface like knobs or keys and build something that feels natural. This video shows real-time tracking data turned into MIDI notes and control values, allowing full live control of instruments.
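
For a rough sense of the idea (not the exact patch in the video), here is a minimal Python sketch that maps a tracked hand position to MIDI messages using the mido library. The `get_hand_position()` function is a placeholder standing in for whatever OpenNI skeleton callback supplies joint coordinates; the CC number and note range are arbitrary choices for illustration.

```python
# Hypothetical sketch: turn Kinect hand-tracking data into MIDI.
# Assumes the `mido` library; get_hand_position() is a stand-in
# for a real OpenNI skeleton-tracking callback.
import time
import mido

def get_hand_position():
    """Placeholder for OpenNI joint data: returns (x, y) normalized to 0.0-1.0."""
    return 0.5, 0.5

def to_cc(value):
    """Scale a normalized 0.0-1.0 coordinate to a 0-127 MIDI value."""
    return max(0, min(127, int(value * 127)))

outport = mido.open_output()  # default system MIDI output

try:
    while True:
        x, y = get_hand_position()
        # Horizontal position -> a control value (CC 74, often filter cutoff);
        # vertical position -> note pitch over a two-octave range.
        outport.send(mido.Message('control_change', control=74, value=to_cc(x)))
        note = 48 + int(y * 24)
        outport.send(mido.Message('note_on', note=note, velocity=100))
        time.sleep(0.05)  # roughly 20 updates per second
        outport.send(mido.Message('note_off', note=note))
finally:
    outport.close()
```

In practice the MIDI output would be routed to a soft synth (for example NI Massive inside Ableton Live), with smoothing applied to the tracking data to avoid jittery control values.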

Even at these early testing stages, I'm excited at the possibility of expressively improvising with software instruments.

My next goal is to loop and layer separate parts to build complete songs, and to move through phrases seamlessly.

OpenNI
Ableton Live/NI Massive
