A test of motion detection in Quartz Composer 3.0.
UPDATE: a new version, written in C++, can be found at vimeo.com/1219327, with more advanced visualisations.
The music is all generated in real-time by me waving my fingers, hands and arms around (or in fact by any motion) in front of a standard webcam. No post-processing was done on the audio or the video.
The concept is by no means new, but it's still fun, and I'm quite happy with this implementation. I'm using a very simple frame-difference technique and generating MIDI notes wherever there is movement (actually, since QC3 cannot send MIDI notes directly, I send the data as OSC and use OSCulator to forward it as MIDI).
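The frame-difference idea can be sketched in a few lines. This is not the original Quartz Composer patch, just a minimal illustration under assumed details: the frame is treated as a grid of grayscale cells, the threshold value and the mapping of horizontal position to pitch are both made up for the example.

```python
def motion_notes(prev, curr, threshold=30, base_note=60):
    """Compare two grayscale frames (2D lists of 0-255 values, one per
    grid cell) and return MIDI-style note numbers for columns with motion.
    threshold and base_note are illustrative, not the values used in the video."""
    notes = set()
    for y in range(len(curr)):
        for x in range(len(curr[0])):
            # a cell counts as "moving" if its brightness changed enough
            if abs(curr[y][x] - prev[y][x]) > threshold:
                # map horizontal position to pitch (left = low, right = high)
                notes.add(base_note + x)
    return sorted(notes)

prev = [[0, 0, 0, 0]] * 3
curr = [[0, 0, 200, 0]] * 3
print(motion_notes(prev, curr))  # -> [62]
```

In the real patch the note data would then go out over OSC rather than being printed, with OSCulator translating it to MIDI.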
I set up a few scales so that only the notes in the chosen scale trigger, and in this video I demo some of them: chromatic (all the notes on a piano), diminished (for a nice tense feeling), pentatonic (for a nice bluesy vibe), and Zirguleli Hicaz (a Turkish scale which is by far my favourite; it comes in around 1:33, so if you get bored at the beginning skip to that bit).
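Restricting triggers to a scale amounts to quantizing raw note numbers to a set of allowed degrees. A hedged sketch, with standard interval sets for two of the scales mentioned (the root note and snap-downward behaviour are assumptions, not taken from the original patch; Zirguleli Hicaz is omitted rather than guessed at):

```python
# Interval sets in semitones above the root; these are standard spellings.
SCALES = {
    "chromatic":  list(range(12)),
    "pentatonic": [0, 3, 5, 7, 10],           # minor pentatonic
    "diminished": [0, 2, 3, 5, 6, 8, 9, 11],  # whole-half diminished
}

def snap_to_scale(note, scale, root=60):
    """Snap a MIDI note down to the nearest allowed degree of `scale`."""
    octave, degree = divmod(note - root, 12)
    allowed = [d for d in scale if d <= degree]
    snapped = allowed[-1] if allowed else scale[0]
    return root + octave * 12 + snapped

print(snap_to_scale(62, SCALES["pentatonic"]))  # 62 is not in the scale -> 60
print(snap_to_scale(63, SCALES["pentatonic"]))  # 63 (degree 3) is allowed -> 63
```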
What I like about this example is that no special hardware or user configuration is needed: it's just a standard webcam looking at a standard me, with no special clothing or anything. Anyone with a webcam can just load the software and start playing. Of course a more specific setup (e.g. an IR camera filming someone wearing gloves with reflective fingertips) would definitely provide much more control, and I think I will try that someday too.
More info and source code at memo.tv/webcam_piano