Cell phone-based beat sequencer, a small-crowd improvisation interface. Developed with a webcam, projector, and openFrameworks. The video frame is divided into a grid of squares, each thresholded on its average brightness; when the sweeping playhead line reaches a bright square, a MIDI note is sent to Reason.
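A rough sketch of that grid-thresholding logic, in Python for brevity (the actual app is openFrameworks/C++; the grid size, threshold, note mapping, and all names here are made up for illustration):

```python
# Hypothetical sketch: grid brightness thresholding + playhead column.
GRID_COLS, GRID_ROWS = 8, 4
THRESHOLD = 128            # average-brightness cutoff, 0-255 (assumed)
SCALE = [60, 62, 64, 67]   # one MIDI note per grid row (assumed mapping)

def active_cells(frame):
    """frame: 2D list of grayscale pixels. Returns the set of (col, row)
    grid cells whose average brightness exceeds THRESHOLD."""
    h, w = len(frame), len(frame[0])
    ch, cw = h // GRID_ROWS, w // GRID_COLS
    active = set()
    for row in range(GRID_ROWS):
        for col in range(GRID_COLS):
            cell = [frame[y][x]
                    for y in range(row * ch, (row + 1) * ch)
                    for x in range(col * cw, (col + 1) * cw)]
            if sum(cell) / len(cell) > THRESHOLD:
                active.add((col, row))
    return active

def notes_for_step(frame, step):
    """When the playhead line reaches column `step`, return the MIDI note
    numbers to trigger (in the real app these go out to Reason)."""
    col = step % GRID_COLS           # playhead wraps around the grid
    return sorted(SCALE[row] for c, row in active_cells(frame) if c == col)
```

In the installation the "pixels" come from the webcam watching the crowd's phone screens, and the playhead advances on a timer, so a bright phone held up in a given grid square sustains a note at that step of the loop.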
This was mostly inspired by "I Eat Beats" (vimeo.com/625464), which could only support two or three hands at once -- I wanted to allow for more people.
This first performance/demo is from a local open mic. Next time I'd like to get some more musicians involved -- maybe a bassist and kit drummer improvising with the crowd...?
The source is temporarily available here: rpi.edu/~mcdonk/random/iSeeBeats.zip
Webcam/computer-vision beat sequencers lend themselves to a lot of fun, easy variations. Others to try: top-down tracking of people; tracking faces; tracking AR tags; tracking inanimate objects or animals...