Following in the footsteps of Memo et al., I am trying to create an ambient music generator that uses video input to seed and trigger the sounds.
Even with Memo's Quartz Composer patch to follow, it has taken me a couple of days to understand how it all works. Finally, on the train to work this morning, I got something working.
Here we see:
1) Standard feed.
2) Detection grid activated.
3) Grid sensitivity tweaked.
4) Normal feed disabled (looks cool; I think I'll play with that some more).
5) The difference layer that is being tracked. It's really edge detection.
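The core of steps 2, 3, and 5 can be sketched in a few lines. This is just my reading of the technique, not Memo's patch: difference consecutive frames, carve the result into a grid, and let a cell "activate" (which would trigger a sound) when its mean difference exceeds a sensitivity threshold. The grid size and threshold values here are arbitrary, and frames are plain nested lists of grayscale values; a real version would pull frames from a camera with something like OpenCV.

```python
def frame_difference(prev, curr):
    """Per-pixel absolute difference of two grayscale frames."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def active_cells(diff, grid=(4, 4), sensitivity=10.0):
    """Return (row, col) of grid cells whose mean difference exceeds the threshold."""
    h, w = len(diff), len(diff[0])
    rows, cols = grid
    ch, cw = h // rows, w // cols
    hits = []
    for gy in range(rows):
        for gx in range(cols):
            cell = [diff[y][x]
                    for y in range(gy * ch, (gy + 1) * ch)
                    for x in range(gx * cw, (gx + 1) * cw)]
            if sum(cell) / len(cell) > sensitivity:
                hits.append((gy, gx))
    return hits

# Two 8x8 frames: a bright blob appears in the top-left corner.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in range(4):
    for x in range(4):
        curr[y][x] = 200

diff = frame_difference(prev, curr)
print(active_cells(diff, grid=(2, 2), sensitivity=50))  # → [(0, 0)]
```

Tweaking the grid sensitivity (step 3) is just raising or lowering that threshold: higher values mean only strong motion fires a cell, which keeps the ambient output sparse.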