Multiple spheres visualize the sound of [LL005] sysEx in this video. They react heavily to the sound, but also move a little independently, much like the nature of SysEx itself. There are also lots of video glitches, deliberately put in to visualize the music. This is the first music video I've made myself for my own music.
This is a video I did for Smartslab as part of an installation at The Building Centre. It is meant to show the future of Smartslab (architectural displays with hexagonal pixels).
The eyeball tracking is done with an eyebox (see Google), which only works under perfect conditions. Hand tracking is done in Processing with two XBox Live cameras. Everything is piped into Quartz Composer using a custom-built network socket plugin.
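The piping works by streaming tracking data over a plain network socket. Below is a minimal Python sketch of that idea: a client sends (x, y) hand-tracking coordinates as text lines over TCP, and a loopback server reads them back. The coordinate format and port are my own assumptions for illustration, not the actual protocol of the Processing sketch or the Quartz Composer plugin.

```python
import socket
import threading

def send_coords(coords, host="127.0.0.1", port=9000):
    """Send (x, y) tracking coordinates as newline-terminated text
    over a TCP socket -- the same idea as piping hand-tracking data
    from Processing into a Quartz Composer socket plugin.
    (Format and port are illustrative assumptions.)"""
    with socket.create_connection((host, port)) as s:
        for x, y in coords:
            s.sendall(f"{x} {y}\n".encode())

def demo():
    """Tiny loopback server to demonstrate the round trip."""
    server = socket.socket()
    server.bind(("127.0.0.1", 0))  # let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    t = threading.Thread(
        target=send_coords, args=([(0.25, 0.75), (0.5, 0.5)],),
        kwargs={"port": port})
    t.start()

    conn, _ = server.accept()
    data = b""
    while chunk := conn.recv(1024):  # read until client closes
        data += chunk
    t.join()
    conn.close()
    server.close()

    # parse the lines back into coordinate pairs
    return [tuple(map(float, line.split()))
            for line in data.decode().splitlines()]

if __name__ == "__main__":
    print(demo())
```

A line-based text protocol like this is easy to debug with netcat and trivial to parse on the receiving end, which is presumably why so many art installations wire components together this way (OSC being the more structured alternative).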
A hack I did in 2007 for CocoaCollider, now updated to use the standard SCNSObject in SuperCollider.
Code is available here: http://fredrikolofsson.com/f0blog/?q=node/442
SuperCollider does the sound synthesis and at the same time talks to Google Earth via AppleScript.
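The mechanics of this are straightforward: on macOS you can shell out to `osascript` with an AppleScript snippet (SuperCollider can do this with `String.unixCmd`). Here is a rough Python sketch of the same pattern. The inner AppleScript vocabulary (the `fly to` command and record fields) is a placeholder assumption, not Google Earth's verified scripting dictionary; only the `osascript -e` invocation itself is standard.

```python
import subprocess
import sys

def google_earth_script(lat, lon, range_m=1000):
    """Build an AppleScript snippet asking Google Earth to move the view.
    NOTE: the command inside the tell-block is a placeholder -- check
    Google Earth's actual scripting dictionary before relying on it."""
    return (
        'tell application "Google Earth"\n'
        f'  fly to {{latitude:{lat}, longitude:{lon}, range:{range_m}}}\n'
        'end tell'
    )

def fly_to(lat, lon, range_m=1000):
    """Shell out to osascript (macOS only), the same way SuperCollider
    can via "...".unixCmd. Returns the script for inspection."""
    script = google_earth_script(lat, lon, range_m)
    if sys.platform == "darwin":  # osascript only exists on macOS
        subprocess.run(["osascript", "-e", script], check=False)
    return script

if __name__ == "__main__":
    print(fly_to(52.52, 13.405, 500))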