I've just finished the first prototype of a project I've been working on - a Processing-based app which lets Flash/AIR apps send MIDI (note and CC) data - and I've put together a video showing an early app demo.
The demo shows the Processing-based MIDI server (the blue app in the bottom right), along with a Flash webcam-based, synesthesia-style audio tool. The Flash app takes still images from a webcam and analyses each one for four dominant colours. These colours are then mapped to a range of notes - so, for example, a white image will send a high note, whilst a black image will send a low note. These notes are played back by sending out MIDI data, and the corresponding images are displayed.
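The colour-to-note mapping described above could be sketched roughly as below. This is a hypothetical illustration, not the actual app's code: the method name, the brightness input (0-255), and the note range are all my own assumptions, but it shows the basic idea of scaling a light-to-dark value onto a span of MIDI note numbers.

```java
public class BrightnessToNote {
    // Hypothetical mapping: scale a brightness value (0 = black, 255 = white)
    // linearly onto a MIDI note range, so darker images give lower notes.
    static int brightnessToNote(int brightness, int lowNote, int highNote) {
        return lowNote + (brightness * (highNote - lowNote)) / 255;
    }

    public static void main(String[] args) {
        // White image -> top of the range; black image -> bottom.
        System.out.println(brightnessToNote(255, 36, 96)); // prints 96
        System.out.println(brightnessToNote(0, 36, 96));   // prints 36
    }
}
```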
As an extra feature, the "activity level" (amount of movement) in the webcam can be monitored and sent as CC data.
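One simple way to measure that activity level - and again, this is just a sketch of my own making, not the app's actual method - is to take the mean absolute difference between two successive greyscale frames and scale it into the 0-127 CC range:

```java
public class ActivityToCC {
    // Hypothetical activity measure: mean absolute difference between two
    // greyscale frames (pixel values 0-255), scaled to a MIDI CC value (0-127).
    static int activityToCC(int[] prevFrame, int[] currFrame) {
        long total = 0;
        for (int i = 0; i < currFrame.length; i++) {
            total += Math.abs(currFrame[i] - prevFrame[i]);
        }
        int meanDiff = (int) (total / currFrame.length); // average change, 0-255
        return (meanDiff * 127) / 255;                   // rescale to CC range
    }

    public static void main(String[] args) {
        int[] still = {100, 100, 100, 100};
        int[] moved = {200, 0, 200, 0};
        System.out.println(activityToCC(still, still)); // no movement, prints 0
        System.out.println(activityToCC(still, moved)); // lots of movement
    }
}
```

A static image yields a CC value of 0, while heavy movement pushes it towards 127.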
Most of the sounds coming out of this demo are pretty nasty, but it shows the basic idea of the Flash Midi Server - which could be used in loads of different ways. Eventually I'd like to release a suite of AIR apps - interactive and generative music toys using various Flash capabilities - and invite other people to contribute their own.
Hopefully there'll be some more polished versions coming soon.
More info coming soon at LawrieCape.co.uk/theblog