Using "Kandinsky's Kamera", a software application I designed in the Processing environment, I created an improvisational music and video piece. My improvisations were primarily motivated by my reactions to the visuals on screen. The audio and video are recorded directly by my laptop's built-in devices, and the software interprets the audio as it records it: amplitude is translated to brightness and zoom, and frequency is translated to hue.
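The amplitude-to-brightness and frequency-to-hue mappings could be sketched roughly as below. This is a minimal illustration, not the application's actual code: the normalization ranges, the logarithmic frequency scale, and the function names are all assumptions for the example (Processing sketches are Java, so plain Java is used here).

```java
// Hypothetical sketch of the audio-to-visual mapping described above.
// The ranges and log-scaling are illustrative assumptions, not the app's real code.
public class AudioToColor {

    // Map a normalized amplitude (0.0 to 1.0) to a brightness value (0 to 255).
    static int amplitudeToBrightness(double amp) {
        double clamped = Math.min(1.0, Math.max(0.0, amp));
        return (int) Math.round(clamped * 255.0);
    }

    // Map a frequency in Hz to a hue (0 to 360 degrees),
    // log-scaled across the rough audible range of 20 Hz to 20 kHz.
    static double frequencyToHue(double hz) {
        double lo = Math.log(20.0);
        double hi = Math.log(20000.0);
        double clamped = Math.min(20000.0, Math.max(20.0, hz));
        double t = (Math.log(clamped) - lo) / (hi - lo);
        return t * 360.0;
    }

    public static void main(String[] args) {
        // A mid-level amplitude lands at mid brightness.
        System.out.println(amplitudeToBrightness(0.5));
        // Concert A (440 Hz) falls partway around the hue circle.
        System.out.println(frequencyToHue(440.0));
    }
}
```

In a real Processing sketch, these values would feed directly into `colorMode(HSB, 360, 100, 100)` and the zoom transform each frame.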
The software application is in beta, but may be distributed upon request.