A responsive audio-visual installation exploring the usefulness of an iOS device for interaction. The device controls which animation is displayed (generated by a remote computer) and lets the user create or delete animations. The device's orientation selects an instrument (drum machine, polyphonic synth, polyphonic timestretcher, polyphonic time freezer) or stops audio altogether. Each touch layers an animation on the screen and adds a voice to the instrument (up to eight touches are supported). For the synth, the accelerometer's x, y, and z axes act as pitch, volume, and spatialization modifiers. For the other instruments, touch position and duration affect parameters such as volume/spatialization, speed/pitch, probability of being triggered, etc. Each touch is represented to the user by a ball of random lines.
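For the curious, here is a minimal Swift sketch of the two input mappings described above: one voice per touch (capped at eight) and the accelerometer's x/y/z axes as synth modifiers. The `send(_:_:)` helper and the OSC-style addresses are hypothetical stand-ins for whatever actually links the device to the remote computer; this is an illustration, not the installation's code.

```swift
import UIKit
import CoreMotion

class InstrumentView: UIView {
    private let motion = CMMotionManager()
    private var voices: [UITouch: Int] = [:]   // one voice per active touch
    private let maxVoices = 8

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        startAccelerometer()
    }

    required init?(coder: NSCoder) { fatalError("not used") }

    // Each new touch claims the lowest free voice, up to eight.
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            guard let v = (0..<maxVoices).first(where: { slot in
                !voices.values.contains(slot)
            }) else { break }
            voices[touch] = v
            let p = touch.location(in: self)
            // Touch position modulates that voice's parameters.
            send("voice/\(v)/x", Double(p.x / bounds.width))
            send("voice/\(v)/y", Double(p.y / bounds.height))
        }
    }

    // Releasing a touch frees its voice. (touchesMoved would update
    // the same addresses while a touch is held.)
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            if let v = voices.removeValue(forKey: touch) {
                send("voice/\(v)/off", 1)
            }
        }
    }

    // Accelerometer x, y, z mapped to pitch, volume, spatialization.
    private func startAccelerometer() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 60.0
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let a = data?.acceleration else { return }
            // Normalise roughly -1...1 g into 0...1 control values.
            self?.send("synth/pitch",          (a.x + 1) / 2)
            self?.send("synth/volume",         (a.y + 1) / 2)
            self?.send("synth/spatialization", (a.z + 1) / 2)
        }
    }

    private func send(_ address: String, _ value: Double) {
        // Hypothetical network hop to the machine generating the
        // animations and audio; printed here so the sketch runs standalone.
        print(address, value)
    }
}
```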
Apologies for the poor video quality. The iOS simulator doesn't support shake gestures or accelerometer data, and allows only two simultaneous touches, so not much can be inferred from this video; the synth etc. couldn't be demoed. The installation should be set up in a space in the New Year, though.