Commissioned by CultureCode, Ed Carter and Matt Jarvis created an audio-visual experience from a dataset gathered during the previously commissioned EYE Project.

Using latitude and longitude coordinates, the map is designed to create sound from any location-based dataset. In this beta version, the data represented are animal sightings from the EYE Project, in which young people geotagged animals in the wild. Melodies are created by playing the coordinates as a step sequencer, with each number mapped to a specific pitch. The type of oscillator generating the sound wave depends on the hemisphere in which the coordinate is located. Further layers of audio are created by passing the dataset through simple text-to-speech software, with the amplitude of the digital voice controlling pitch, or by applying an envelope that lets only percussive elements through. The aim was to create a code that could theoretically be reversed, making it possible to extract the data back from the sounds.
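
As an illustration, here is a minimal Python sketch of the number-to-pitch idea. The pentatonic scale, the step length, and the choice of sine for the northern hemisphere and square for the southern are all assumptions made for demonstration; the description above only states that each number maps to a pitch and that the waveform depends on the hemisphere.

    # A minimal sketch of the coordinate-sequencer idea; the scale,
    # step length, and hemisphere-to-waveform mapping are assumptions.
    import numpy as np
    from scipy.io import wavfile

    SAMPLE_RATE = 44100
    STEP_SECONDS = 0.2
    # Hypothetical scale: digits 0-9 index two octaves of C-major pentatonic.
    PENTATONIC_HZ = [261.6, 293.7, 329.6, 392.0, 440.0,
                     523.3, 587.3, 659.3, 784.0, 880.0]

    def coordinate_digits(value):
        """Strip sign and decimal point, keeping the digits of a coordinate."""
        return [int(ch) for ch in f"{abs(value):.4f}" if ch.isdigit()]

    def oscillator(freq, duration, southern):
        """Sine wave for the northern hemisphere, square for the southern
        (an assumption; the source only says the waveform depends on it)."""
        t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
        wave = np.sin(2 * np.pi * freq * t)
        return np.sign(wave) if southern else wave

    def sequence(lat, lon):
        """Play a coordinate's digits as a step sequence, one pitch per digit."""
        steps = [oscillator(PENTATONIC_HZ[d], STEP_SECONDS, southern=lat < 0)
                 for d in coordinate_digits(lat) + coordinate_digits(lon)]
        return np.concatenate(steps)

    # Example: a sighting geotagged at an arbitrary location.
    audio = sequence(55.2083, -2.0784)
    wavfile.write("sighting.wav", SAMPLE_RATE, (audio * 32767).astype(np.int16))

Because each digit maps to a unique pitch, a listener or program holding the scale table could in principle read the coordinates back out of the melody, which is the reversibility the project aims for.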

The next step is to create a spherical interface, so the map (and dataset) can be moved freely: a 3D music sequencer in which the melodies are created by the dataset, but the order and speed of playback are controlled by a performer, as in the sketch below.
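
A rough sketch of how that performer control might look, assuming the performer supplies an ordering of sightings and a playback rate; the spherical interface itself is beyond this sketch, and all names here are hypothetical.

    # Hypothetical performer control: the dataset fixes the melodies,
    # while order and tempo come from the performer's input.
    import itertools
    import time

    def perform(sightings, order, steps_per_second):
        """Yield (lat, lon) pairs in a performer-chosen order and tempo;
        each yielded coordinate would feed a sequencer like the one above."""
        for index in itertools.cycle(order):
            yield sightings[index]
            time.sleep(1.0 / steps_per_second)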

Ed Carter
modular.org.uk

Matt Jarvis
mattjarvis.co.uk

EyeProject
eyeproject.org.uk

CultureCode
culturecode.co.uk
