Performed: Transitions 2010
Center for Computer Research in Music and Acoustics
September 16, 2010
TweetDreams is an instrument which uses real-time data from Twitter to create an audio-visual performance. The instrument builds networks of related tweets which act as tree-like sequencers. New tweets trigger percolating sequences of melodies, where each tweet’s melody is a mutation of its neighbors’. A visual representation of these relationships pulses in synchrony with the music.
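The idea that each tweet's melody is a mutation of its neighbors' could be sketched as follows. This is an illustrative sketch only, not the actual TweetDreams algorithm; the function name, scale, and mutation rule are assumptions.

```python
import random

def mutate_melody(parent, scale=(60, 62, 64, 67, 69), max_changes=1):
    """Derive a child melody from a neighboring tweet's melody.

    `parent` is a list of MIDI note numbers drawn from `scale`.
    At most `max_changes` notes are replaced, so nearby tweets in the
    network sound similar but not identical. Hypothetical sketch only.
    """
    child = list(parent)
    for _ in range(max_changes):
        i = random.randrange(len(child))
        child[i] = random.choice(scale)
    return child

root = [60, 64, 67, 64]          # melody of the first tweet in a network
neighbor = mutate_melody(root)   # a related tweet inherits a close variant
```

As new tweets attach to the network, repeated application of such a rule yields the percolating, tree-like sequences of related melodies described above.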
TweetDreams is performed by three different groups. The performers shape the performance: they select the search terms used to find tweets and tune parameters that control how tweets are displayed and mapped to music.
Audience members can play the instrument by tweeting. For each performance there is a "local search term", and tweets that include this term are given special precedence. The audience is encouraged to explore the instrument by conversing over Twitter with the performers and with each other.
Lastly, the world is an unwitting player of the instrument. Anyone who tweets with one of the search terms during a performance joins the piece and contributes to the networks of meaning and sound.
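The precedence rules above, which decide whether an incoming tweet joins the piece and at what priority, could be sketched like this. The function and category names are hypothetical, not taken from the actual implementation.

```python
def classify_tweet(text, search_terms, local_term):
    """Classify an incoming tweet during a performance.

    Tweets containing the local search term get special precedence;
    tweets matching any global search term still join the piece;
    everything else is ignored. Hypothetical sketch only.
    """
    lowered = text.lower()
    if local_term.lower() in lowered:
        return "local"    # in-venue audience tweets, highest precedence
    if any(term.lower() in lowered for term in search_terms):
        return "global"   # anyone in the world tweeting a search term
    return None           # not part of the piece
```

For example, a tweet containing the performance's local term would be classified as "local" even if it also matched a global search term.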
TweetDreams is implemented in software and runs on laptop computers, using Python, Processing, and ChucK. The music is rendered in stereo or multi-channel audio, and the visuals are displayed on a large screen.
For the performers, the "virtuosic players" of the instrument, the control interface is a combination of MIDI sliders and command-line input. For the audience, the interface is their own hand-held computing device and Twitter client, together with the audio and visual feedback they receive.