octal_hatch (2003) is a series of works based on materials generated using a live video feedback system. The setup consisted of two antiquated video mixers (equipped with various wipes and internal effects), assorted cameras, monitors, and a laptop processing and feeding footage through itself and into the system. These source materials were then edited, layered, and reprocessed using a variety of software and hardware techniques.
The idea first presented itself to me when Matmos asked me to play contrabass on the track "Snails and Lasers for Patricia Highsmith" from the album "The Rose Has Teeth in the Mouth of a Beast". From them I got the idea of making an abstract work that had conceptual links to someone who had deeply influenced my life and work. In my case I decided to do an "abstract portrait" of the composer and architect Iannis Xenakis.
The sonic materials used in the works have direct links to Xenakis' electronic music: they mainly use his UPIC system and an implementation of GenDy written by Alberto de Campo in the programming language SuperCollider.
The UPIC sounds were generated years before, performed live to DAT tape while I was working in France as a music assistant at Les Ateliers UPIC. The GenDy code uses a stochastic algorithm that Xenakis invented and called "dynamic stochastic synthesis." It should also be noted that the hardware-dependent UPIC has since been replaced in my studio setup by the UPIX, a software-only version created at CEMAMu shortly before the center was sadly shut down.
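To give a rough sense of the principle behind dynamic stochastic synthesis (this is a deliberately simplified sketch, not de Campo's SuperCollider implementation or Xenakis' GENDYN program): a waveform is defined by a small set of breakpoints, and on every cycle each breakpoint's amplitude and segment duration drift by a bounded random walk, reflected off "elastic barriers" at the limits, so the timbre continuously mutates.

```python
import random

def mirror(value, lo, hi):
    """Reflect a value back into [lo, hi] (Xenakis' 'elastic barriers')."""
    while value < lo or value > hi:
        if value < lo:
            value = 2 * lo - value
        else:
            value = 2 * hi - value
    return value

def gendyn(n_breakpoints=12, n_cycles=200, amp_step=0.1, dur_step=0.002, seed=1):
    """Simplified dynamic stochastic synthesis: each cycle, every
    breakpoint's amplitude and segment duration drift by a bounded
    random walk, yielding a list of (time, amplitude) breakpoints."""
    rng = random.Random(seed)
    amps = [0.0] * n_breakpoints
    durs = [0.01] * n_breakpoints
    t = 0.0
    wave = []
    for _ in range(n_cycles):
        for i in range(n_breakpoints):
            amps[i] = mirror(amps[i] + rng.uniform(-amp_step, amp_step), -1.0, 1.0)
            durs[i] = mirror(durs[i] + rng.uniform(-dur_step, dur_step), 0.001, 0.05)
            t += durs[i]
            wave.append((t, amps[i]))
    return wave

wave = gendyn()
```

Connecting the resulting breakpoints with line segments (or interpolating them at audio rate) produces the characteristic gliding, unstable tones of the technique; parameter names and ranges here are illustrative only.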
The UPIC (an acronym of Unité Polyagogique Informatique du CEMAMu) is a drawing tablet linked to a computer, which has a vector display. The user draws waveforms and volume envelopes on the tablet, which are rendered by the computer. Once the waveforms have been stored, the user can compose with them by drawing "compositions" on the tablet, with the X-axis representing cumulative duration, and the Y-axis representing pitch. The compositions can be stretched in duration from a few seconds to an hour. They can also be transposed, reversed, inverted, and subjected to a number of algorithmic transformations. The system allows for real time performance by moving the stylus across the tablet.
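The page-to-sound mapping described above can be sketched in a few lines (hypothetical code, not the actual UPIC or UPIX software): a drawn arc is treated as a list of (x, y) points, with x mapped to cumulative time and y mapped exponentially to pitch.

```python
import math

def render_arc(arc, duration=2.0, sr=8000, f_lo=110.0, f_hi=880.0):
    """Render one drawn 'arc' -- a list of (x, y) points with x in [0, 1]
    as normalized time and y in [0, 1] as pen height -- into audio samples,
    linearly interpolating between points and mapping height to frequency."""
    n = int(duration * sr)
    out = []
    phase = 0.0
    for s in range(n):
        x = s / n
        # find the surrounding pair of arc points and interpolate the height
        for (x0, y0), (x1, y1) in zip(arc, arc[1:]):
            if x0 <= x <= x1:
                y = y0 + (y1 - y0) * (x - x0) / ((x1 - x0) or 1e-9)
                break
        else:
            y = arc[-1][1]
        freq = f_lo * (f_hi / f_lo) ** y   # higher on the page = higher pitch
        phase += 2 * math.pi * freq / sr
        out.append(math.sin(phase))
    return out

audio = render_arc([(0.0, 0.0), (0.5, 1.0), (1.0, 0.25)])
```

Stretching the duration parameter corresponds to the UPIC's ability to play the same page over a few seconds or an hour; the waveform here is fixed to a sine, whereas the UPIC renders the user's own drawn waveforms.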
The title comes from the name of one of the UPIC pages "performed" in the work, a sort of cross-hatched image that, to me at the time, resembled an eye.
Portions of the work premiered in Bristol, England as part of the Sonic Arts Network "Connectors" event, and parts of the version posted here were first screened at the Ninth Annual Activating the Medium Festival in San Francisco, CA.
"Beneath the level of the note lies the realm of sound particles. Each particle is a pinpoint of sound. Recent advances let us probe and manipulate this microacoustical world. Sound particles dissolve the rigid bricks of musical composition (the notes and their intervals) into more fluid and supple materials. The sensations of point, pulse (series of points), line (tone), and surface (texture) emerge as the density of particles increases. Sparse emissions produce rhythmic figures. By lining up the particles in rapid succession, one can induce an illusion of tone continuity or pitch. As the particles meander, they flow into liquid-like streams and rivulets. Dense agglomerations of particles form clouds of sound whose shapes evolve over time." -Curtis Roads
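The point-to-cloud continuum Roads describes can be illustrated with a minimal granular sketch (a generic toy example, not the software used for these works): short enveloped sine grains are scattered across a buffer, and raising the density fuses isolated points into continuous textures.

```python
import math
import random

def grain_cloud(duration=1.0, sr=8000, density=200, grain_dur=0.03, seed=7):
    """Scatter short Hann-windowed sine grains ('particles') across a buffer.
    Low density yields audible points and rhythms; high density yields
    streams and clouds, as in Roads' description."""
    rng = random.Random(seed)
    out = [0.0] * int(duration * sr)
    g_len = int(grain_dur * sr)
    for _ in range(int(density * duration)):
        start = rng.randrange(0, len(out) - g_len)
        freq = rng.uniform(200.0, 2000.0)
        for i in range(g_len):
            env = 0.5 * (1 - math.cos(2 * math.pi * i / g_len))  # Hann window
            out[start + i] += env * math.sin(2 * math.pi * freq * i / sr) * 0.1
    return out

cloud = grain_cloud()
```

All parameters (density, grain duration, frequency range) are invented for illustration; in practice each would be a compositional control.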
POINT LINE CLOUD is a collection of audio and video collaborations between Curtis Roads and myself. It has been an ever-shifting project over the years, one that continues to evolve. The first performance of the materials that grew into the project was in 2001, at a concert with Autechre and Russell Haswell in Los Angeles. Since then it has been performed in many diverse venues around the world.
The three excerpts presented are:
This work contains in part visual source materials provided by Matthew Marsden that were further layered and processed using various digital software tools.
Volt air pt. 3
The source material was generated using the Sandin Image Processor, an analog video synthesizer located at the School of the Art Institute of Chicago. Thanks to Brett Williams and Edward Rankus, who at the time helped me dig deeper into the IP.
Half life pt. 1 Sonal atoms
Was created using only a few seconds of footage that was then edited, layered, processed, and re-processed to create the basis for the work. Curtis' book MICROSOUND had a profound influence on the conception of how to edit and construct this piece, at times with the video cut to the sound on a frame-by-frame level.
The SCAN PROCESSOR STUDIES are a collection of works by Woody Vasulka & Brian O'Reilly.
The full work has an approximate total duration of 45 minutes, with sections of various lengths, textures, and dynamic qualities.
The project first started while Woody and I were working on different commissioned projects at the ZKM (Zentrum für Kunst und Medientechnologie, Karlsruhe, Germany): he and Steina on the exhibition MINDFRAMES, and Garth Knox and I on the DVD and performance SPECTRAL STRANDS: FOR VIOLA AND VISUALS. Woody, Steina, Garth, and I spent many nights screening works for moving images, playing music, and cooking, enveloped in the ghost-town mood the ZKM's kitchen took on at night. During this time there were passionate discussions about video synthesizers (mainly my love for the Sandin Image Processor), and about how Steina's VIOLIN POWER had a huge influence on Garth's and my new series of works.
The source materials were generated by Woody using a Rutt-Etra Scan Processor in the 1970s; they sat on a shelf for years before being recently digitized. Woody came into my studio one day and asked me if I would be interested in using them to work on a collaboration, and the project began from there...
The works use sources excavated directly from the output of the Scan Processor, as well as further manipulations using Tom Demeyer's ImX software, developed with input from Steina. Extensive editing, layering, and additional augmentation were done using Phil Morton's IP. The sound was generated (mostly) by custom software developed by Chandrasekhar Ramakrishnan and myself called NETHER GENERATOR, which sets up a number of complex real-time feedback networks filtered and processed by various means.
SCAN PROCESSOR STUDIES was first exhibited as an installation in the ZKM's MINDFRAMES exhibition.
The source materials from Woody's original experiments with the Scan Processor have also been used in conjunction with further processing on my part to create the base materials for other works, including a three screen version of Woody's piece GRAZING and the work LEVEL & DEGREE OF DARK.
Leech is a multimedia composition that explores the moral and physical dimensions of music piracy. Leech includes components of sonification and music composition, using the actual mechanisms that enable BitTorrent downloads as mined data for real-time algorithmic sound production. Network data is mapped in musically and visually meaningful ways to produce an experience that embodies the look and sound of piracy. Furthermore, the actual music being pirated is itself used as a resource for audio processing and music composition. Performed in real time, the composition provides multi-factorial insight into the world of music piracy.
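Leech's actual mapping scheme is not detailed here; as a purely hypothetical sketch of the general idea of sonifying network data (every name and value below is invented for illustration), download events can be quantized onto a musical scale so that the flow of traffic becomes a stream of pitches.

```python
def sonify_events(events, scale=(0, 2, 4, 7, 9), base_midi=48):
    """Toy data-to-sound mapping (not Leech's actual scheme): each event is a
    (peer_id, n_bytes) pair; the payload size selects a pentatonic scale
    degree and the peer id selects an octave, yielding MIDI note numbers."""
    notes = []
    for peer_id, n_bytes in events:
        degree = scale[n_bytes % len(scale)]
        octave = peer_id % 3
        notes.append(base_midi + 12 * octave + degree)
    return notes

# three invented download events: (peer_id, bytes received)
notes = sonify_events([(1, 16384), (2, 70000), (5, 4097)])  # -> [69, 72, 76]
```

The point of such a mapping is that it is deterministic, so recurring network behavior produces recurring musical figures.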