1. An Instrument for the Sonification of Everyday Things

    01:10

    from Dennis P Paul

    136K Plays / 45 Comments

    This is a serious musical instrument. It rotates everyday things, scans their surfaces, and transforms them into audible frequencies. A variety of everyday objects can be mounted into the instrument. Their silhouettes define loops, melodies and rhythms. Thus mundane things are reinterpreted as musical notation. Playing the instrument is a mixture of practice, anticipation, and serendipity. The instrument was built from aluminum tubes, white POM, black acrylic glass, a high-precision distance-measuring laser (with the kind support of Micro-Epsilon), a stepper motor, and a few bits and bobs. A custom-programmed translator and controller module, written in Processing, transforms the measured distance values into audible frequencies, notes, and scales. It also precisely controls the stepper motor's speed to sync with other instruments and musicians. More information: http://dennisppaul.de/an-instrument-for-the-sonification-of-everday-things/

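    The translator module itself is written in Processing and is not reproduced here; as a rough illustration of the mapping it describes (laser distance readings taken around an object's silhouette, quantized to notes and frequencies), the Python sketch below may help. The sensor range, scale choice, and sample data are assumptions, not the author's actual values.

```python
# Minimal sketch (not the author's Processing code): map one revolution of
# laser distance readings to notes on a pentatonic scale, then to frequencies.
# The distance range and scale choice are assumptions for illustration.

import math

SCALE = [0, 2, 4, 7, 9]            # C major pentatonic, in semitones
DIST_MIN, DIST_MAX = 20.0, 120.0   # assumed sensor range in millimetres

def distance_to_note(d_mm, base_midi=48, span_octaves=2):
    """Quantize a distance reading to a note of the scale."""
    t = (min(max(d_mm, DIST_MIN), DIST_MAX) - DIST_MIN) / (DIST_MAX - DIST_MIN)
    steps = int(round(t * (span_octaves * len(SCALE) - 1)))
    octave, degree = divmod(steps, len(SCALE))
    return base_midi + 12 * octave + SCALE[degree]

def midi_to_hz(note):
    return 440.0 * 2 ** ((note - 69) / 12)

# One simulated revolution: 64 readings of an object's silhouette.
readings = [60 + 30 * math.sin(i / 64 * 2 * math.pi * 3) for i in range(64)]
melody = [distance_to_note(d) for d in readings]
print(melody[:8], [round(midi_to_hz(n), 1) for n in melody[:8]])
```
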
2. Pure Data read as pure data - 2010

    12:30

    from N1C0L45 M41GR3T

    19.8K Plays / 11 Comments

    Pure Data read as pure data is an audiovisual trip through the back of the binary code and its hidden qualities: structure, logic, rhythm, redundancy, composition... In this video version, as a tautological process, the content of the Pure Data application is read as pure data, directly displayed as sounds and pixels. A direct immersion in the heart of data flows.

    Based on Pd version 0.42.5 extended (Mac OS X Intel release): http://puredata.info/downloads
    Extrude function coded by Nicolas Montgermont: http://nim.on.free.fr/ http://artoffailure.org/
    Rework of a series started in 2002: http://peripheriques.free.fr/audio/between01_live_ecm-gantner_2003.mp3 http://peripheriques.free.fr/audio/between01_live_le10neuf_2003.mp3 http://vimeo.com/22040561

    As an installation, the machine investigates its own hard drive's content. More info: http://peripheriques.free.fr/blog

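    The rendering pipeline behind the piece is not detailed above; as a minimal sketch of the general "raw bytes as sound and pixels" idea (not the actual patch), one could dump any binary file to 8-bit audio and to a grayscale image, as below. The file path and output formats are assumptions.

```python
# Sketch of the general "raw bytes as sound and image" technique, not the
# actual patch used in the piece. Paths and formats are illustrative.

import wave

SRC = "/usr/bin/python3"      # any binary file; the piece reads Pd itself

with open(SRC, "rb") as f:
    data = f.read(512 * 512)  # enough bytes for a 512x512 frame

# 1) The same bytes as unsigned 8-bit mono audio at 8 kHz.
with wave.open("bytes.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(1)         # 8-bit samples, exactly one byte per sample
    w.setframerate(8000)
    w.writeframes(data)

# 2) The same bytes as a grayscale image (binary PGM, one byte per pixel).
side = int(len(data) ** 0.5)
with open("bytes.pgm", "wb") as img:
    img.write(f"P5 {side} {side} 255\n".encode())
    img.write(data[: side * side])
```
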
3. Quotidian Record

    02:04

    from Brian House

    74.9K Plays / 9 Comments

    Quotidian Record is a limited edition vinyl recording that features a continuous year of my location-tracking data. Each place I visited, from home to work, from a friend's apartment to a foreign city, is mapped to a harmonic relationship. 1 day is 1 rotation ... 365 days is ~11 minutes. http://brianhouse.net/works/quotidian_record

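    The stated timing follows from a standard 33 1/3 rpm LP: 365 rotations at that speed take roughly 11 minutes. The short sketch below checks that arithmetic and shows one hypothetical way places could be mapped to harmonic ratios; the actual mapping used on the record is not given here.

```python
# Check the stated timing, and sketch one possible place-to-harmony mapping.
# The just-intonation ratios below are an assumption for illustration; the
# actual mapping used on the record is not specified here.

RPM = 100 / 3                      # standard 33 1/3 rpm LP
days = 365
minutes = days / RPM               # one rotation per day
print(f"{days} rotations at 33 1/3 rpm = {minutes:.1f} minutes")  # roughly the ~11 minutes quoted above

BASE_HZ = 220.0
RATIOS = {"home": 1/1, "work": 3/2, "friend": 5/4, "abroad": 9/8}

def place_to_hz(place):
    return BASE_HZ * RATIOS.get(place, 1/1)

for p in ("home", "work", "friend", "abroad"):
    print(p, round(place_to_hz(p), 1), "Hz")
```
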
4. PHENAKISTOMIXER 3.0

    01:24

    from Miss Take

    7,783 Plays / 7 Comments

    The phenakistoscope was an early animation device that used a spinning disc of sequential images and the persistence-of-vision principle to create an illusion of motion. Original phenakistoscope discs had slits cut into them and had to be viewed using a mirror. Phenakistomixer appropriates this by precisely synchronising disc rotation with the shutter of a video camera to achieve a similar effect, and is used as a live visual performance tool. Phenakistomixer version 3.0 is inspired by the Variophone, an early-1930s visual synthesizer in which an optical sensor linearly scans monochromatic plates and translates reflected light intensity into sound waves.

    Concept and Animation: Vesna Krebs
    Programming and Sound: Borut Kumperscak
    Sound engine: Berkan Eskikaya, Louis Pilford

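    The Variophone-style idea mentioned above (reflected light intensity read off a plate and turned into sound) can be sketched by treating a one-dimensional intensity profile as a single waveform cycle and looping it at an audible rate. The profile and parameters below are invented for illustration and are not taken from Phenakistomixer.

```python
# Sketch of the Variophone-style idea described above: treat a scanned
# intensity profile as one cycle of a waveform and loop it at an audible
# frequency. The profile and parameters are invented for illustration.

import math, wave, struct

SR = 44100
profile = [0.5 + 0.5 * math.sin(2 * math.pi * i / 64) ** 3 for i in range(64)]

def render(freq_hz, seconds=1.0):
    """Loop the intensity profile at freq_hz, centred around zero."""
    samples = []
    for i in range(int(SR * seconds)):
        phase = (i * freq_hz / SR) % 1.0            # position along the plate
        value = profile[int(phase * len(profile))]  # nearest-neighbour lookup
        samples.append(int((value - 0.5) * 2 * 32767))
    return samples

samples = render(220.0)
with wave.open("variophone_sketch.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(SR)
    w.writeframes(struct.pack(f"<{len(samples)}h", *samples))
```
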
5. vinyl+ • Expanded Timecode Vinyl

    02:19

    from Jonas Bohatsch

    11.2K Plays / 7 Comments

    vinyl+ Expanded Timecode Vinyl, 2009/10. Project description: vinyl+ is an interactive installation experimenting with the expansion of timecode vinyl. Virtual objects are projected onto the surface of a white record and come to life when the record is played. Their behaviour changes depending on the rotational speed of the record as well as the position of the turntable's needle. The vinyl acts as the screen, interface and apparent carrier for generative audiovisual software pieces. The combination of turntable, computer and projector results in a new device, oscillating between analog and digital, hardware and software. Users are encouraged to spin the record forwards and backwards and to carefully reposition the needle.

    Exhibitions: 2009: Alias in Wonderland, Vienna, Austria. 2010: EMAF, Osnabrück, Germany; FILE, São Paulo, Brazil; NEWAIR, Vienna, Austria; NODE, Frankfurt, Germany. 2011: Cloud Sounds, NIMK, Amsterdam.

    Supported by City of Vienna/Department of Culture (Wien Kultur). Also thanks to Native Instruments for support! For more info visit http://jonasbohatsch.net Yes, we had some problems with the focus when shooting this video...

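    The control flow described above (a timecode vinyl reporting needle position and rotational speed, which then drive the projected visuals) can be summarised in a toy mapping. The decoding of the timecode signal and the visuals themselves are out of scope here; the scene names and the mapping below are purely illustrative, not taken from vinyl+.

```python
# Toy sketch of the control mapping described above: a timecode-vinyl reader
# reports needle position and rotational speed, which then drive a generative
# visual. The scene names and mapping are illustrative only.

from dataclasses import dataclass

@dataclass
class TimecodeState:
    position: float   # 0.0 (outer groove) .. 1.0 (inner groove)
    speed: float      # 1.0 = normal 33 1/3 rpm, negative = backwards

SCENES = ["particles", "grid", "ribbons", "noise"]

def visual_params(state: TimecodeState):
    """Pick a scene from the needle position, animate it from the speed."""
    scene = SCENES[min(int(state.position * len(SCENES)), len(SCENES) - 1)]
    return {
        "scene": scene,
        "animation_rate": abs(state.speed),     # scrubbing speeds things up
        "reversed": state.speed < 0,            # spinning backwards
    }

print(visual_params(TimecodeState(position=0.7, speed=-0.5)))
```
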
6. Algorithmic Menagerie

    08:52

    from Raven Kwok

    4,818 Plays / 6 Comments

    See http://ravenkwok.com/algorithmic_menagerie/ Algorithmic Menagerie is the continuation and MFA thesis work of my long-term research exploring artificial life and self-organization in the field of computer-based generative art. Programmed in Processing, Algorithmic Menagerie is an interactive virtual environment inhabited by algorithmic creatures. These creatures, with dynamic cellular structures, are created using various methods of finite subdivision on geometric objects and exhibit different kinds of biological interactions with each other, reaching an equilibrium within the simulated ecosystem. Audience members are invited to intervene in or interact with these life processes. Algorithmic Menagerie was exhibited at The Curtis R. Priem Experimental Media and Performing Arts Center (EMPAC) Studio 2 from March 27th to March 29th, 2014.

    Phew... this has been quite a long way. In 2012, I created EDF0 (https://vimeo.com/43752422) and 115C8 (https://vimeo.com/45569702) as the early pieces of my Algorithmic Creatures series based on finite subdivision. In May 2013, 18F44 (https://vimeo.com/63090665) was created, originally as a further structural study based on EDF0, but it later became an independent creature member as well as a linear cinematic piece covering multiple life stages of the creature. In September 2013, I integrated more behaviors into the Algorithmic Creatures series and experimented with various visual combinations based on multiple creatures living as a group or herd in 1194D (https://vimeo.com/74877028). In December 2013, ECO (https://vimeo.com/96664141) was created, combining simulated biological interactions between different creatures. As a technical test for my thesis show, I also tried several projection setups (https://vimeo.com/81868814) at EMPAC Studio 2. In February 2014, I worked with my friend and collaborator Kelly Michael Fox (http://kmichaelfox.com/) to implement a sonification experiment on 1B5F1 (https://vimeo.com/88504368) using the OSC protocol and SuperCollider. This project has finally come to an end, or has it? :)

    Special thanks to: Collaborator: Kelly Michael Fox; Advisor: Prof. Shawn Lawson; Thesis Committee members: Prof. Michael Century, Prof. Ben Chang and Prof. Dennis Miller; EMPAC Staff: Dave Bebb, Eric Brucker, Geoff Abbas, Ian Hamelin, Mick Bello, Ryan Jenkins, and Todd Vos.

    Aug. 2014: featured on The Creators Project (thecreatorsproject.vice.com/blog/code-driven-creatures-occupy-this-algorithmic-menagerie).

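    The description does not specify which finite-subdivision rules the creatures use, so the sketch below shows only a generic subdivision step for orientation: each polygonal face is split into quads by joining its edge midpoints to its centroid.

```python
# Generic finite-subdivision step for illustration only; the actual rules
# used in Algorithmic Menagerie are not given in the description above.
# Each n-gon face becomes n quads around its centroid.

def midpoint(a, b):
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def centroid(face):
    n = len(face)
    return (sum(x for x, _ in face) / n, sum(y for _, y in face) / n)

def subdivide(face):
    """One subdivision step: an n-gon becomes n quads around its centroid."""
    c = centroid(face)
    children = []
    for i, v in enumerate(face):
        prev_mid = midpoint(face[i - 1], v)
        next_mid = midpoint(v, face[(i + 1) % len(face)])
        children.append([v, next_mid, c, prev_mid])
    return children

# Two rounds of subdivision on a unit square: 1 face -> 4 -> 16.
faces = [[(0, 0), (1, 0), (1, 1), (0, 1)]]
for _ in range(2):
    faces = [child for f in faces for child in subdivide(f)]
print(len(faces))  # 16
```
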
7. SYSTEM_INTROSPECTION [raw take @leCube]

    07:31

    from N1C0L45 M41GR3T

    3,558 Plays / 4 Comments

    SYSTEM INTROSPECTION can be envisaged as an observation of the machine by itself, proposing a physical experience of numeric data. The concert is based on a concrete exploration of the binary code and its intrinsic qualities (structure, logic, rhythm, redundancy, compression), immediately returned in the form of visual and sound flows. http://peripheriques.free.fr/blog/

    This performance is a continuation of the following works: 2010: https://vimeo.com/18656762 | 2002: https://vimeo.com/38834386

8. Two Trains - Sonification of Income Inequality on the NYC Subway

    04:46

    from brian foo

    33.7K Plays / 6 Comments

    This song emulates a ride on the New York City Subway's 2 Train through three boroughs: Brooklyn, Manhattan, and the Bronx. At any given time, the quantity and dynamics of the song's instruments correspond to the median household income of that area. Read more about the composition and process of creating this song here: https://datadrivendj.com/tracks/subway Data-Driven DJ (https://datadrivendj.com) by Brian Foo (http://brianfoo.com) is a series of music experiments that combine data, algorithms, and borrowed sounds.

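    As a toy version of the mapping described above (higher median income, more instrument layers and louder dynamics), the sketch below turns an income figure into an instrument count and a MIDI-style velocity. The income numbers are placeholders, not the project's data; the real composition code is documented via the project link.

```python
# Toy version of the mapping described above: more income -> more instrument
# layers and louder dynamics. The station names are real 2-train stops, but
# the income numbers and the mapping are placeholders, not the project's data.

STOPS = [
    ("Flatbush Av (Brooklyn)", 45_000),
    ("Chambers St (Manhattan)", 140_000),
    ("Times Sq-42 St (Manhattan)", 95_000),
    ("149 St-Grand Concourse (Bronx)", 25_000),
]
MAX_INSTRUMENTS = 12

def arrangement(income, lo=20_000, hi=150_000):
    t = (min(max(income, lo), hi) - lo) / (hi - lo)
    return {
        "instruments": 1 + round(t * (MAX_INSTRUMENTS - 1)),
        "velocity": round(40 + t * 80),   # MIDI-style dynamics 40..120
    }

for name, income in STOPS:
    print(f"{name:32s} {arrangement(income)}")
```
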
9. Between 0/1 2002 [3 screens version]

    05:24

    from N1C0L45 M41GR3T

    1,388 Plays / 5 Comments

    Between 0/1 is a process of recuperating unused data, a kind of digital humus, from which the artist generates sound and visual textures. In this process of translation against nature, he treats computer garbage as an already existing, but unknown, audiovisual potential. (At the outset, its content was made neither to be seen nor to be heard.) He then works to reveal it, during installations and performances, as bursts of sound and image: two synchronous revelations of the same information. In this sense, his work sits within a tradition of research on the link between picture and sound.

    More info: peripheriques.free.fr/blog/

    between 0&1 | portraits of unused data files: flickr.com/photos/n1c0la5ma1gr3t/sets/72157626372329466/

    Live performance extracts: peripheriques.free.fr/audio/between01_live_ecm-gantner_2003.mp3 peripheriques.free.fr/audio/between01_live_le10neuf_2003.mp3

    New versions: vimeo.com/18656762

10. Rhapsody in Grey - Using Brain Wave Data to Convert a Seizure to Song

    04:00

    from brian foo

    10.9K Plays / 8 Comments

    This song generates a musical sequence using EEG brain wave data of an anonymous epilepsy patient. It examines the periods before, during, and after a seizure. The goal is to give the listener an empathetic and intuitive understanding of the brain's neural activity during a seizure. Please note: I have no formal education or training in diagnosing or interpreting a seizure using EEG brain scan data. I have done my own research to the best of my abilities, but all in all, this is a purely creative endeavor and should in no way be interpreted as scientific research or be used in any context other than this creative one. For the sake of transparency, I have detailed my process for creating this song in the link below and have made all relevant code publicly accessible. Feel free to reach out to me if you notice any glaring inaccuracies. Learn more about the process of creating this song: https://datadrivendj.com/tracks/brain Data-Driven DJ (https://datadrivendj.com) by Brian Foo (http://brianfoo.com) is a series of music experiments that combine data, algorithms, and borrowed sounds.

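    The project's actual code is public via the link above; the sketch below is only a hypothetical illustration of how windowed EEG amplitude data could drive pitch and velocity, using synthetic noise in place of real patient data.

```python
# Toy sketch of one way EEG amplitude data could drive a melody: window the
# signal, map each window's mean amplitude to pitch and its spread to note
# velocity. This is not the project's actual mapping (that code is linked
# from the project page); the synthetic "EEG" below is random noise.

import random
import statistics

random.seed(0)
eeg = [random.gauss(0, 10 + (40 if 400 < i < 600 else 0)) for i in range(1000)]

C_MINOR = [0, 2, 3, 5, 7, 8, 10]

def window_to_note(window, base_midi=48):
    amp = statistics.mean(abs(v) for v in window)
    spread = statistics.pstdev(window)
    degree = min(int(amp / 8), len(C_MINOR) * 3 - 1)       # louder -> higher
    octave, step = divmod(degree, len(C_MINOR))
    velocity = min(127, 30 + int(spread))                  # wilder -> louder
    return base_midi + 12 * octave + C_MINOR[step], velocity

WINDOW = 50
notes = [window_to_note(eeg[i:i + WINDOW]) for i in range(0, len(eeg), WINDOW)]
print(notes[:5])   # (midi_pitch, velocity) pairs, one per window
```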