  1. Cascada - 2011. Mixed media: prepared steel-string acoustic guitars, fabric, motors, speakers and computer. Dimensions variable.

    Cascada is a sound installation for prepared guitars and fans that musically interprets the sound of a waterfall and seeks to approach the vital irregularities that characterise natural events. In doing so, Cascada rescales the sound potential of an acoustic instrument and creates a sound architecture that exists only in the music of the installation. The installation works with the amplified natural sound of the guitars, without any audio processing. Computer-controlled fans (DC motors), hanging or placed around the guitars, strum the strings with variable intensity. The space is filled with a mass of harmonics and chords that suggest no beginning or end. Cascada draws the listener into a contemplative state.

    vimeo.com/36577634
  2. Cymatic Imprints offers people the experience of physically engaging with their sonic environment. In this multi-sensory installation, live sound is transposed to the range of 5–20 Hz by a Pure Data pitch-shifter customized with the assistance of Roman Haefeli. This transformation generates visual and sonic variations through the reaction of extensions as they make contact with the floor. The resulting sounds mingle with those of viewers and peripheral ambient noise to create a causal loop of co-responsive activity with limitless variation. In this way, the work renders the immateriality of sound as a tangible event by silencing and reinterpreting the data of everyday life.

    vimeo.com/36579543
  3. In this paper we present the new plugin infrastructure in the upcoming release of Gem, which aims to cleanse the core library of superfluous dependencies.

    Traditionally, Gem has been built as a monolithic library of Pd objects, with support for various system features, such as image acquisition or output methods, linked in directly.

    This has been redesigned so that Gem's interfacing capabilities can be extended in a plugin-based way, allowing developers to easily extend the system with new backends. End users, in turn, need not install the full feature set, but can choose what they need and, if desired, build a minimal system.

    Plugins do not extend Gem's set of objectclasses, but rather provide “backends” to one (or several) given objectclasses. The motivation for externalizing these parts of Gem into a plugin system was mainly driven by two aspects: lowering dependencies and providing a uniform interface across platforms.

    Lowering dependencies aims at easing the installation of binary packages (and thus the maintenance of such packages). For instance, on Linux, Gem (0.92) can be compiled with support for five different movie-reading libraries, some of them (partially) patent-encumbered, outdated or otherwise hard to obtain. In order to support the widest range of film footage, one is tempted to link against all possible libraries. However, this also means that end users have to install all of these libraries before they can use Gem, because Pd (or rather, the operating system) will refuse to load Gem if one of them is missing, even if the users are not interested in video playback at all. (The alternative, shipping the libraries with the release instead of making the end user install them, bloats the package and can introduce legal problems.)

    From the patching side of things, the new plugin infrastructure provides a uniform interface to device- and backend-dependent settings. Different backends provide different ways to interact with, e.g., an image acquisition device. What's worse, different devices can have different features that the user might want to control (e.g. a webcam could provide a means to control pan/tilt/zoom, whereas a video capture card might allow switching between different inputs). In the past this divergence has led to incompatible implementations of, e.g., the [pix_video] object, resulting in patches that are not portable across operating systems.

    The plugin infrastructure provides a way to query and set virtually all available controls for a given device/backend in a uniform way (though the available controls will obviously vary), and to keep controls persistent when switching between backends.

    So far, [pix_video] (for live video acquisition), [pix_film] (for film footage acquisition) and [pix_record] (for video output) have been switched over to the new plugin infrastructure.
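
    As a rough illustration of this design, the sketch below shows how such a backend abstraction could look in plain C++. The class and method names (VideoBackend, WebcamBackend, enumProperties, setProperties) are assumptions made for this illustration, not Gem's actual plugin API; the point is only that host code talks to every backend through the same small interface and a generic property list.

        // Hypothetical sketch of a video backend plugin interface, loosely
        // modelled on the design described above; not Gem's actual classes.
        #include <iostream>
        #include <map>
        #include <memory>
        #include <string>
        #include <vector>

        // Generic key/value properties let host code query and set device
        // controls uniformly, even though each backend supports a different set.
        using Properties = std::map<std::string, double>;

        class VideoBackend {
        public:
            virtual ~VideoBackend() = default;
            virtual bool open(const std::string& device) = 0;
            virtual void close() = 0;
            // Report which controls this backend/device actually supports.
            virtual std::vector<std::string> enumProperties() const = 0;
            virtual void setProperties(const Properties& props) = 0;
        };

        class WebcamBackend : public VideoBackend {
        public:
            bool open(const std::string& device) override {
                std::cout << "webcam: opening " << device << "\n";
                return true;
            }
            void close() override {}
            std::vector<std::string> enumProperties() const override {
                return {"pan", "tilt", "zoom"};
            }
            void setProperties(const Properties& props) override {
                for (const auto& kv : props)
                    std::cout << "webcam: " << kv.first << " = " << kv.second << "\n";
            }
        };

        int main() {
            // Host code in the spirit of [pix_video]: pick a backend, query its
            // controls, then drive it only through the uniform interface.
            std::unique_ptr<VideoBackend> backend(new WebcamBackend());
            backend->open("/dev/video0");
            for (const auto& name : backend->enumProperties())
                std::cout << "supports: " << name << "\n";
            backend->setProperties({{"pan", -10.0}, {"zoom", 2.0}});
            backend->close();
        }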

    uni-weimar.de/medien/wiki/PDCON:Conference/Plug_your_cam_-_extending_Gem_the_modular_way

    vimeo.com/36598445
  4. We present an implementation of the Dynamic Time Warping algorithm for the Pure Data programming environment. This algorithm is fairly popular in several contexts, ranging from speech processing to pattern detection, mainly because it allows comparing and recognizing data sets that may vary non-linearly in time. Our contribution is easily portable to the wide range of platforms on which Pure Data is available. Throughout this document we describe relevant work that inspired our proposal and present the core concepts of our implementation.
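
    For orientation, a minimal form of the dynamic time warping distance between two numeric sequences can be sketched in a few lines of C++. This is a generic textbook formulation of the algorithm, not the code of the Pd external presented in the paper; the function name and the test sequences are made up for the example.

        // Minimal dynamic time warping (DTW): accumulated alignment cost
        // between two sequences that may be stretched non-linearly in time.
        #include <algorithm>
        #include <cmath>
        #include <iostream>
        #include <limits>
        #include <vector>

        double dtw(const std::vector<double>& a, const std::vector<double>& b) {
            const std::size_t n = a.size(), m = b.size();
            const double inf = std::numeric_limits<double>::infinity();
            // cost[i][j] = best cost of aligning the first i samples of a
            // with the first j samples of b.
            std::vector<std::vector<double>> cost(n + 1, std::vector<double>(m + 1, inf));
            cost[0][0] = 0.0;
            for (std::size_t i = 1; i <= n; ++i) {
                for (std::size_t j = 1; j <= m; ++j) {
                    const double d = std::fabs(a[i - 1] - b[j - 1]);
                    // A step may be a match, an insertion or a deletion, which is
                    // what allows the non-linear warping in time.
                    cost[i][j] = d + std::min({cost[i - 1][j - 1],
                                               cost[i - 1][j],
                                               cost[i][j - 1]});
                }
            }
            return cost[n][m];
        }

        int main() {
            // The same contour twice, one version stretched in time:
            std::vector<double> query     = {0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0};
            std::vector<double> reference = {0.0, 1.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 0.0};
            std::cout << "DTW distance: " << dtw(query, reference) << "\n";
        }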

    uni-weimar.de/medien/wiki/PDCON:Conference/Dynamic_Time_Warping_for_Pure_Data

    vimeo.com/36582311
  5. Music for Flesh II is a seamless mediation between human biophysical potential and algorithmic composition.
    By enabling a computer to sense and interact with the muscular potential of human tissues, the work approaches the biological body as a means for computational artistry.
    Muscle movements and blood flow produce subcutaneous mechanical oscillations, which are nothing but low-frequency sound waves. Two microphone sensors capture the sonic matter created by my limbs and send it to a computer, which develops an understanding of my kinetic behaviour by *listening* to the friction of my flesh.
    Specific gestures, force levels and patterns are identified in real time by the computer; then, according to this information, it algorithmically manipulates the sound of my flesh and diffuses it through a quadraphonic system. The neural and biological signals that drive my actions become analogous expressive matter, for they emerge as tangible sound.

    uni-weimar.de/medien/wiki/PDCON:Concerts/Marco_Donnarumma

    vimeo.com/36580607

Pure Data Convention 2011 (PdCon11)