1. This paper presents initial Pure Data abstractions that form part of a toolkit system for the PESI research project. The research focuses on mobile interfaces in the context of participatory interactive art. Designing an easy-to-use, easy-to-control interface for a mobile communication tool allows participants to become more familiar with the collaboration process and to experience a way of making music with a mobile device. The wide range of complexity of the control-mapping layers, and their integration into such a system, makes the design a very challenging process. This paper describes the implementation of the control-layer Pure Data abstractions on the Nokia N900 device running Maemo (a rough, illustrative sketch of a single mapping layer follows below).

    uni-weimar.de/medien/wiki/PDCON:Conference/An_Exploration_on_Mobile_Interfaces_with_Adaptive_Mapping_Strategies_in_Pure_Data

    vimeo.com/36979690
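    The paper itself details the Pd abstractions; purely as an illustration of what one control-mapping layer does, the following Python sketch scales and smooths a single accelerometer axis onto a synthesis parameter. It is not code from the PESI toolkit, and all names, value ranges, and the smoothing approach are assumptions.

    ```python
    # Hypothetical sketch of one control-mapping layer; NOT code from the PESI
    # toolkit. It scales an accelerometer reading onto a synthesis parameter
    # and smooths it so the sound does not jump with every bit of sensor jitter.

    def scale(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly map `value` from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
        t = (value - in_lo) / (in_hi - in_lo)
        t = min(max(t, 0.0), 1.0)
        return out_lo + t * (out_hi - out_lo)

    class MappingLayer:
        """One accelerometer axis -> one synthesis parameter (assumed ranges)."""

        def __init__(self, smoothing=0.9):
            self.smoothing = smoothing   # 0 = no smoothing, closer to 1 = slower response
            self.state = 200.0           # start at the low end of the output range

        def __call__(self, accel_x):
            # Assume the accelerometer reports roughly -9.8 .. 9.8 m/s^2 and
            # map it onto a filter cutoff between 200 Hz and 2000 Hz.
            target = scale(accel_x, -9.8, 9.8, 200.0, 2000.0)
            self.state = self.smoothing * self.state + (1.0 - self.smoothing) * target
            return self.state

    cutoff = MappingLayer()
    for reading in (0.0, 3.2, 6.4, 9.8):   # successive made-up sensor readings
        print(round(cutoff(reading), 1))
    ```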
  2. This paper presents a collection of Pure Data abstractions for real-time transcription of the audio signal produced by musical instruments. It is a tool

    1. for labeling musical notes and locating them in time;
    2. for handling the representation of the musical data obtained in 1).

    It is part of PDescriptors, a library for audio feature extraction developed by the author, based mostly on the BSP technique [1]. The text begins with a review of the literature on automatic music transcription and its central issues. It then presents the parametric transcription models adopted in the present research for onset detection, extraction of harmonic content, and timbre classification of percussion instruments. Finally, we discuss and detail our Pure Data implementation and present some results from its application in a human-computer interaction system (a rough, illustrative sketch of an onset detector follows below).

    uni-weimar.de/medien/wiki/PDCON:Conference/A_Framework_for_Real-time_Instrumental_Sound_Segmentation_and_Labeling

    References
    1. Barknecht, 2010

    vimeo.com/36979356
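    The abstract does not spell out which onset detector is used; as a hedged, language-agnostic illustration (Python here, not the authors' Pd code), a simple positive-spectral-flux detector with naive peak picking might look like the sketch below. The frame size, hop size, and threshold are assumptions.

    ```python
    # Minimal spectral-flux onset detection sketch (NumPy); an illustration of
    # the kind of onset detector described, not the paper's Pd implementation.
    import numpy as np

    def onsets(signal, sr, frame=1024, hop=512, threshold=0.1):
        """Return onset times (seconds) where the normalized positive
        spectral flux forms a local peak above `threshold`."""
        window = np.hanning(frame)
        prev_mag = np.zeros(frame // 2 + 1)
        flux = []
        for start in range(0, len(signal) - frame, hop):
            mag = np.abs(np.fft.rfft(signal[start:start + frame] * window))
            # Sum only the increases in magnitude (positive spectral flux).
            flux.append(np.maximum(mag - prev_mag, 0.0).sum())
            prev_mag = mag
        flux = np.asarray(flux)
        if flux.max() > 0:
            flux /= flux.max()
        times = []
        for i in range(1, len(flux) - 1):
            # Naive peak picking: local maxima above the threshold.
            if flux[i] > threshold and flux[i] >= flux[i - 1] and flux[i] > flux[i + 1]:
                times.append(i * hop / sr)
        return times

    # Example: a single click at 0.5 s in one second of silence at 44.1 kHz.
    sr = 44100
    sig = np.zeros(sr)
    sig[sr // 2] = 1.0
    print(onsets(sig, sr))
    ```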
  3. rj is an open source library of Pure Data abstractions that was developed as part of the RjDj project to support the development of reactive music scenes for mobile devices, such as the iPhone, running a variant of Pure Data inside the RjDj player software. As a least common denominator for such environments, only the object-class vocabulary of the core “vanilla” version of Pd is used. This way, the rj library is extremely portable between different distributions of Pd and can be used on many mobile devices with limited hardware, but it is also useful as a general musician's toolbox on ordinary computers.

    The rj library strives to provide a minimal but fairly complete set of useful tools for musicians, including composition helpers, sound generators, input handlers, and sound effects. It can be used as a globally installed library, but it is also designed to work when copied into a project folder, thereby making the project or “scene” self-contained and easily distributable.

    The rj library has been successfully used by dozens of musicians writing scenes for the RjDj platform and in the popular iPhone app for the movie Inception.

    uni-weimar.de/medien/wiki/PDCON:Conference/rj_-_abstractions_for_getting_things_done

    vimeo.com/36979591
  4. Beatjazz is what you get when you cross a sci-fi-obsessed electronic jazz artist with a futurologist instrument inventor. It is the result of 20 years of conceptualization that has taken Onyx Ashanti on a journey from Mississippi to California to London to NYC and finally to Berlin, where he now resides.

    Beatjazz is a completely live, improvised form of electronic music with no boundaries. Onyx creates all aspects of his self-created style live, from scratch: nothing is pre-recorded or pre-conceived. He plays each part, one part at a time, and builds one-of-a-kind sonic architecture. To aid in this concept, he also created a one-of-a-kind musical instrument to play it with, which he calls a beatjazz controller. With it, every motion and every breath creates sound, which he transforms into space-age funky grooves. Onyx sounds like the offspring of Roger Troutman, Timbaland, and John Coltrane. Having cut his teeth playing the Southern California rave scene in the late '90s, he spent the first half of the 2000s playing and recording with the likes of Soul II Soul, Basement Jaxx, and Marshall Jefferson. The last three years have seen him take on the challenge of creating a new form of music for a new age, a form unencumbered by genre. Armed with a computer full of the best software synthesizers, Onyx can construct any style of music imaginable, and quite a few that aren't.

    The beatjazz controller is really a three-way wireless sensor network that converts his finger, hand, and breath movements into musical control data, which allows him to play synthesizers with dance-like movements and saxophone-like fingerings. In addition, Onyx created a light-based “element narration” subsystem that relates sound types to colors, so the audience has a better idea of which part of the arrangement Onyx is playing at any given time. The system was designed specifically for beatjazz, and the combination of the two creates a never-before-realized marriage of sound, light, and motion. Having presented beatjazz on a world stage by way of his acclaimed TED presentation performance, Onyx has now begun his new sonic journey in earnest and plans nothing less than to change the entire infrastructure of what the world calls music and to inject a much-needed “live-ness” into electronic music culture, not only by performing beatjazz but also by sharing his concept with the world as an open source project (controller plans as well as musical concepts), so that beatjazz may virally find its way into every aspect of modern music. Beatjazz is the natural progression and result of the atomization of music culture.

    uni-weimar.de/medien/wiki/PDCON:Concerts/Onyx_Ashanti

    vimeo.com/36971382
  5. Composing music in Pure Data, or in any similar programming language, conventionally involves the use of a (discrete) messaging system for rhythmic sequencing and audio-rate signals for sound synthesis. While such a combination serves its purpose well and has its merits, this paper attempts to outline an alternative approach, using audio signals alone to create both time-related events and sound synthesis, and discusses the various interesting factors which arise (a rough sketch of the basic idea follows below).

    vimeo.com/36747233
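    As a hedged illustration of that alternative approach (not code from the paper, and written in Python rather than Pd), the sketch below derives sample-accurate beat times from an audio-rate phasor signal instead of a control-rate clock; the sample rate, tempo, and signal length are assumptions.

    ```python
    # Sketch of audio-rate sequencing: events are derived from a sample-accurate
    # phasor signal instead of a message-rate clock. Tempo, sample rate, and
    # duration are made-up values for illustration only.
    import numpy as np

    SR = 44100          # sample rate
    BPM = 120.0         # tempo
    freq = BPM / 60.0   # phasor frequency: one ramp per beat

    def phasor(n_samples, freq, sr, phase=0.0):
        """Audio-rate ramp from 0 to 1, analogous to Pd's [phasor~]."""
        t = np.arange(n_samples)
        return (phase + freq * t / sr) % 1.0

    ramp = phasor(SR * 2, freq, SR)           # two seconds of signal
    # A wrap (the ramp value drops) marks the start of a beat, sample-accurately.
    wraps = np.where(np.diff(ramp) < 0)[0] + 1
    print(wraps / SR)   # beat times in seconds: [0.5 1.  1.5]
    ```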

Pure Data Convention 2011

PdCon11
