from Anita_zk

    http://anita-zk.com/#2139496/DOUBLE-TAKE

    Double Take is an exploration of the doubling of perception. The extreme precision of the robots' movement allows a rigorous examination of the act of perceiving space, sound and movement in an immersive environment. Double Take maps a virtual cinematic world onto the physical space of the robot lab and casts the robots as lighthouses, revealing this superimposed reality. A projector mounted to the robot arm sweeps through the space, illuminating this virtual world. By morphing and distorting the virtual environment, Double Take expands the perception of the space by mapping alternate geometries and multiple environments onto the "real" space of the robot lab. A robot's-eye-view live video feed provides a further doubling of our perception within the robots' sphere of action.

    The soundscape uses the movement of the robots as input for an interactive score. A real-time data flow from the positions of the robots' joints generates sound according to the relationship between them. Sound frequencies increase as the robots come closer to each other, and the modulation varies depending on the speed of the robots.

    SCI-Arc 2011. Course: Robotic Confections & Confabulations. Instructor: Devyn Weiser. AT: Jonathon Stahl. Design Team: Matthew Au, Curime Batliner, Ana E Herruzo-Pierce, Brandon Kruysman, Jonathan Proto, Chris Skeens. Software and programming: Max/MSP/Jitter, esperant.0, Python, Maya, VAL3
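The piece itself runs its score in Max/MSP; as a minimal Python sketch of the mapping described above (pitch rising as the robots approach, modulation depth scaling with speed), something like the following could work. All ranges and names (`min_freq`, `max_dist`, `max_speed`) are illustrative assumptions, not values from the project:

```python
import math

def joint_distance(a, b):
    """Euclidean distance between two joint positions (x, y, z)."""
    return math.dist(a, b)

def sound_params(pos_a, pos_b, speed,
                 min_freq=110.0, max_freq=880.0,
                 max_dist=5.0, max_speed=2.0):
    """Map robot state to synthesis parameters:
    closer robots -> higher frequency,
    faster motion -> deeper modulation."""
    d = min(joint_distance(pos_a, pos_b), max_dist)
    closeness = 1.0 - d / max_dist        # 1.0 when touching, 0.0 at max_dist
    freq = min_freq + closeness * (max_freq - min_freq)
    mod_depth = min(speed, max_speed) / max_speed
    return freq, mod_depth
```

Per frame, the robots' joint positions would be polled, fed through a mapping like this, and the resulting frequency and modulation depth sent to the synthesis engine.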

    • Domesticated_Animalia_Live featuring Esther Lamneck


      from Izzi Ramkissoon

      Domesticated Animalia
      Composer: Izzi Ramkissoon
      Date of Composition: April 2009
      Duration: 8:25 mins
      Instrumentation: Clarinet - Esther Lamneck; Electronics - Izzi Ramkissoon

 The development of this composition came out of recent stories in the news media. The pursuit to understand case studies of captive animals, overlapped with one's place in a media culture, shaped this piece into a tiered system of relationships: animal to human, human to society, society to media. Each tier delves deeper into a more abstract idea of humans struggling with the established systems of a media-driven society. The basic instincts of survival are addressed as the sonic environment confronts each performer's present state in the piece. The technological processing applied to the live instruments facilitates communication between the animals, humans, and media. The processing, in essence, is an extension of the animal and human decision, belief and value systems when interacting with the autonomous technology. The electronics and samples provide layers of sound, allowing the traditional instruments a more primal approach to the musical inflections of rhythm and pitch in the piece.

 Many thanks to: Esther Lamneck, Cort Lippe, R. Luke Dubois

      • RyO


        from Florian Grassl

        RyO "Reclaim your opera" is an interactive, generative a/v work set in the contexts of experimental theater and performance art. Here are a few clips from the debut performance, which took place on the 6th and 7th of April 2014 at Mojo Club Hamburg. Reclaim your opera is a piece initiated and conceptualized by Max Maintz and developed and performed by Max Maintz, Florian Grassl and Tito Loria.

        • IRL (by Bryan Cera)


          from bryan cera

          IRL (In Real Life) - Networked Robotics Experiment.

          • MaxMSP / Jitter - Head tracking + QuicktimeVR


            from Gabriele Carù

            Software that lets you control a QuicktimeVR file with the movement of your head. Author: Gabriele Carù www.gabrielecaru.com
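The original patch is built in Max/MSP/Jitter; as a rough sketch of the core mapping (a tracked head position in the webcam frame driving the VR pan angle), assuming the head's horizontal pixel coordinate is already available from a face tracker:

```python
def head_to_pan(head_x, frame_width, fov_deg=360.0):
    """Map the head's horizontal position in the camera frame (pixels)
    to a QuicktimeVR pan angle in degrees.
    Head at frame center -> pan 0; frame edges -> +/- fov/2."""
    norm = (head_x / frame_width) - 0.5   # -0.5 .. 0.5 across the frame
    return norm * fov_deg
```

The same idea extends to tilt by mapping the vertical coordinate; `fov_deg=360.0` is an illustrative default for a full panorama.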

            • Fusiform Polyphony by Ken Rinaldo


              from Ken Rinaldo

              Fusiform Polyphony is a series of six interactive robotic sculptures that compose their own music with input from participants' facial images. Micro video cameras mounted on the ends of these robots move toward people's body heat and faces while capturing human snapshots. These images are digitally processed and pixelated, and produce a constantly evolving generative soundscape in which facial features and interaction are turned into sound melody, tone and rhythm. These elements, fused, manifest the viewer as participant, actor and conductor in defining new ways of interacting with robots, and allow the robots to safely interact with humans in complex and natural environments.

              An important element of this installation is seeing oneself through the robots' artificial eyes, as each robot tracks and captures images, showing the nature of algorithmic robotic vision. The works are covered in human hair and explore new morphologies of soft robotics, an emerging field where natural materials make the works approachable and friendly. The hair points to a human-robotic hybrid moment in our own evolution, where the intelligence of robots is more fully fusing with our own, allowing new forms of robotic augmentation. Each robot has differently colored hair, giving each an individual character.

              The live camera-based video of the robots is processed through Max/MSP and Jitter and projected to the periphery of the installation on five screens. When a robot is at head height, a sensor at its tip is triggered and a facial snapshot is taken. This snapshot is held in a small area of the projected screen to the upper right. The snapshot is broken down into a 300-pixel grid, and the red, green and blue values of each pixel are extracted and sent from Max/MSP to Ableton Live, a sound composition tool, which selects the musical samples determining rhythm, tempo and dynamics.
The robotic aspects of this work are controlled by six Mac Minis with solid-state drives, each wired to an individual MIDI-based controller for the sensor and motor-drive units. The Mac Minis are all networked to a Mac Pro tower, which processes the video of the six selected images and interfaces them to the Ableton Live sound program. Changing pixel data constantly changes Ableton's virtual-instrument selection sets, with random seeds coming from the snapshots. The robotic structures were created with 3D-modeled cast urethane plastics, monofilament, carbon fiber rod and laser-cut aluminum elements supporting the computers, microprocessors and motor-drive systems. These robots structure, inform, enhance and magnify people's behavior and interactions as they auto-generate a unique and constantly evolving soundscape. They take the unique multicultural makeup of each person and create "facial songs", joining those songs into six robotic/human soundscapes to create an overall polyphonic human and video experience.
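The pixel-to-sound stage described above (a 300-pixel grid whose RGB values drive sample choice, tempo and dynamics) could be sketched in Python along these lines. The specific mapping of each color channel to a musical parameter is an illustrative assumption; the installation itself does this in Max/MSP and Ableton Live:

```python
def facial_song(pixels, n_samples=16):
    """Reduce a snapshot's pixel grid to musical choices.
    `pixels` is a list of (r, g, b) tuples (0-255), e.g. a 300-pixel grid.
    Illustrative mapping: mean red picks the sample slot,
    mean green sets the tempo, mean blue sets the dynamics."""
    n = len(pixels)
    avg = [sum(channel) / n for channel in zip(*pixels)]  # mean R, G, B
    sample_index = int(avg[0] / 256 * n_samples)   # which clip to fire
    tempo_bpm = 60 + avg[1] / 255 * 120            # 60-180 BPM
    velocity = int(avg[2] / 255 * 127)             # MIDI-style dynamics
    return sample_index, round(tempo_bpm), velocity
```

Because the averages shift with every face and lighting condition, each snapshot yields a different sample set, tempo and loudness, which is what keeps the soundscape evolving.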

              • Vonome - art21 Tutorial


                from amjc

                "Vonome" version 1.0 by Alejandro Miguel Justino Crawford 2009 programming by R. Luke Dubois & AMJC http://blog.art21.org/2011/09/22/how-to-play-a-vonome/

                • Um die Ecke – by Denis Handschin and Michel Winterberg


                  from Michel Winterberg

                  Um die Ecke (Around the Corner)
                  In cooperation with Denis Handschin
                  Mixed Media, 2012
                  Object size: 26 x 80 x 32 cm
                  Collection: House of Electronic Arts Basel, Switzerland, 2013
                  Exhibitions:
                  - WRO 15th Media Art Biennale, Wrocław, Poland, 9 May - 16 June 2013
                  - House of Electronic Arts Basel, Switzerland, Regionale 13 exhibition Hidden/Obvious, 25 November 2012 - 6 January 2013

                  In this work, the Dreispitz site - an industrial site in Basel, Switzerland, currently being transformed into a lively urban quarter - was analyzed in terms of specific acoustic, visual and electromagnetic-field characteristics, with a focus on the immediate, intuitive snapshot and a subjective mode of perception. A site-specific pyramidal model, based on the actual proportions of the area and undercoated with bone meal, is used to visualize the analyzed data. The height of the model derives from the golden ratio of the total area and culminates in a peak above the geometric center of the base area. The pyramidal model serves as a projection surface for the following media: photography and video material of the site, computed tomography images of stored findings from the Archaeological Soil Research Agency Basel, as well as live data such as meteorological information and site-specific webcam images streamed in real time from the Internet. From a distance, the object seems to glow in black and white; it reveals its color and acoustics only on closer inspection. "Um die Ecke" - Around the Corner - moves between the borders of surveillance and transcendence, and engages with the human need for prognostic glances at the future. It deals with predictions about the future development of human beings, and with the clash of the apparent and the hidden. In a blend of diverging approaches, and as a reflection of common interests and the hustle of urban life, the senses become overstrung, and visualized messages are rendered invisible again.
The displayed data are in constant flux, depicting ephemeral and uncontrollable snapshots from the perspective of a viewer. The object becomes an oracle and a kaleidoscope of the past, present and future.

                  • Moviestar v2 Work in progress - Premiere METEOR Festival 2013


                    from marieke v

                    Moviestar V2 in progress - premiering Thursday, 17th of October at METEOR Festival 2013, Bergen, Norway. http://moviestar.marieke.nu

                    • playing with webcam and Korg nanoPad in Max/Msp/Jitter


                      from Je Seok Koo

                      This is a simple test of video processing in Max. Korg's nanoPAD simply triggers several video effects and sound samples in Max via MIDI.
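The dispatch logic behind such a patch (a MIDI note-on from a pad selecting which video effect and sample to fire) can be sketched outside Max as well. The pad-to-effect table below is entirely hypothetical; the note numbers and effect names are placeholders, not the actual patch's mapping:

```python
# Hypothetical mapping of nanoPAD note numbers to (video effect, sample) slots.
PAD_MAP = {
    36: ("blur", "kick.wav"),
    37: ("pixelate", "snare.wav"),
    38: ("invert", "hat.wav"),
}

def on_midi(status, note, velocity):
    """React to a MIDI channel message: on a note-on with nonzero velocity,
    return which video effect to enable and which sample to play;
    return None for any other message."""
    NOTE_ON = 0x90
    if status & 0xF0 == NOTE_ON and velocity > 0:  # mask off the channel bits
        return PAD_MAP.get(note)
    return None
```

Note that many devices signal note-off as a note-on with velocity 0, which is why the velocity check is part of the condition.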

