• Webinar: SMART 3.0 Automated Evaluation of Behavior - June 18, 2013

  59:10

  from Harvard Apparatus

  104 Plays / 0 Comments

  Kai Bunk, the developer of SMART Video Tracking Software, presents this 45-minute webinar to show you how SMART V3.0 is both simple and powerful. Discover how SMART 3.0:
  • Offers maximum flexibility by easily adapting to acquire data from different types of environments: a single animal or multiple animals, in any arena or in multiple arenas.
  • Configures easily, with a modular structure that meets your needs, both immediate and long-term.
  • Differentiates the head, center of mass, and tail base with TriWise technology, making the system ideal for detecting stretching, rearing, exploratory, and rotational behavior.
  • Acquires and analyzes in real time, allowing you to visualize experimental data as it happens.
  • Tracks an experiment's evolution with an integrated database and experiment scheduler.
  Kai will walk us through this, and more, in this personalized tour of SMART V3.0. See it for yourself.

• Webinar - SMART 3.0 - Easy Video Tracking for the Automated Evaluation of Behavior

  57:58

  from Harvard Apparatus

  70 Plays / 0 Comments

  Evelyne Célèrier, technical and scientific representative of the team responsible for the development of SMART video tracking at Panlab/Harvard Apparatus, presents this 45-minute online seminar to illustrate how SMART V3.0 is both easy to use and powerful! Discover how SMART 3.0:
  • Offers maximum flexibility through its ability to adapt to a wide range of experimental conditions: one animal or several, one or more experimental enclosures, and any type of enclosure.
  • Thanks to its modular structure, easily meets your needs, both immediate and future.
  • Uses TriWise technology to automatically differentiate three points (head, center of mass, tail base), making the system an ideal tool for detecting exploratory behavior, vertical activity ("rearing"), and rotations of the animal.
  • Records and analyzes data in real time, so you can follow the results of your experiment as it unfolds.
  • Provides maximum traceability of experimental data through its integrated subject database and scheduler.
  Evelyne will cover all of this, and many other features of the software, in this personalized tour of SMART V3.0. Come see for yourself!

• Julian Stein: Tracking, Realtime Video, Lighting Animation, AR Workshop, part 1

  49:52

  from Alkemie Atelier

  39 Plays / 0 Comments

  Pervasive Play Workshop, Logan Center for the Arts, University of Chicago. Sha Xin Wei & Julian Stein (Alkemie, TML), Patrick Jagoda (English). Jan 30 - Feb 2, 2013.

• Julian Stein: Tracking, Realtime Video, Lighting Animation, AR Workshop, part 2

  31:06

  from Alkemie Atelier

  19 Plays / 0 Comments

  Pervasive Play Workshop, Logan Center for the Arts, University of Chicago. Sha Xin Wei & Julian Stein (Alkemie, TML), Patrick Jagoda (English). Jan 30 - Feb 2, 2013.

• Gestation

  20:15

  from Garth Paine

  88 Plays / 1 Comment

  Interactivity has become a major consideration in the development of a contemporary art practice that engages with the proliferation of computer-based technologies. These technologies have created a revolution in the fields of animation and image generation, as well as sound art and music composition. The computer has opened up a whole new genre in which primary composition material can be drawn from any source and, once digitised, becomes a fluid and viscous medium. Garth Paine's interest lies in placing the exploration of the potential of these technologies within an organic and human framework. His installation work has focused on creating immersive environments that respond to the movement and behaviour patterns detected within them. The body becomes the controller; the organic process of human exploration, cognition, and response becomes the central influence in defining the output of the interactive process.

  Gestation represents a development in Garth Paine's responsive environment works, following on from Moments of a Quiet Mind (Linden Gallery), Ghost in the Machine (Linden Gallery), MAP1 (Next Wave Festival, Span Gallery), and MAP2 (SIM Berlin). Gestation was a major focus of Garth Paine's work during his Australia Council for the Arts New Media Arts Fellowship at RMIT in 2000.

  Gestation is an interactive responsive environment first exhibited at RMIT Gallery, Melbourne, in December 2000, where it occupied two integrated galleries. One gallery contained a surround sound field generated in real time using video sensing equipment (visible to visitors only as a small security video camera in the middle of the roof) that maps the behaviour and movement patterns of visitors onto real-time audio algorithms, providing a tight gestural relationship with their movement and behaviour patterns. No pre-recorded material is used in the generation of the sounds. In the second gallery, a large projected image represents the development of new human life in response to the activity in the first gallery. The imagery represents a sea of life-forming cells; an added layer to the underlying sea is the development of new foetuses. Each foetus starts to grow at the point at which the greatest activity is sensed in the first gallery. This work has also been exhibited in Florida, New York, Dublin, Leicester, and Australia.

• How to track an object that goes off screen with Adobe After Effects

  14:11

  from Kert Gartner

  1,807 Plays / 3 Comments

  http://vfxhaiku.com

  If it leaves your sight | Disappeared, it is not | Lock it to your plate

  In this tutorial, we take a look at how to track an object that goes off screen with Adobe After Effects. http://vfxhaiku.com/2011/12/how-to-track-an-object-that-goes-off-screen-with-adobe-after-effects/

• seine hohle Form

  12:34

  from Butch Rovan

  280 Plays / 1 Comment

  ...seine hohle Form... for interactive computer music, video tracking system, and dance. Music & interactive programming: Butch Rovan. Video tracking system: Frieder Weiss. Choreography: Robert Wechsler. Dance: Robert Wechsler / Laura Warren.

  "...seine hohle Form..." is a fragment from the prose poem "Gesichter" by Rainer Maria Rilke, roughly translating to "its hollow form." As a starting point for this interactive work, it serves as an emblem for the interesting challenge of creating a musical work that only exists when a dancer moves. Using real-time synthesis and video tracking technology, the choreography is affected by the live generation of sound through sensors and computer systems, and the music is in turn shaped by the dancers' movements. From a musical perspective, the challenge was to create a "hollow form" that would be activated by the physical gestures of the dancers.

  An important part of this project was the design of the gestural mappings (dance gesture to synthesis) so that the music and dance could fluctuate between varying levels of independence. At points in the piece the synthesis mappings are tightly tied to the dance gestures; at other times the musical system behaves somewhat independently. In these independent moments, however, the system can be brought back under control by particular actions of the dancers, as sketched in the example below. In the end, the goal in writing this piece was to create a work in which the music was not merely tied to the gestures of the dancers (which can be the challenge when working with interactive music/dance systems) but one in which the musical and dance components exist in a dynamic dialectic that allows for a polyphony of sound and gesture.
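  The fluctuating independence described above can be made concrete with a small sketch. This is not Rovan's actual system; it is a minimal illustration, in Python, of one way to blend a gesture-driven control stream with an autonomous one, where gesture_energy, autonomy, and blended_control are all hypothetical names of mine:

      import math
      import random

      def blended_control(gesture_energy, t, autonomy):
          """Blend a dancer-driven value with an autonomous LFO.

          gesture_energy: 0..1 value derived from video tracking
                          (e.g. frame-difference motion energy).
          t:              time in seconds.
          autonomy:       0 = synthesis follows the dancer exactly,
                          1 = synthesis runs entirely on its own.
          """
          autonomous = 0.5 + 0.5 * math.sin(2 * math.pi * 0.1 * t)  # slow internal drift
          return (1.0 - autonomy) * gesture_energy + autonomy * autonomous

      # A sudden, large gesture collapses the autonomy weight, pulling the
      # system back under the dancer's control.
      autonomy = 0.8
      for step in range(10):
          t = step * 0.5
          gesture_energy = random.random()        # stand-in for tracking data
          if gesture_energy > 0.9:                # "particular actions of the dancers"
              autonomy = 0.0
          synth_param = blended_control(gesture_energy, t, autonomy)
          print(f"t={t:.1f}s  gesture={gesture_energy:.2f}  param={synth_param:.2f}")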

• 33 1/3 - Music for Turntables, Cylindrical Drawings and Video Tracking System

  10:30

  from Be Johnny

  This (nearly retro) performance is from Johnny's early career at iEAR Studios, Rensselaer. Composed and performed in 1996, 33 1/3 involved drawing a series of scores onto paper strips wrapped around cylinders. The drawings were spun on a standard turntable and tracked with video analysis, converting the drawn gestures into MIDI data to create sound (a minimal sketch of this idea appears below). Johnny also mixes found audio collage and effects on a second turntable. This is one of the earliest known examples of using live video tracking software to create music. The software used was STEIM's Big Eye (shortly after its release) running on a PowerPC 7200 AV. Other gear used: Kurzweil K2000, two Technics turntables, Mackie 1202 audio mixer, Korg modular effects pedals, ink pens, and of course paper. Video documentation was shot at an MFA performance at Bennington College and mixed live on a WJMX-30 by John J.A. Jannone (projected on stage during the performance). Originally recorded on 3/4" SP tape, later digitized to miniDV, and now remastered for 720p.
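  STEIM's Big Eye is long out of circulation, so the Python fragment below is only a rough modern sketch of the same idea: it reads a camera with OpenCV and converts the horizontal position of dark ink on a single scanline into MIDI notes with mido. Both libraries, the scanline approach, and the note mapping are stand-ins of mine, not details of Johnny's setup; running it also requires a camera and a MIDI backend such as python-rtmidi.

      import cv2          # assumption: OpenCV for camera capture
      import mido         # assumption: mido for MIDI output
      import numpy as np

      out = mido.open_output()            # default MIDI output port
      cap = cv2.VideoCapture(0)           # camera pointed at the spinning drawing
      last_note = None

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          band = gray[gray.shape[0] // 2]          # one scanline across the drawing
          x = int(np.argmin(band))                 # darkest pixel = ink position
          note = 36 + int(48 * x / len(band))      # map horizontal position to pitch
          if band[x] < 64 and note != last_note:   # only trigger on actual ink
              if last_note is not None:
                  out.send(mido.Message('note_off', note=last_note))
              out.send(mido.Message('note_on', note=note, velocity=100))
              last_note = note
      cap.release()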

• Aleph - documentary

  09:22

  from Smith

  222 Plays / 4 Comments

  This is video documentation of an installation that I did at the Croft in Bristol in February 2009; it was the major project for my MA in creative music technology. Update: the final write-up of the project, with all the files etc., is available at smithaudio.wordpress.com/

• HUMAN TRACKING

  09:14

  from Martin Hug

  29 Plays / 0 Comments

  Part 3. Presentation of the residency at GRANER, Centre for Creation of the Body and Movement, Barcelona, 22 December 2013. Dance: Juschka Weigel. Concept & programming: Martin Hug. A vigilant camera (IP camera) observes the movements of a dancer, and its data is analyzed with RoboRealm, a machine vision software environment. From there, data is sent via OSC to SuperCollider, where correlated sound is produced (see the sketch below). The residency ran from 8 to 22 December 2013 at GRANER in Barcelona.
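  In the piece itself, RoboRealm transmits the OSC messages directly. Purely to illustrate the sending side of such a pipeline, here is a minimal Python sketch using the python-osc library (an assumption, not part of Hug's setup), targeting sclang's default UDP port 57120 with a hypothetical /tracker address:

      import time
      import random
      from pythonosc.udp_client import SimpleUDPClient  # assumption: python-osc

      # SuperCollider's language (sclang) listens on UDP port 57120 by default.
      client = SimpleUDPClient("127.0.0.1", 57120)

      while True:
          # Stand-in for RoboRealm's tracking output: normalized x/y of the dancer.
          x, y = random.random(), random.random()
          client.send_message("/tracker", [x, y])   # /tracker is a hypothetical address
          time.sleep(1 / 30)                        # ~30 fps tracking rate

  On the SuperCollider side, an OSCdef listening on /tracker would then map the incoming x/y values onto synthesis parameters.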
