• Merlin's Mustache_DIY CHARM - The modern way of receiving Matsu's blessing

  02:34

  from Merlin's Mustache / Added

  116 Plays / 0 Comments

  The Dajia Matsu Pilgrimage is the most important religious event in Taiwan and is recognized by the Discovery Channel as one of the world's top three religious events. Every spring, Matsu visits all of her temples and brings worshippers blessings; it is a communication between people and the goddess, and also among people. T-Cat Delivery Service has taken part in the Dajia Matsu Pilgrimage for 12 years, providing worshippers with showers and free luggage delivery along the way. In 2015, Merlin's Mustache Lab joined T-Cat in the Pilgrimage by hosting an interactive DIY CHARM at the T-Cat service station. This new experience of praying through digital interaction lets worshippers customize their own charm: they write their prayers in the air, enabled by hand tracking; the handwriting is turned into a graphic, saved to the RFID tag in the charm, and embroidered onto the charm. When the worshippers receive their customized charm, they activate it on a digital censer and watch their wish glorified in a hologram. They then carry the activated charm, and Matsu's blessing, with them everywhere. (A minimal sketch of the air-writing capture step follows the captions below.)
  1. Writing prayers using hand tracking
  2. Prayer is digitally embroidered on site
  3. Charm with customized RFID tag to view prayers
  4. Tailor-made charm
  5. Watching prayers activated in hologram
  6. The top wish - safety
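  As a rough illustration of the air-writing step described above, the hypothetical sketch below rasterizes a stream of tracked hand positions into a simple black-and-white graphic of the kind that could be handed to an embroidery step. The tracker, coordinate ranges, and output format are assumptions, not details from the installation.

```python
# Hypothetical sketch: rasterize hand-tracked "air writing" into a graphic.
# Assumes hand positions arrive as normalized (x, y) points in [0, 1];
# the installation's actual tracker, stroke segmentation, and embroidery
# format are not documented here.
from PIL import Image, ImageDraw

def strokes_to_graphic(strokes, size=512, line_width=6):
    """strokes: list of strokes, each a list of (x, y) points in [0, 1]."""
    img = Image.new("L", (size, size), color=255)        # white canvas
    draw = ImageDraw.Draw(img)
    for stroke in strokes:
        pixels = [(x * size, y * size) for x, y in stroke]
        if len(pixels) > 1:
            draw.line(pixels, fill=0, width=line_width)  # black "ink"
    return img

if __name__ == "__main__":
    # Two fake strokes standing in for live hand-tracking data.
    demo = [[(0.2, 0.8), (0.3, 0.4), (0.4, 0.8)],
            [(0.25, 0.6), (0.35, 0.6)]]
    strokes_to_graphic(demo).save("prayer.png")
```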

• Mixed Reality Handtracking Game

  00:36

  from Iris Seidinger / Added

  11 Plays / 0 Comments

  A mixed reality game in which the player has to find out which angler catches the fish and tap or knock on the corresponding side of the tablet. The project consists of two applications:
  - a PC application that tracks the player's hand and the device
  - an Android application that requests the hand position from the PC application whenever it recognizes noises and shakes (this request/response split is sketched below)
  This project was part of an assignment for Mixed Reality Interaction in the postgraduate programme Multimedia Technology at the University of Applied Sciences Salzburg.
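  The two-application split described above is essentially a small request/response protocol: the PC side owns the tracking data, and the tablet side asks for it when it detects a knock or shake. The sketch below is a hypothetical, minimal Python version of that split; the real project's protocol, port, and message format are not documented here, and its client is an Android app rather than a script.

```python
# Hypothetical sketch of the PC/tablet split: the "PC" process serves the
# latest tracked hand position, and the "tablet" process requests it on demand.
import json
import socket
import threading
import time

HAND = {"x": 0.5, "y": 0.5}     # stand-in for live hand-tracking output

def pc_server(host="0.0.0.0", port=5005):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        with conn:
            conn.recv(16)                           # any request byte triggers a reply
            conn.sendall(json.dumps(HAND).encode())

def request_hand_position(host="127.0.0.1", port=5005):
    """Called by the tablet side when it recognizes a knock or shake."""
    with socket.create_connection((host, port), timeout=1.0) as conn:
        conn.sendall(b"?")
        return json.loads(conn.recv(1024).decode())

if __name__ == "__main__":
    threading.Thread(target=pc_server, daemon=True).start()
    time.sleep(0.2)                                 # give the server time to start
    print(request_hand_position())                  # {'x': 0.5, 'y': 0.5}
```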

• Gesture Controlled Battleship Game with Haptic Feedback (DT2140)

  00:39

  from Guy Incognito / Added

  For the course DT2140, we created a simple Battleship prototype using a Leap Motion controller for gesture controls and a DIY haptic glove for haptic feedback. The glove vibrates whenever the player scores a hit. We implemented two game modes. The one shown in this video is the classic turn-based mode. In the second mode, the haptic glove vibrates at different intensities depending on how close the cursor is to an enemy ship, so the player can "feel out" the location of the enemy ships, similar to a sonar. Since the game would be too easy this way, the computer drops bombs at certain intervals instead of waiting for its turn.
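  The "sonar" mode described above comes down to mapping the distance between the cursor and the nearest hidden ship to a vibration intensity. The snippet below is a hypothetical sketch of such a mapping; the grid size, falloff curve, and intensity scale are assumptions, not values from the project.

```python
# Hypothetical sketch of the "sonar" mode: nearer ships -> stronger vibration.
import math

def nearest_ship_distance(cursor, ships):
    """cursor: (x, y) in grid cells; ships: list of occupied (x, y) cells."""
    return min(math.dist(cursor, cell) for cell in ships)

def vibration_intensity(cursor, ships, max_range=10.0):
    """Return a motor intensity in [0, 1]; 1.0 when the cursor is over a ship."""
    distance = nearest_ship_distance(cursor, ships)
    return max(0.0, 1.0 - distance / max_range)      # simple linear falloff (assumed)

if __name__ == "__main__":
    ships = [(2, 3), (2, 4), (7, 7)]
    for cursor in [(2, 3), (4, 4), (9, 0)]:
        print(cursor, round(vibration_intensity(cursor, ships), 2))
```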

• Wildfang - Kinect Handtracking Game

  01:18

  from LauRa / Added

  19 Plays / 0 Comments

  This is a little prototype of an interaction game, created during the 3rd semester of my design studies. The video shows the interaction between the player and the animals. The mouse cursor in this video is actually the player's hand position, tracked with a Microsoft Kinect.
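  Driving a cursor from a tracked hand, as described above, usually amounts to mapping the hand's position in camera space to screen coordinates. The sketch below shows one hypothetical way to do that with a "comfort box" and simple smoothing; the coordinate ranges, smoothing factor, and any Kinect SDK specifics are assumptions rather than details of this prototype.

```python
# Hypothetical sketch: map a tracked hand position to a screen cursor.
# Assumes the hand position arrives in metres in a camera-centred frame;
# the prototype's actual Kinect pipeline and mapping are not shown here.
SCREEN_W, SCREEN_H = 1920, 1080
# A "comfort box" in front of the sensor that maps onto the full screen.
BOX_X = (-0.30, 0.30)   # metres (assumed)
BOX_Y = (-0.20, 0.25)   # metres (assumed)

def _normalize(value, lo, hi):
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def hand_to_cursor(hand_x, hand_y, prev=None, smoothing=0.7):
    """Return (px, py) screen coordinates, optionally blended with the
    previous cursor position to reduce jitter."""
    px = _normalize(hand_x, *BOX_X) * SCREEN_W
    py = (1.0 - _normalize(hand_y, *BOX_Y)) * SCREEN_H   # screen y grows downwards
    if prev is not None:
        px = smoothing * prev[0] + (1 - smoothing) * px
        py = smoothing * prev[1] + (1 - smoothing) * py
    return px, py

if __name__ == "__main__":
    print(hand_to_cursor(0.0, 0.0))    # near the middle of the screen
    print(hand_to_cursor(0.3, 0.25))   # top-right corner
```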

• AR Musical Fingers Dance Test

  04:02

  from Jeremy Bailey / Added

  337 Plays / 0 Comments

  Audio-reactive AR fingers test (using Leap Motion V2 with Processing + Max).
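  "Audio reactive" here presumably means that an audio feature drives the visuals. The sketch below is only a generic illustration of that idea in Python, mapping the RMS loudness of an audio frame to a scale factor that could drive the finger graphics; it is not the actual Processing/Max patch.

```python
# Generic sketch of an audio-reactive parameter: frame loudness -> scale factor.
# The real piece uses Processing + Max; this only illustrates the idea.
import math

def rms(samples):
    """Root-mean-square loudness of one audio frame (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def scale_from_audio(samples, base=1.0, gain=3.0):
    """Map loudness to a visual scale factor for the AR fingers (gain assumed)."""
    return base + gain * rms(samples)

if __name__ == "__main__":
    quiet = [0.01 * math.sin(i / 5.0) for i in range(512)]
    loud = [0.80 * math.sin(i / 5.0) for i in range(512)]
    print(round(scale_from_audio(quiet), 3), round(scale_from_audio(loud), 3))
```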

• George Batchelor Showreel April 2014

  01:56

  from George Batchelor / Added

  113 Plays / 0 Comments

  My April reel after finishing another project and working some more on my major project. Breakdown:

  0:06 The Iron Man Game (Final Major Project). This is an individual project that I'm still working on. The first two levels are in the video, which demonstrates some of the gameplay at the moment, and I'm getting more of the character-switching element of the game in.

  0:38 Space Face (C++ programming project in a group of five). I personally did the Kinect hand tracking, the UI and general game flow, the collision detection, the audio, and integration. More on it here: http://georgebatchelor.com/2014/04/19/space-face/

  0:59 Procedural Solar System (Innovations Project). This program generates solar systems, each consisting of a star orbited by up to five planets. The atmosphere of each planet depends on its distance from the star and its mass, both of which are generated with weighted random factors, using data from real exoplanet research to influence the random weighting. The end result is a program in which the user can generate different systems endlessly and explore the scene. This was made with Unity and programmed in C#. More can be read about it here: http://georgebatchelor.com/2014/03/07/how-do-you-organise-a-party-in-space/

  1:20 AI Traffic System (EA Ghost Games Masterclass Project). This project was set by EA Ghost Games, and the task was to create a scalable, believable traffic system in Maya using a Python script that demonstrated a certain number of behaviours. The behaviours are listed in the video. See this post for more info: http://georgebatchelor.com/2014/03/06/traffic-ai-finished/

  1:29 Giraffic Park (Programming Project). This project was written using Qt in C++. The challenges in this project came from using a very low-level framework and having to load model-view matrices to the shader and physically draw everything in the correct way. The artwork for the facts in the game is the result of a collaborative effort from family and friends. Read this for more information: http://georgebatchelor.com/2013/03/30/giraffic-park/

  The music is Stay Lit by Holy Fuck.
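  The Procedural Solar System entry above describes drawing planet properties with weighted randomness informed by exoplanet data. The original is a Unity/C# project; the sketch below is a hypothetical Python illustration of weighted random generation, with invented bins and weights standing in for the research-derived ones.

```python
# Hypothetical illustration of weighted random planet generation. The bins,
# weights, and "atmosphere" rule are invented; the real project derived its
# weighting from exoplanet research and was written in C# for Unity.
import random

MASS_BINS = [0.1, 1.0, 10.0, 100.0]        # Earth masses (assumed bins)
MASS_WEIGHTS = [0.50, 0.30, 0.15, 0.05]    # lighter planets more likely (assumed)

def generate_system(max_planets=5, seed=None):
    rng = random.Random(seed)
    planets = []
    for i in range(rng.randint(1, max_planets)):
        mass = rng.choices(MASS_BINS, weights=MASS_WEIGHTS, k=1)[0]
        distance_au = 0.4 * (1.6 ** i) * rng.uniform(0.8, 1.2)       # rough spacing
        # Toy rule: heavier planets closer to the star get thicker atmospheres.
        atmosphere = min(1.0, mass / 10.0) * min(1.0, 1.0 / distance_au)
        planets.append({"mass": mass,
                        "distance_au": round(distance_au, 2),
                        "atmosphere": round(atmosphere, 2)})
    return planets

if __name__ == "__main__":
    for planet in generate_system(seed=42):
        print(planet)
```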

• Spotlight

  00:35

  from Bravo Media Inc / Added

  41 Plays / 0 Comments

  Demo of our transparent display box in conjunction with wireless hand tracking technology, to create a compelling interactive experience.

• Head of the Order : teaser trailer

  00:51

  from Unicorn Forest Games / Added

  318 Plays / 0 Comments

  Head of the Order is a gesture-based spell-casting game developed by the two-person team of Jacob & Melissa Pennock. Together they make up the tiny indie studio Unicorn Forest Games. The tech demo for this game recently won Best Game in the Intel Perceptual Computing Challenge. We are currently working towards a full game release.

• >abunchofexperiments> Point and Shout

  00:41

  from rux / Added

  Quick sketch exploring onomatopoeias for expressive voice recognition systems, together with hand and finger tracking, as a base for playful interactions. Made at the NYC Perceptual Hackathon using Intel's Perceptual Computing SDK + OpenFrameworks. www.rux-werx-here.net
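  The interaction described above pairs a recognized onomatopoeia with where the finger is pointing. The sketch below is a hypothetical Python stand-in for that pairing; the keyword set and effect names are invented, and the original was built with Intel's Perceptual Computing SDK and OpenFrameworks rather than Python.

```python
# Hypothetical sketch: pair a recognized onomatopoeia with a fingertip position
# to trigger a playful effect at that spot. The keyword set and effect names
# are invented for illustration.
ONOMATOPOEIA_EFFECTS = {
    "pow": "burst",
    "zap": "lightning",
    "boing": "bounce",
}

def on_voice_event(word, fingertip):
    """word: recognized speech token; fingertip: (x, y) screen position."""
    effect = ONOMATOPOEIA_EFFECTS.get(word.lower())
    if effect is None:
        return None                                  # ignore ordinary words
    return {"effect": effect, "at": fingertip}

if __name__ == "__main__":
    print(on_voice_event("Pow", (412, 230)))         # {'effect': 'burst', 'at': (412, 230)}
    print(on_voice_event("hello", (412, 230)))       # None
```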

• Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data, ICCV 2013

  04:02

  from Srinath Sridhar / Added

  20 Plays / 0 Comments

  ICCV 2013 paper.
  Title: Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data
  Authors: Srinath Sridhar, Antti Oulasvirta, Christian Theobalt (MPI Informatik and Saarland University)
  Abstract: Tracking the articulated 3D motion of the hand has important applications, for example, in human-computer interaction and teleoperation. We present a novel method that can capture a broad range of articulated hand motions at interactive rates. Our hybrid approach combines, in a voting scheme, a discriminative, part-based pose retrieval method with a generative pose estimation method based on local optimization. Color information from a multi-view RGB camera setup along with a person-specific hand model are used by the generative method to find the pose that best explains the observed images. In parallel, our discriminative pose estimation method uses fingertips detected on depth data to estimate a complete or partial pose of the hand by adopting a part-based pose retrieval strategy. This part-based strategy helps reduce the search space drastically in comparison to a global pose retrieval strategy. Quantitative results show that our method achieves state-of-the-art accuracy on challenging sequences and near-realtime performance of 10 fps on a desktop computer.
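  The abstract above describes fusing a generative, locally optimized pose estimate with a discriminative, fingertip-driven one through a voting scheme. The toy sketch below only illustrates the general idea of scoring competing pose hypotheses against the observations and keeping the best one; it is not the paper's actual voting scheme or error function.

```python
# Toy illustration of selecting among pose hypotheses from two estimators.
# The scoring function and "poses" are invented; see the paper for the real
# voting scheme.
def select_pose(hypotheses, score_fn, observations):
    """hypotheses: candidate poses (e.g., joint-angle vectors);
    score_fn(pose, observations) -> lower is better (e.g., image error)."""
    return min(hypotheses, key=lambda pose: score_fn(pose, observations))

if __name__ == "__main__":
    # 1-D stand-ins: one pose from the generative tracker, one from the
    # discriminative fingertip-based tracker, and a toy "observation".
    generative_pose = 0.9
    discriminative_pose = 1.4
    observation = 1.2
    error = lambda pose, obs: abs(pose - obs)
    print(select_pose([generative_pose, discriminative_pose], error, observation))  # 1.4
```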

