1. String® Augmented Reality Teaser

    01:59

    from String®

    45K Plays / 6 Comments

    We took String®, our new Augmented Reality technology running on iPhone, out onto the streets of East London. This is the public's reaction to their first glimpse of AR, and to a giant virtual sneaker in the middle of the street, of course. More videos with new content are coming soon. For more information, go to http://www.poweredbystring.com CREDITS: Soundtrack: "The Daydream" by Tycho | http://www.tychomusic.com

    • Difflect

      02:56

      from Stefan Wagner

      1,972 Plays / 4 Comments

      Difflect is a project connecting print, spatial installation and software, making it possible to experience print media in a digitally enhanced, interactive way. The setup consists of two cameras and a display that shows a mirror-like, abstract representation of the user. The system is fed with content by placing a printed object in front of it: it recognizes the medium the user is consuming and displays digital content related to the printed information. To introduce the system, the user can choose from four foldable printed objects describing the four classical elements. Each print object shows a digital representation of its element when placed in front of the installation, letting the user experience the characteristics and behaviour of that element through hands-on interaction. The entire project was done in openFrameworks, using the OpenNI library for Kinect support. For more information, have a look here: http://andsynchrony.net/projects/difflect/ Music for the clip by Stefan Wagner

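The original installation was built with openFrameworks and OpenNI, as the description notes. Purely as an illustrative sketch of the recognition step (deciding which known print is in front of the camera), one way to do it in Python with OpenCV is feature matching against reference images of each design. The file names, thresholds and overall structure below are assumptions, not the project's actual code.

# Illustrative sketch (not the Difflect code): decide which known print object
# is in front of the camera by matching ORB features of the live frame against
# reference images of each printed design.
import cv2

REFERENCES = {                      # assumed file names for the four "element" prints
    "fire": "fire_print.png",
    "water": "water_print.png",
    "earth": "earth_print.png",
    "air": "air_print.png",
}

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Precompute descriptors for every reference print.
ref_descriptors = {}
for name, path in REFERENCES.items():
    ref_img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, ref_descriptors[name] = orb.detectAndCompute(ref_img, None)

def identify_print(frame_gray, min_matches=40):
    """Return the name of the best-matching reference print, or None."""
    _, des = orb.detectAndCompute(frame_gray, None)
    if des is None:
        return None
    best_name, best_count = None, 0
    for name, ref_des in ref_descriptors.items():
        matches = matcher.match(ref_des, des)
        good = [m for m in matches if m.distance < 40]   # crude distance threshold
        if len(good) > best_count:
            best_name, best_count = name, len(good)
    return best_name if best_count >= min_matches else None

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    element = identify_print(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if element:
        print("Detected print:", element)   # the installation would swap in that element's content here
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) == 27:                # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
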
      • Markerless Tracking Augmented Reality on a 1965 Leopard Battle Tank

        01:12

        from Bronze Software Labs

        146 Plays / 0 Comments

        We went to RAF Cosford to test a technology we have been working on: markerless augmented reality that uses natural feature tracking to follow real objects in real time. We tested it on a 1965 Leopard battle tank using open-source CAD data. The technology builds a 3D representation of the object, which can then be used to show and instruct complex and routine maintenance procedures, such as how to disassemble a part, or to display a breakdown of information and service data on subsystems. It lets us track the object from any angle and at any scale. The possibilities for using this are endless, and it is a vital aid for guiding and tutoring people in real time, right in front of them.

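As a hedged sketch of one core step behind this kind of markerless tracking (recovering an object's pose from correspondences between 2D image features and 3D points known from CAD data), the Python/OpenCV functions below illustrate the idea. This is not Bronze Software Labs' implementation; the function names and parameters are made up for illustration.

# Illustrative sketch of one pose-estimation step used in this style of
# markerless tracking: given matches between 2D image features and 3D points
# on the object's CAD model, recover the object's pose relative to the camera
# so that 3D overlays (part callouts, maintenance steps) can be drawn in place.
import numpy as np
import cv2

def estimate_pose(object_points_3d, image_points_2d, camera_matrix, dist_coeffs):
    """object_points_3d: Nx3 points on the CAD model (object coordinates).
    image_points_2d: Nx2 pixel locations of the matching features in the frame.
    camera_matrix / dist_coeffs: from a prior camera calibration."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(object_points_3d, dtype=np.float32),
        np.asarray(image_points_2d, dtype=np.float32),
        camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE)
    return (rvec, tvec) if ok else None

def project_overlay(points_3d, rvec, tvec, camera_matrix, dist_coeffs):
    """Project 3D annotation points (e.g. the outline of a part to highlight)
    into the current image using the estimated pose."""
    pts, _ = cv2.projectPoints(np.asarray(points_3d, dtype=np.float32),
                               rvec, tvec, camera_matrix, dist_coeffs)
    return pts.reshape(-1, 2)

The 2D-3D matches themselves would come from the natural-feature pipeline (detecting features in the live frame and matching them against features registered to the model), which is the hard part the video is demonstrating.
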
        • HumanTop: a multi-object tracking tabletop

          04:54

          from Labhuman

          43 Plays / 0 Comments

          This project introduces HumanTop, a computer-vision-based interactive multi-touch tabletop system. HumanTop implements a stereo-camera vision subsystem that enables not only accurate fingertip tracking but also precise detection of touches over the working surface.

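HumanTop's actual algorithms are not given in the description; as a rough illustration of the two ingredients it mentions (fingertip tracking and touch-over-the-surface detection), a depth-based sketch in Python/OpenCV might look like the following. The calibration constants, thresholds and segmentation approach are assumptions, not the system's method.

# Illustrative sketch (OpenCV 4 / Python, not the HumanTop code): segment the
# hand as everything hovering above a calibrated table depth, take convex-hull
# vertices of the hand contour as crude fingertip candidates, and call a
# fingertip a "touch" when it is within a few millimetres of the table.
import numpy as np
import cv2

TABLE_DEPTH_MM = 900.0            # assumed calibrated camera-to-tabletop distance
HAND_BAND_MM = (5.0, 150.0)       # hand pixels sit this far above the table
TOUCH_MM = 8.0                    # closer than this to the table counts as a touch

def find_fingertips(depth_mm):
    """depth_mm: float depth image in millimetres.
    Returns a list of (x, y, touching) fingertip candidates."""
    height = TABLE_DEPTH_MM - depth_mm
    hand_mask = ((height > HAND_BAND_MM[0]) & (height < HAND_BAND_MM[1]))
    hand_mask = hand_mask.astype(np.uint8) * 255
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    tips = []
    for contour in contours:
        if cv2.contourArea(contour) < 2000:          # ignore small noise blobs
            continue
        hull = cv2.convexHull(contour)
        for x, y in hull.reshape(-1, 2):
            touching = height[int(y), int(x)] < TOUCH_MM
            tips.append((int(x), int(y), bool(touching)))
    return tips

A real system, and HumanTop's stereo pipeline in particular, does considerably more (filtering hull vertices down to actual fingertips, tracking them over time, deriving depth from the stereo pair); this only shows the depth-thresholding idea.
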
          • Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data, ICCV 2013

            04:02

            from Srinath Sridhar

            20 Plays / 0 Comments

            ICCV 2013 paper. Title: Interactive Markerless Articulated Hand Motion Tracking Using RGB and Depth Data
            Authors: Srinath Sridhar, Antti Oulasvirta, Christian Theobalt (MPI Informatik and Saarland University)
            Abstract: Tracking the articulated 3D motion of the hand has important applications, for example, in human-computer interaction and teleoperation. We present a novel method that can capture a broad range of articulated hand motions at interactive rates. Our hybrid approach combines, in a voting scheme, a discriminative, part-based pose retrieval method with a generative pose estimation method based on local optimization. Color information from a multi-view RGB camera setup, along with a person-specific hand model, is used by the generative method to find the pose that best explains the observed images. In parallel, our discriminative pose estimation method uses fingertips detected on depth data to estimate a complete or partial pose of the hand by adopting a part-based pose retrieval strategy. This part-based strategy helps reduce the search space drastically in comparison to a global pose retrieval strategy. Quantitative results show that our method achieves state-of-the-art accuracy on challenging sequences and near-real-time performance of 10 fps on a desktop computer.

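As a structural sketch (not the authors' code) of the hybrid scheme the abstract describes, the toy Python below shows only the overall flow: a discriminative estimator proposes candidate poses, a generative step refines them locally, and the candidate that best explains the observation is kept. Every component here, including the pose representation and error model, is a placeholder assumption.

# Toy structural sketch of the hybrid discriminative + generative scheme:
# propose candidates, refine locally, keep the pose with the lowest
# observation error. All components are placeholders for illustration.
import numpy as np

POSE_DOF = 26                      # assumed hand-pose dimensionality, for illustration

def discriminative_candidates(observation):
    # Placeholder for the part-based pose retrieval from detected fingertips.
    return [np.random.randn(POSE_DOF) for _ in range(5)]

def generative_refine(pose, observation):
    # Placeholder for local optimization against the multi-view RGB images.
    return pose

def observation_error(pose, observation):
    # Placeholder "how well does this pose explain the data" score.
    return float(np.linalg.norm(pose - observation))

def track_frame(observation, previous_pose):
    candidates = discriminative_candidates(observation) + [previous_pose]
    refined = [generative_refine(p, observation) for p in candidates]
    return min(refined, key=lambda p: observation_error(p, observation))

pose = np.zeros(POSE_DOF)
for _ in range(10):                # toy tracking loop over 10 "frames"
    pose = track_frame(np.random.randn(POSE_DOF), pose)
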
