Integra Lab

  1. esthesis explores the confluence of physical gesture, sound and visuals. The user generates a click, which is then captured by esthesis. Through physical gesture the initial impulse is sonically manipulated and visualised, allowing the user to extrapolate the sound and explore the synergistic relationship between their movements, the sounds heard and the visuals seen. A rough sketch of this gesture-to-sound idea appears after the video list below.

    esthesis was funded by the Arts and Humanities Research Council as part of the Transforming Transformation project by Birmingham Conservatoire’s Integra Lab. For more information about the project see: integra.io/transforming-transformation/

    Please use headphones to listen to this!

    vimeo.com/168771286
  2. This is the fourth experiment in a 1-year AHRC Digital Transformations project run jointly by Birmingham Conservatoire’s Integra Lab and Glasgow School of Art’s Digital Design Studios. The project explores new ways of positioning sound in space using touch-free interaction.
    In this experiment we use the Microsoft Kinect as a motion capture device, enabling sounds to be dragged from a palette into a virtual 3D environment and moved around using direct physical manipulation.
    Additionally, the system lets the user draw spatial trajectories with a simple finger movement and automate the movement of sounds along those trajectories.
    This initial experiment was developed in Unity 3D using the 3Dception plugin by TwoBigEars for binaural positioning. A simplified sketch of the trajectory idea appears after the video list below.

    See the project website for further info: http://integra.io/transforming-transformation

    vimeo.com/152990816
  3. vimeo.com/152834125
  4. First test of the upcoming Eurorack modules for Integra Live.

    vimeo.com/146121551
  5. A brief exploration of pose and gesture recognition using Kinect 2 skeleton tracking and various machine learning techniques in Max/MSP/Jitter. A library-agnostic sketch of the pose-classification idea appears after the video list below.

    UPDATE: I think it was around 4am when I recorded this video. In retrospect I realize I was waffling quite a bit.

    src at github.com/memo/max-skeletonML

    using:

    dp.kinect2 - hidale.com/shop/dp-kinect2/
    Max external to interface with the Kinect2 using the official MS Kinect SDK v2

    Gesture Recognition Toolkit (GRT) - nickgillian.com/software/grt
    C++ library for machine learning with a focus on gesture recognition

    ml-lib - github.com/cmuartfab/ml-lib
    Max wrappers for the GRT

    vimeo.com/122166652
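
As a rough illustration of the gesture-to-sound idea described for esthesis (item 1), the sketch below maps a single normalised hand-position value to the delay time and feedback applied to a captured click, extending the initial impulse into a longer tail. It is a minimal Python stand-in, not the actual esthesis implementation; the sample rate, parameter ranges and function names are assumptions made purely for illustration.

```python
import numpy as np

SR = 44100  # assumed sample rate in Hz

def capture_click(length=64):
    """Stand-in for the captured impulse: a short decaying click."""
    click = np.zeros(length)
    click[0] = 1.0
    return click * np.exp(-np.arange(length) / 16.0)

def gesture_to_params(hand_height):
    """Map a normalised hand height (0..1) to delay time and feedback.
    The mapping itself is an assumption, purely for illustration."""
    delay_sec = 0.05 + 0.45 * hand_height   # 50 ms .. 500 ms
    feedback = 0.2 + 0.7 * hand_height      # higher hand -> longer tail
    return int(delay_sec * SR), feedback

def manipulate(click, hand_height, duration_sec=2.0):
    """Extend the captured impulse into a feedback-delay tail
    whose character follows the gesture."""
    delay_samps, feedback = gesture_to_params(hand_height)
    out = np.zeros(int(duration_sec * SR))
    out[:len(click)] += click
    for i in range(delay_samps, len(out)):
        out[i] += feedback * out[i - delay_samps]
    return out

if __name__ == "__main__":
    tail = manipulate(capture_click(), hand_height=0.8)
    print("peak:", tail.max(), "rms:", np.sqrt((tail ** 2).mean()))
```

In the real system the gesture data would arrive continuously from a sensor and drive both the audio processing and the visuals; here a single static value stands in for that stream.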
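
The spatialisation experiment in item 2 was built in Unity 3D with the 3Dception plugin, so the sketch below is not its code. It only illustrates, in plain Python, the underlying idea of resampling a finger-drawn trajectory into evenly spaced positions and deriving per-ear gains as a sound moves along it. The crude equal-power pan with 1/distance attenuation stands in for real binaural rendering, and all coordinates and names are assumptions.

```python
import numpy as np

def resample_trajectory(points, n_steps):
    """Interpolate a hand-drawn trajectory (list of 3D points)
    into n_steps evenly spaced positions along its length."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)   # segment lengths
    dist = np.concatenate([[0.0], np.cumsum(seg)])          # cumulative arc length
    targets = np.linspace(0.0, dist[-1], n_steps)
    return np.column_stack(
        [np.interp(targets, dist, points[:, k]) for k in range(3)]
    )

def pan_gains(position, listener=np.zeros(3)):
    """Very crude stand-in for binaural positioning: equal-power pan
    from the source's azimuth plus 1/distance attenuation."""
    rel = position - listener
    azimuth = np.arctan2(rel[0], rel[2])              # x = right, z = front
    pan = 0.5 * (1.0 + np.clip(azimuth / (np.pi / 2), -1.0, 1.0))
    distance = max(np.linalg.norm(rel), 0.1)
    left = np.cos(pan * np.pi / 2) / distance
    right = np.sin(pan * np.pi / 2) / distance
    return left, right

if __name__ == "__main__":
    # a finger-drawn arc passing from front-left to front-right of the listener
    drawn = [(-2.0, 1.5, 2.0), (0.0, 1.5, 3.0), (2.0, 1.5, 2.0)]
    for pos in resample_trajectory(drawn, 5):
        print(pos.round(2), [round(g, 3) for g in pan_gains(pos)])
```

Automating the movement then amounts to stepping through the resampled positions at a chosen rate and feeding each one to the spatialiser.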
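
The pose and gesture recognition in item 5 is done with dp.kinect2, the GRT and ml-lib inside Max; the sketch below does not use those tools. It is only a library-agnostic illustration of the basic idea of classifying a static pose from skeleton joint positions, here with a 1-nearest-neighbour rule over normalised joints in plain Python. The three-joint skeleton, the normalisation and the example poses are assumptions for illustration.

```python
import numpy as np

def normalise_skeleton(joints):
    """Make a pose translation- and scale-invariant:
    centre on the first joint (e.g. spine base) and divide by overall spread."""
    joints = np.asarray(joints, dtype=float)
    centred = joints - joints[0]
    scale = np.linalg.norm(centred) or 1.0
    return (centred / scale).ravel()

class NearestPose:
    """Minimal 1-nearest-neighbour pose classifier over recorded examples."""

    def __init__(self):
        self.examples = []   # list of (feature_vector, label)

    def record(self, joints, label):
        self.examples.append((normalise_skeleton(joints), label))

    def classify(self, joints):
        query = normalise_skeleton(joints)
        dists = [np.linalg.norm(query - feat) for feat, _ in self.examples]
        return self.examples[int(np.argmin(dists))][1]

if __name__ == "__main__":
    # toy 3-joint "skeletons": spine base, left hand, right hand (x, y, z)
    t_pose = [(0, 0, 0), (-1, 1, 0), (1, 1, 0)]
    hands_up = [(0, 0, 0), (-0.3, 2, 0), (0.3, 2, 0)]
    clf = NearestPose()
    clf.record(t_pose, "t-pose")
    clf.record(hands_up, "hands-up")
    print(clf.classify([(0, 0, 0), (-0.9, 1.1, 0.1), (1.0, 0.9, 0.0)]))  # -> t-pose
```

Dynamic gestures would additionally need the temporal dimension (for example dynamic time warping or an HMM over sequences of such feature vectors), which the GRT provides and this sketch deliberately omits.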

Integra Lab

Jamie Bullock

A channel showcasing R & D work from the Integra Lab at Birmingham Conservatoire
