SN: I wanted the closest possible relationship between my hands and the resulting sound. Having worked with sampling, complex processing, and various sensors such as EMG, motion capture with live sound as the source seemed a way to get inside an improvisation system that was truly live and intuitive. You can judge for yourselves!

NG: Sarah's movements are sensed using a Kinect 3D motion capture device, and her gestures are recognised in real time using the SEC, a machine-learning toolbox developed specifically for musician-computer interaction.
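
The SEC itself is not shown here, but as a rough illustration of the pipeline NG describes (Kinect skeleton frames in, gesture labels out in real time), here is a minimal sketch in Python. It is not the SEC's API: the joint layout, training examples, and rejection threshold are all hypothetical, and a simple one-nearest-neighbour classifier stands in for the toolbox's machine-learning models.

    # Illustrative sketch only: 1-NN classification of hypothetical
    # Kinect pose frames (flattened joint coordinates) into gesture labels.
    import numpy as np

    class NearestNeighbourGestureClassifier:
        """Label a pose frame by its nearest training example (1-NN,
        Euclidean distance), with an optional rejection threshold."""

        def __init__(self):
            self.examples = []  # list of (feature_vector, label) pairs

        def add_example(self, frame, label):
            self.examples.append((np.asarray(frame, dtype=float), label))

        def classify(self, frame, reject_threshold=None):
            frame = np.asarray(frame, dtype=float)
            dists = [(np.linalg.norm(frame - ex), lab)
                     for ex, lab in self.examples]
            best_dist, best_label = min(dists)
            if reject_threshold is not None and best_dist > reject_threshold:
                return None  # distant frames are treated as "no gesture"
            return best_label

    # Hypothetical usage: two joints, (x, y, z) each -> 6-D frames.
    clf = NearestNeighbourGestureClassifier()
    clf.add_example([0.0, 1.2, 0.5, 0.1, 1.1, 0.5], "hands_raised")
    clf.add_example([0.0, 0.4, 0.5, 0.1, 0.3, 0.5], "hands_lowered")

    incoming_frame = [0.02, 1.15, 0.48, 0.09, 1.08, 0.51]  # one live frame
    print(clf.classify(incoming_frame, reject_threshold=1.0))  # hands_raised

In a live setting this classify step would run once per incoming Kinect frame, with the returned label driving the sound processing.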

=== About the performers:

Sarah Nicolls is a UK-based experimental pianist and inventor of the 'Inside-out piano'. She has collaborated on research with Atau Tanaka and PA Tremblay, among others; her concerts include the world premieres of Larry Goves' Piano Concerto and Richard Barrett's Mesopotamia (London Sinfonietta/BBC Radio). She has published in LMJ20 and is a Senior Lecturer at Brunel University. Her work has been funded by the Arts and Humanities Research Council (AHRC), the Brunel Research Initiative and Enterprise Fund (BRIEF), and Arts Council England.

Nick Gillian is a post-doctoral researcher currently working on an EU project entitled SIEMPRE at the Sonic Arts Research Centre, Belfast. He recently completed his PhD, Gesture Recognition for Musician-Computer Interaction, under the supervision of R. Benjamin Knapp and Síle O'Modhrain. His interests are in machine learning and pattern recognition, and in applying these techniques to enable real-time musician-computer interaction.

=== Recorded at:

11th International Conference on New Interfaces for Musical Expression (NIME 2011), 30 May – 1 June 2011, Oslo, Norway.

nime2011.org
