This is a demonstration video of AHNE (Audio-Haptic Navigation Environment), a user interface that lets the user locate and manipulate sound objects in 3D space, guided by audio and haptic feedback.
The user is tracked with a Kinect sensor using the OpenNI framework and OSCeleton (github.com/Sensebloom/OSCeleton).
The user wears a glove embedded with sensors and a small vibration motor that provides the haptic feedback.
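As a rough illustration of the tracking pipeline, OSCeleton forwards Kinect skeleton data as OSC messages; its `/joint` message carries a joint name, a user id, and x/y/z coordinates (type tags `,sifff`). The sketch below is a minimal, stdlib-only decoder for that message layout, following the OSC 1.0 binary format; the joint name and coordinate values in the demo are made-up examples, not data from AHNE itself.

```python
import struct

def _pad_string(s):
    # OSC strings are ASCII, null-terminated, padded to a multiple of 4 bytes.
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def _read_padded_string(data, offset):
    # Read one null-terminated, 4-byte-padded OSC string.
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    next_offset = offset + ((end - offset) // 4 + 1) * 4
    return s, next_offset

def parse_joint_message(packet):
    """Parse an OSCeleton-style /joint message:
    address "/joint", type tags ",sifff" -> (joint, user_id, x, y, z)."""
    address, off = _read_padded_string(packet, 0)
    if address != "/joint":
        raise ValueError("not a /joint message: " + address)
    tags, off = _read_padded_string(packet, off)
    if tags != ",sifff":
        raise ValueError("unexpected type tags: " + tags)
    joint, off = _read_padded_string(packet, off)
    # Remaining arguments: big-endian int32 user id and three float32 coords.
    user_id, x, y, z = struct.unpack(">ifff", packet[off:off + 16])
    return joint, user_id, x, y, z

# Build a sample packet (hypothetical joint position) and decode it.
packet = (_pad_string("/joint") + _pad_string(",sifff") + _pad_string("r_hand")
          + struct.pack(">ifff", 1, 0.5, 0.3, 2.1))
print(parse_joint_message(packet))
```

In a live setup the packets would arrive over UDP from OSCeleton; this decoder only shows how the per-joint payload is laid out.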
This is the first proof-of-concept demo. More videos are coming soon.
HEI Project 2011
SOPI Research Group
Aalto University School of Art and Design