We present the first integrated application of medical ultrasound imaging to remotely control a virtual hand that plays the piano in real time. Detecting human finger motions and forces plays an important role in teleoperation and virtual reality. Standard data gloves and optical finger-tracking devices can provide reliable finger-movement data, but they must contend with elasticity and occlusion issues, respectively; additionally, both methods may require long and delicate calibration procedures. We have implemented medical ultrasound imaging as a robust means of detecting human finger motions by predicting finger forces. These forces can be predicted individually from forearm cross-section ultrasound images acquired with a simple probe, after a short and easy calibration procedure. Moreover, our method leaves the subject's hand completely free to operate. Our novel HMI is used to play the piano in a virtual environment that combines fast collision detection, a physics engine, and a sound player controlled by the collision forces. Our integrated system can be used as an entertainment device, for rehabilitation, and for relieving phantom-limb pain in amputees.
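To make the control pipeline concrete (per-finger force prediction from ultrasound features, then force-driven key presses and loudness), here is a minimal sketch. The linear per-finger regressor, the press threshold, and the force-to-velocity mapping are all illustrative assumptions standing in for the actual system, not its real implementation.

```python
PRESS_THRESHOLD_N = 1.0  # assumed force threshold for a virtual key press

def predict_finger_forces(features, weights):
    """Stand-in for the ultrasound-based force regressor:
    one assumed linear model per finger."""
    return [sum(w * x for w, x in zip(ws, features)) for ws in weights]

def forces_to_notes(forces, keys):
    """Trigger a note for each finger whose predicted force exceeds
    the threshold; note loudness scales with the collision force."""
    notes = []
    for force, key in zip(forces, keys):
        if force >= PRESS_THRESHOLD_N:
            notes.append((key, min(1.0, force / 5.0)))  # (pitch, velocity)
    return notes

keys = ["C4", "D4", "E4", "F4", "G4"]          # one key per finger
weights = [[0.5, 0.0], [0.0, 0.5], [0.25, 0.0],
           [0.0, 0.0], [1.0, 1.0]]             # assumed regressor weights
features = [4.0, 1.0]                          # assumed ultrasound features
forces = predict_finger_forces(features, weights)
notes = forces_to_notes(forces, keys)
print(notes)  # → [('C4', 0.4), ('E4', 0.2), ('G4', 1.0)]
```

In the actual system the regressor is calibrated per subject and the velocities drive the sound player through the physics engine's contact forces; this sketch only illustrates the data flow.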
This is a draft version of the video presented at IEEE ICRA 2014 in Hong Kong. The original video was a finalist for the Best Video Award at that conference.