The Kinect sensor's IR depth camera and RGB camera were accessed through Processing. The point closest to the sensor yields an (x, y) coordinate pair that controls the signals sent to the remote control. A red dot is drawn on screen to show the user the closest point, which is typically the hand or extended arm, as demonstrated.
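The closest-point search can be sketched as a simple minimum scan over the depth frame. This is a self-contained illustration, not the project's actual code: the 640x480 resolution, the flat millimeter-depth array, and the class and method names are all assumptions.

```java
// Sketch of the closest-point search the Kinect tracking relies on.
// The depth frame is treated as a flat array of millimeter distances;
// scanning for the minimum gives the pixel nearest the sensor, which
// is usually the user's hand or extended arm.
// (640x480 resolution and array layout are assumptions.)
public class ClosestPoint {
    static final int W = 640, H = 480;

    // Returns {x, y, depth} of the nearest valid pixel (depth 0 = no reading).
    static int[] findClosest(int[] depth) {
        int bestX = -1, bestY = -1, best = Integer.MAX_VALUE;
        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++) {
                int d = depth[y * W + x];
                if (d > 0 && d < best) {
                    best = d;
                    bestX = x;
                    bestY = y;
                }
            }
        }
        return new int[] { bestX, bestY, best };
    }

    public static void main(String[] args) {
        int[] depth = new int[W * H];
        java.util.Arrays.fill(depth, 2000);   // background roughly 2 m away
        depth[100 * W + 320] = 850;           // a hand at about 85 cm
        int[] p = findClosest(depth);
        System.out.println(p[0] + "," + p[1] + "," + p[2]); // prints "320,100,850"
    }
}
```

In the real sketch the winning (x, y) would be where the red dot is drawn each frame.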

An Arduino microcontroller was used with an Adafruit Motor Shield to control two servos that physically turn the potentiometers your thumbs would normally move on the helicopter's controller. The original handheld interface is thus replaced with motors that move in response to your gestures.
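The step in between is a linear mapping from the tracked hand position to the two servo angles. A minimal sketch of that mapping follows; the 640x480 frame, the 0&#8211;180 degree servo range, which axis drives which stick, and all names here are assumptions rather than the project's actual calibration.

```java
// Sketch of mapping the tracked hand position to the two servo angles
// that turn the transmitter's stick potentiometers.
// Frame size, servo range, and axis assignments are assumptions.
public class GestureToServo {
    // Linearly map v from [inMin, inMax] to [outMin, outMax].
    static int mapRange(int v, int inMin, int inMax, int outMin, int outMax) {
        return outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);
    }

    // Hand x drives one servo (say, yaw), hand y the other (say, throttle).
    static int[] handToAngles(int x, int y) {
        int yaw = mapRange(x, 0, 640, 0, 180);
        int throttle = mapRange(y, 0, 480, 180, 0); // raising the hand opens throttle
        return new int[] { yaw, throttle };
    }

    public static void main(String[] args) {
        int[] a = handToAngles(320, 240);
        System.out.println(a[0] + "," + a[1]); // center of frame -> mid-stick: "90,90"
    }
}
```

On the Arduino side, the two computed angles would simply be written to the servos each time a new coordinate pair arrives over serial.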

Both Arduino and Processing were used. The Processing library for the Kinect is by Daniel Shiffman and can be found at

Find the project's source code here.



CodeLab CMU


This channel features videos of experiments, investigations and projects produced by students and alumni of the Computational Design Lab at Carnegie Mellon University.
