The video showcases human-robot cooperative assembly with a Baxter robot. The robot performs three activities: picking a part from the feeder, giving it to the operator, and holding a part in place while the operator screws it in.
A Kinect 2 sensor is used for gesture recognition, speech recognition, and text-to-speech, coupled with a C# server that streams Kinect 2 features over the network to Linux workstations (Python and/or ROS). The code is available online and was developed for the 3rd hand project:
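On the Linux side, a client might consume the streamed features roughly as sketched below. This is a hypothetical minimal example, assuming the server emits newline-delimited JSON frames over TCP; the message format, host, and port are illustrative, not taken from the actual project. A fake in-process server stands in for the C# side so the sketch is self-contained.

```python
# Hypothetical sketch: receive Kinect 2 features streamed as JSON lines over TCP.
# The frame schema, host, and port are assumptions for illustration only.
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 9009  # assumed address of the streaming server


def fake_server(listen_sock):
    """Stand-in for the C# server: sends one JSON feature frame, then closes."""
    conn, _ = listen_sock.accept()
    frame = {"hand_left": [0.1, 0.2, 0.3], "gesture": "point"}
    conn.sendall((json.dumps(frame) + "\n").encode("utf-8"))
    conn.close()


def receive_frame(host, port):
    """Connect to the feature stream and parse one newline-delimited JSON frame."""
    with socket.create_connection((host, port)) as s:
        line = s.makefile("r", encoding="utf-8").readline()
    return json.loads(line)


# Wire the fake server to a local socket and read one frame from it.
server_sock = socket.socket()
server_sock.bind((HOST, PORT))
server_sock.listen(1)
threading.Thread(target=fake_server, args=(server_sock,), daemon=True).start()

frame = receive_frame(HOST, PORT)
print(frame["gesture"])
```

In a real deployment the client would loop over frames and feed them to the gesture and speech handlers instead of reading a single message.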
The Kinect 2 skeleton estimation and recognized gestures (red, blue, and green circles at the hands) can be seen in the background on the TV.