This video is based on my earlier work on skeleton tracking and motion transfer to a virtual robot. I augmented the codebase to actually control the NAO robot in Choregraphe. I really don't know how all this (especially the lean forward) will look and work on a 'real' NAO; I may give it a try during the next week. For the curious: I wrote the extraction of the skeleton angles in C++ (OpenNI), modeled the Blender parts in 2.63a, implemented the middleware support in Python 3.2.1 (LCM), and controlled the NAO via NAOqi in Java. All of this runs on a recent MacBook Pro with Ubuntu 12.04 (64-bit). It's quite a zoo over here ;)
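
To give a rough idea of how the LCM middleware glues the pieces together, here is a minimal Python sketch of publishing and receiving joint angles over an LCM channel. It is not the code used in the video: the channel name, the payload layout (packed floats instead of a generated LCM type) and the joint list are assumptions for illustration only; the real C++/OpenNI side would publish, and the NAOqi client would consume.

import struct
import lcm

CHANNEL = "NAO_JOINT_ANGLES"           # hypothetical channel name
JOINTS = ["HeadYaw", "HeadPitch",      # small subset of NAO joints, for illustration
          "LShoulderPitch", "RShoulderPitch"]

def publish_angles(lc, angles):
    """Pack joint angles (radians) as little-endian floats and publish them."""
    assert len(angles) == len(JOINTS)
    payload = struct.pack("<%df" % len(angles), *angles)
    lc.publish(CHANNEL, payload)

def on_angles(channel, data):
    """Subscriber callback: unpack the floats; the NAOqi client would act on them."""
    angles = struct.unpack("<%df" % len(JOINTS), data)
    print(dict(zip(JOINTS, angles)))

if __name__ == "__main__":
    lc = lcm.LCM()
    lc.subscribe(CHANNEL, on_angles)
    publish_angles(lc, [0.0, -0.1, 1.2, 1.2])
    lc.handle()                        # process one incoming message (multicast loopback)

In the actual setup the publisher and subscriber live in different processes (and languages), which is exactly what LCM's generated message types are for; raw packed floats are used here only to keep the sketch self-contained.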

You may check my blog for details: warp1337.com
