This is a project done for Digital Ecologies 2013 during my MSc in Adaptive Architecture and Computation at University College London.
This term I tried to animate an old Delta robot. The aim was to make the robot behave in a way that seems "alive", but most importantly to simulate, to a certain degree, how people pay attention to what seems most important. This involved using computer vision to pick out salient features of the real world, such as movement or people, and mapping them onto the robot's behaviour. Besides developing a series of behaviours driven by computer vision, effort went into how the robot could better adapt to the world by gathering data and rethinking its behaviour.
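The original code for extracting salient motion is not shown here, but the general idea can be sketched with simple frame differencing: compare two consecutive camera frames, mark the pixels that changed, and take their centroid as a point of attention for the robot to orient towards. The function name, threshold value, and NumPy-only setup below are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def motion_centroid(prev_frame, curr_frame, threshold=30):
    """Return the (x, y) centroid of pixels that changed between two
    grayscale frames, or None if nothing moved enough.

    Frames are 2D uint8 NumPy arrays; `threshold` (an assumed value)
    sets how large a brightness change counts as motion."""
    # Cast to a signed type so the subtraction does not wrap around.
    diff = np.abs(prev_frame.astype(np.int16) - curr_frame.astype(np.int16))
    moving = diff > threshold
    if not moving.any():
        return None  # no motion detected in this frame pair
    ys, xs = np.nonzero(moving)
    return float(xs.mean()), float(ys.mean())

# Example: a bright blob appears in an otherwise static frame.
prev = np.zeros((240, 320), dtype=np.uint8)
curr = prev.copy()
curr[100:120, 200:220] = 255
print(motion_centroid(prev, curr))  # → (209.5, 109.5)
```

A centroid like this can then be mapped into the robot's workspace so that its end effector "looks at" whatever is moving, which is one simple way to give the impression of attention.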