This is the first experiment by Dr. Woohoo in a series that explores the relationship between a robot and an artist, with the objective of enhancing what is creatively possible by combining the strengths of each, using watercolors as the natural medium.
The following is a Q&A with Dan Nosowitz:
"Can you tell me a little more about it?" – Dan
"This is the first in a series of experiments using different art-related media that explore the relationship between the artist, myself in this case, and a robot. I'm searching for the sweet spot between robots and artists, with the objective of enhancing what is creatively possible by tapping into the strengths of each. In the Turbulence project, the first steps towards developing a more enhanced system focus on one-way communication, from the code to the robot, and control.
I created the original design using a generative design application that I developed, with some early help from Jay Moretti, which runs in the browser and is based on Canvas and PaperJS (http://paperjs.org/). In Chrome, I began with a single hexagon and applied a 2D plane-symmetry (http://en.wikipedia.org/wiki/Plane_symmetry) layout pattern to it, which creates copies of the hexagon and distributes them in a prescribed fashion. For example, for this image, I used the P1 pattern, which translates the hexagon in two directions.
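The P1 step can be sketched in plain JavaScript: start with one hexagon's points and translate copies along two direction vectors. The function names, radius, and translation vectors below are illustrative assumptions, not the project's actual code.

```javascript
// Vertices of a regular (flat-top) hexagon of radius r centered at (cx, cy).
function hexagon(cx, cy, r) {
  const pts = [];
  for (let i = 0; i < 6; i++) {
    const a = (Math.PI / 3) * i;
    pts.push({ x: cx + r * Math.cos(a), y: cy + r * Math.sin(a) });
  }
  return pts;
}

// P1 symmetry uses only translations: copy the base shape along two
// independent vectors u and v, cols x rows times.
function p1Layout(base, u, v, cols, rows) {
  const copies = [];
  for (let i = 0; i < cols; i++) {
    for (let j = 0; j < rows; j++) {
      copies.push(base.map(p => ({
        x: p.x + i * u.x + j * v.x,
        y: p.y + i * u.y + j * v.y,
      })));
    }
  }
  return copies;
}

const base = hexagon(0, 0, 10);
// Vectors chosen so flat-top hexagons of radius 10 sit roughly edge-to-edge.
const grid = p1Layout(base, { x: 15, y: 8.66 }, { x: 0, y: 17.32 }, 4, 3);
console.log(grid.length); // 12 hexagons, each a list of 6 points
```

In PaperJS the same idea maps naturally onto cloning a `Path` and offsetting each clone's position.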
With the basic layout of the hexagons in place, I then generate a Perlin Noise image as a hidden layer underneath the hexagons in order to determine the colors. For each hexagon, I sample the greyscale value of a single pixel from the Perlin Noise image at the center point of the hexagon. If the value is above 50% (a brighter pixel), the hexagon is red; if it's below 50% (a darker pixel), it's blue.
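The color rule reduces to sampling a value at each hexagon's center and thresholding it at 50%. In this sketch a simple hash-based pseudo-noise stands in for the hidden Perlin Noise image (the real project samples an actual pixel), so the helper names are assumptions:

```javascript
// Deterministic pseudo-noise in [0, 1) -- a stand-in, not true Perlin noise.
function noise2d(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return s - Math.floor(s);
}

// Center point of a polygon, used as the single sample location.
function centroid(points) {
  const sum = points.reduce(
    (a, p) => ({ x: a.x + p.x, y: a.y + p.y }),
    { x: 0, y: 0 }
  );
  return { x: sum.x / points.length, y: sum.y / points.length };
}

// Brighter than 50% -> red; darker -> blue.
function colorFor(hex) {
  const c = centroid(hex);
  return noise2d(c.x * 0.05, c.y * 0.05) > 0.5 ? 'red' : 'blue';
}

const hex = [
  { x: 0, y: 0 }, { x: 10, y: 0 }, { x: 15, y: 8 },
  { x: 10, y: 16 }, { x: 0, y: 16 }, { x: -5, y: 8 },
];
console.log(colorFor(hex)); // 'red' or 'blue', depending on the noise value
```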
When I'm content with the final look and feel, I export all of the points for all of the hexagons as a CSV file. This CSV file is then imported into a Processing (http://processing.org/) robot controller application – co-developed by Stephane Bersot – that converts it into a script the robot will understand and sends that script to the robot. Upon receiving it, the robot follows the instructions until it's finished painting." – Woohoo
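The export/convert pipeline can be sketched as two small steps: flatten each hexagon's points into CSV rows, then turn each row into a motion command. The CSV columns and the URScript-style `movel()` lines below are assumptions for illustration; the actual conversion is done by the Processing controller described above.

```javascript
// Flatten hexagons into CSV rows: one row per point, tagged with shape index.
function toCsv(hexagons) {
  const rows = ['shape,point,x,y'];
  hexagons.forEach((hex, i) =>
    hex.forEach((p, j) => rows.push(`${i},${j},${p.x},${p.y}`))
  );
  return rows.join('\n');
}

// Convert CSV rows into linear-move commands (canvas mm -> robot meters).
// zPaint is the assumed brush-on-canvas height; the fixed rotation vector
// keeps the tool perpendicular to the canvas.
function toRobotScript(csv, zPaint) {
  return csv.split('\n').slice(1).map(line => {
    const [, , x, y] = line.split(',').map(Number);
    const xm = (x / 1000).toFixed(4);
    const ym = (y / 1000).toFixed(4);
    return `movel(p[${xm}, ${ym}, ${zPaint}, 0, 3.14, 0], a=0.5, v=0.1)`;
  });
}

const csv = toCsv([[{ x: 0, y: 0 }, { x: 10, y: 0 }]]);
const script = toRobotScript(csv, 0.05);
console.log(script.length); // 2 move commands, one per point
```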
What's the robot being used?
The robot is Universal Robots' UR10 and the gripper is a Robotiq 2-Finger 85.
Why does it draw in that way?
In this video, I'm using 3–4 degrees of freedom (DoF) out of a potential 6, so the gripper and brush are always perpendicular to the canvas. Simply put, controlling 3–4 DoF is easy to accomplish; controlling all 6 is much more challenging, but not impossible. Another way to look at it: the robot has 3 wrists in addition to an elbow, shoulder and torso that need to be controlled in perfect synchronization! With 6 DoF, the major advantage will be brushstrokes that are more gesture-based, allowing for a closer simulation of a brushstroke. The next experiment will include all 6 DoF.
What was the inspiration for the project?
For the majority of my career, I have focused on developing systems, software applications, plug-ins and scripts that extend what is creatively possible, and most of the time they involved emulating natural media. Adding a robot into the equation allows me to return to the past with a modern approach, which to me is the best of both worlds.
From an art perspective, Kandinsky and Josef Albers are as important as Edward Ihnatowicz's The Senster (http://www.senster.com/ihnatowicz/) and Jean Tinguely's drawing machines (https://www.youtube.com/watch?v=FZpEYLa9PGs).
From a literary perspective, Norbert Wiener's (http://en.wikipedia.org/wiki/Norbert_Wiener) work on Cybernetics (http://en.wikipedia.org/wiki/Cybernetics), and specifically the feedback loop, is critical to my work. It's not evident in this piece because the feedback from the robot is not being tapped into, but it's there and will be included in future experiments.
The inspiration for the hexagon comes from here: http://instagram.com/p/Z0mAYimluN/
The inspiration for the plane symmetry comes from the San Ildefonso Pueblo's ceramic designs, which I found throughout New Mexico and in Dorothy K. Washburn and Donald W. Crowe's exceptional book, Symmetries of Culture: Theory and Practice of Plane Pattern Analysis (http://www.amazon.com/Symmetries-Culture-Practice-Pattern-Analysis).
From a software perspective, I was also inspired to develop the generative browser application because I wasn't able to find the functionality I wanted in applications like Adobe Illustrator or Inkscape, and because I was curious to see how far I could move away from desktop applications into the cloud and discover the current limits of cloud-based applications. I have been a fan of PaperJS and the developers behind it – Jürg Lehni & Jonathan Puckey – so it was the perfect tool for me to use for this experiment.
Music: Kid Koala
Video was selected as part of the Rob | Arch 2014 conference vimeo.com/robotsinarchitecture