Mimetics is a gesturally responsive robotic installation and one of the six exhibits at Move Me, The Mill’s one-day art and technology event hosted in New York.
Spearheaded by Creative Engineer Noel Drew, Mimetics employs four robots equipped with marker pens that draw on a large sheet of paper. Within the confines of the paper’s frame, they can whiz around and create art unassisted, but they can also be gesturally controlled by the crowd using Leap Motion sensors.
The final result: a crowd-sourced collaborative mural and a historical record of the interaction.
How were they built?
The robots are small, twin-tracked units (W x L x H: 10cm x 11cm x 5cm, not including the pen clip). They are based on a type of robot called a Zumo bot but, apart from the chassis, are a custom build. They were individually named after contemporary American artists. This was decided during testing and development, when people commented on how the early drawings resembled a Jackson Pollock.
The circuits were all built and programmed in-house at The Mill by Noel, and the pen clips were modeled and printed on a 3D printer.
At the heart of each robot is a NodeMCU development board built around the ESP8266 Wifi chip. This allows the robots to communicate over a Wifi network, receiving commands that control the speeds of the left and right tracks or that tell the robot to return to autonomous mode. Ultrasonic sensors give the robots “eyes”, allowing them to sense objects in front of them and change course accordingly.
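To make the command flow concrete, here is a minimal sketch of how a robot might parse an incoming network message into either track speeds or a switch back to autonomous mode. The actual firmware is not published, so the message format (`DRIVE <left> <right>` / `AUTO`) and the speed range are assumptions for illustration only.

```python
def parse_command(msg):
    """Parse a hypothetical text command into an action tuple.

    Assumed formats (not the installation's actual protocol):
      "DRIVE <left> <right>" -> set left/right track speeds, clamped to [-255, 255]
      "AUTO"                 -> return the robot to autonomous mode
    """
    parts = msg.strip().split()
    if parts and parts[0] == "AUTO":
        return ("auto", None)
    if len(parts) == 3 and parts[0] == "DRIVE":
        clamp = lambda v: max(-255, min(255, int(v)))
        return ("drive", (clamp(parts[1]), clamp(parts[2])))
    raise ValueError("unknown command: " + msg)
```

In a scheme like this, the robot's main loop would keep running its autonomous behavior until a `drive` action arrives, then apply the received speeds until an `AUTO` command hands control back.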
How do they behave?
When doing their own thing, the bots are programmed with different “personality” traits. Pollock is the slowest of the four and pauses regularly between short stints of slow activity. Newman is quite the opposite: he is the fastest of the four and has periods of extreme hyperactivity intercut with long rests. He is also a little short-sighted, and is the most likely to actually collide with something before realizing it is there.
Klein and Warhol are very similar; they are both programmed with identical algorithms that define their movement. On wake, they generate random values for some basic variables, and these values shape their individual characteristics. For instance, one might lean towards drawing long lines at a moderate speed, whereas the other might prefer frantically drawing circles.
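The wake-time randomization could look something like the sketch below: one shared function, with each bot's character emerging from the values it happens to draw. The variable names and ranges here are invented for illustration; the installation's actual parameters are not published.

```python
import random

def wake_personality(seed=None):
    """On wake, draw random values for a few motion variables.

    Two bots running this identical code (like Klein and Warhol) end up
    with different characters purely from their random draws.
    All names and ranges are hypothetical.
    """
    rng = random.Random(seed)
    return {
        "speed": rng.uniform(0.2, 1.0),           # fraction of max track speed
        "segment_length": rng.uniform(0.5, 4.0),  # seconds per drawing move
        "turn_bias": rng.uniform(-1.0, 1.0),      # -1 favors left, +1 favors right
        "rest_chance": rng.uniform(0.0, 0.4),     # probability of pausing between moves
    }
```

A bot with a low `speed` and high `rest_chance` would drift toward long, deliberate strokes, while one with high `speed` and a strong `turn_bias` would tend toward frantic circles.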
How does the exhibit work?
The human interaction comes from two Leap Motion hand-tracking devices. An application developed especially for the installation reads the data from the Leap Motion devices, translates the orientation of the user’s palm into forward, backward, left and right commands, calculates left and right track speeds, and communicates with the robots over Wifi. The moment a guest’s hand is detected over the sensor, one of the robots ceases to be autonomous and listens to the commands sent by the guest. The moment the hand is removed, it instantly returns to autonomous mode.
What’s next for Pollock, Newman, Klein and Warhol?
The next goal for the project, which is currently in development, is to incorporate machine learning into the application so that it can learn from variations in human gestures and make interaction with the piece more intuitive over time.
Read more about the exhibition at Move Me here or follow the hashtag #MillMoveMe.