openFrameworks, Robotics

Design I/O’s Mimic – Putting emotional machines within arm’s reach

Robots have been a staple on assembly lines for decades and we’ve recently welcomed them into our homes to help out with menial tasks, but what happens when we bring machines designed for light manufacturing into social spaces? Mimic, the most recent project by the interactive installation studio Design I/O, convincingly sketches how we might socialize with robots – and how they can be imbued with personality so they can return the favour. One of the centerpieces of this year’s edition of TIFF Kids International Film Festival’s digiPlaySpace interactive playground, the studio plunked a Universal Robots UR5 robot arm on a plinth in the middle of an exhibition hall full of fired-up kids. Using a sophisticated computer vision rig (composed of three overhead Kinects and one Kinect V2 sitting in front of the installation) and working around the arm’s limited built-in functionality, Design I/O have created a playful and inquisitive being with a range of dynamic moods and actions that are triggered in response to whatever motions or gestures it detects.

“You can get a bit of a wallop from it. You could definitely bruise yourself if you’re in the wrong place,” says Theo Watson of the responsibility implicit in taking control of the UR5. Beyond the necessity of putting a bit of space between the arm and curious children, a range of safeguards were needed to protect the arm from itself. “At the very end of development I found a way to get it working with a new interface which basically allows you to say ‘I want you to be in this position and just get there.’” This sounds simple, but the UR5 is finicky about how it takes instructions; Watson says you have to treat it “kindly” – or else. “If you say ‘move there within a sixtieth of a frame’ it will just shut down.” Likewise, it is fully capable of whacking itself due to its expansive range of motion. Despite these difficulties, the UR5 was definitely the right robot for the job. “It’s exactly the form factor we wanted, something roughly the scale of a human arm.” The model’s reach is just under a metre, which makes it human-scale, especially compared to the daunting size of larger models.
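For readers curious what “be in this position and just get there” looks like in practice: UR controllers accept URScript commands such as `servoj()` over a real-time TCP port, which is the general style of interface the ofxURDriver library wraps. The sketch below is only illustrative – `makeServoCommand` is a hypothetical helper, and the parameter values are placeholders, not Design I/O’s actual tuning.

```cpp
#include <array>
#include <sstream>
#include <string>

// Hypothetical helper: format a URScript servoj() line that tells the
// arm "be at these joint angles and get there within timeSec seconds".
// servoj(q, a, v, t, lookahead_time, gain) is part of URScript; the UR
// controller accepts such lines over its real-time TCP interface.
std::string makeServoCommand(const std::array<double, 6>& jointsRad,
                             double timeSec) {
    std::ostringstream cmd;
    cmd << "servoj([";
    for (size_t i = 0; i < jointsRad.size(); ++i) {
        if (i > 0) cmd << ", ";
        cmd << jointsRad[i];
    }
    // a and v are unused by servoj but required by its signature;
    // lookahead_time and gain (placeholder values here) smooth the
    // motion so abrupt targets don't trip the controller's safety stop.
    cmd << "], 0, 0, " << timeSec << ", 0.1, 300)\n";
    return cmd.str();
}
```

Sending a fresh target like this every control cycle, with a sane `timeSec`, is exactly the kind of “kind” treatment Watson describes – demand the move too fast and the controller simply faults out.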

While making the UR5 nimble was laborious, Design I/O were thrilled to breathe life into the arm and craft its personality. Like the whimsical animals of Connected Worlds (each with distinct physiology, a preferred biome, and innate tendencies) the robot arm needed a prerogative and disposition to drive how it responded to stimuli. “The feelings change based on how you act. One example would be timidness: if you walk aggressively towards it, it will feel threatened and retract a little bit,” Design I/O’s Nick Hardeman explains of the ‘emotional inventory’ they built for the robot. “There’s also trust – she’s been there for a while,” he says, pointing at a young girl planted in front of the arm, staring at it intently. “It might reach out to her because it feels comfortable with her.” These emotions (along with interest, playfulness, curiosity) are calculated by who is in front of the arm, how long they have been there, and what they are doing. A slow and gentle presence may lead to the UR5 coming out of its shell, while more erratic gestures might send it reeling – or be reciprocated with its own exaggerated motions.
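An ‘emotional inventory’ like the one Hardeman describes can be pictured as a handful of continuously updated levels, each nudged by what the tracked visitor is doing. The sketch below is a minimal illustration of that idea; the struct, field names, and update rules are all assumptions for the sake of example, not Design I/O’s actual model.

```cpp
#include <algorithm>

// Hypothetical sketch of an "emotional inventory": each feeling is a
// 0..1 level updated from observations of the visitor in front of the
// arm. Constants are placeholders, tuned only to make the idea legible.
struct EmotionalState {
    float timidness = 0.0f;  // rises when approached aggressively
    float trust     = 0.0f;  // grows the longer a visitor lingers calmly
    float interest  = 0.5f;  // tracks how novel the visitor's motion is

    void update(float approachSpeed, float secondsPresent, float motionNovelty) {
        // A fast approach feels threatening; timidness otherwise decays.
        timidness = clamp01(timidness + 0.1f * approachSpeed - 0.02f);
        // Trust accrues only while the visitor's presence stays gentle.
        if (approachSpeed < 0.2f)
            trust = clamp01(trust + 0.001f * secondsPresent);
        // Interest is a moving average of how surprising the motion is.
        interest = clamp01(0.9f * interest + 0.1f * motionNovelty);
    }

    static float clamp01(float v) { return std::min(1.0f, std::max(0.0f, v)); }
};
```

The arm’s animation layer would then read these levels each frame – high timidness pulls the pose inward, high trust lets it reach toward the visitor.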

Underneath all that sophisticated character design there is a lot of computer vision heavy lifting going on. The three (chained) Kinects that surveil the installation from above evaluate proximity to the arm, and the V2 that is front and centre observes individuals in the sweet spot; it analyzes the gesture, speed, heading, etc. of each person and uses that data to inform how the arm responds. It also needs to be able to decipher the inevitable crowds that form so the UR5 can figure out where to direct its gaze (hint: like most humans, new and/or dramatic movement will capture its attention). Then, of course, there are the safety protocols that need to be in place in case anyone breaches the perimeter fence.
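That gaze-direction “hint” – favour whoever is newest and/or moving most dramatically – can be sketched as a simple scoring pass over the tracked people. Everything here is an assumption for illustration: `TrackedPerson`, `pickGazeTarget`, and the weights are invented, standing in for whatever Mimic actually computes from its Kinect data.

```cpp
#include <cmath>
#include <vector>

// Hypothetical person record, as it might come out of the Kinect
// tracking pipeline (IDs and fields are illustrative).
struct TrackedPerson {
    int   id;
    float speed;           // current motion magnitude
    float secondsTracked;  // how long this person has been visible
};

// Pick who the arm should look at: novelty (recently appeared people)
// decays exponentially with tracked time, and dramatic motion adds to
// the score. The 0.6/0.4 weighting is a placeholder.
int pickGazeTarget(const std::vector<TrackedPerson>& people) {
    int bestId = -1;
    float bestScore = -1.0f;
    for (const auto& p : people) {
        float novelty = std::exp(-p.secondsTracked / 5.0f);
        float score = 0.6f * novelty + 0.4f * p.speed;
        if (score > bestScore) { bestScore = score; bestId = p.id; }
    }
    return bestId;  // -1 if nobody is in view
}
```

A newcomer waving their arms would outscore a long-familiar bystander, which matches the behaviour the article describes.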

Unsurprisingly, Mimic required the mobilization of several networks. The project grew out of an idea that had been back-burnered during previous collaborations between Design I/O and (former) digiPlaySpace curator Nick Pagee. It also received support from the Frank-Ratchye STUDIO for Creative Inquiry at Carnegie Mellon, which hosted Watson and Hardeman for a related residency. Watson notes that many of the problems Design I/O encountered echo those lab director Golan Levin faced in his (classic) Double Taker (Snout) project in 2008, the likes of which are now being researched in his lab by Madeline Gannon and Dan Moore (Mimic draws on their ofRobotArm and ofxURDriver openFrameworks libraries).

At a lecture hosted by Ryerson University’s New Media Program three weeks ago, Watson shed some light on where this new foray into robotics sits in relation to his studio’s other works. While there is an impulse to describe it as a move away from augmented environments into physical media, Mimic is very much a return to their roots: the age-old desire to make inanimate form expressive drove their breakthrough project Puppet Parade (presented at Cinekid in 2011). “That’s the strange thing,” Watson says of his initial queries to the UR5’s manufacturer, looking for resources to aid in Design I/O’s ambitions to make the arm truly responsive rather than automating it to run scripted routines. “When we asked for help about the real-time stuff they just said ‘oh, we never do that’ – and these are its makers!”

Project Page | Design I/O
Mimic is installed at TIFF’s digiPlaySpace through April 23rd