openFrameworks, Robotics

Manus – Exploring pack behaviours in autonomous robots

Created by Madeline Gannon for the 2018 Annual Meeting of the New Champions at the World Economic Forum in Tianjin, China, Manus is a set of ten industrial robots that are programmed to behave like a pack of animals. While each robot moves independently, they share the same central brain. So instead of acting in isolation, they have intertwined behaviors that ripple through the group as people walk by.

Manus uses 12 depth cameras to detect and track the body language of approaching visitors. The robots then use this depth data to decide who to move towards, and whether to look at their hands or face. Like many of today’s intelligent autonomous machines, the robots in Manus neither look nor act like us, but the team hopes to show that they can still connect with us in meaningful ways.

Manus features ten ABB IRB1200-5/0.9 industrial robot arms. These machines are more commonly found in factories, handling food or painting car chassis. For Manus, however, the team developed custom vision and communication software for embedding autonomous behaviours into these machines. Their vision system uses 12 depth sensors mounted in the base of the installation to give the robots a worm’s-eye view of the world. This gives the team a 1.5 meter tracking region all the way around the 9 meter base. The tracking system extracts the 3D positions of a person’s head and hands and passes them on to the robot control software.
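To make the geometry concrete, here is a minimal sketch of what the tracking hand-off might look like. The struct fields and the region test are assumptions for illustration (the post does not publish the actual data format), assuming a circular base and treating the 1.5 m tracking region as an annulus around it:

```cpp
#include <cmath>

// Hypothetical sketch of the per-person data the vision system could
// pass to the robot controller; field names are assumptions, not the
// project's actual API.
struct TrackedPerson {
    float headX, headY, headZ;   // 3D head position, meters
    float handX, handY, handZ;   // 3D hand position, meters
};

// The installation tracks a 1.5 m band around the 9 m base. Assuming
// a circular base of radius 4.5 m centered at the origin, a ground
// point is trackable if it falls inside the surrounding ring.
bool inTrackingRegion(float x, float y,
                      float baseRadius = 4.5f,
                      float bandWidth  = 1.5f) {
    float r = std::sqrt(x * x + y * y);
    return r >= baseRadius && r <= baseRadius + bandWidth;
}
```

A visitor standing 5 m from the center would test as trackable; someone 7 m out, or on top of the base, would not.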

Their custom control software uses tracking information from the vision system as input for interaction. Each robot decides whether and how to engage with tracked people, based on their spatial relationship to one another (the person’s body language). If a person of interest is too far away, a robot may decide it is not interested enough to look at them. Conversely, if a person is very close, a robot may shift its gaze to the person’s hands or head.
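The distance-based engagement described above could be sketched as a simple gaze policy. The thresholds and names below are invented for illustration; the actual behaviour in Manus is richer than two cut-offs:

```cpp
// Hedged sketch of a distance-based gaze decision, not the project's
// actual control logic. Thresholds are assumed values in meters.
enum class Gaze { Ignore, Face, Hands };

Gaze chooseGaze(float distanceToPerson) {
    const float maxInterest = 3.0f;  // assumed: beyond this, not interested
    const float closeRange  = 1.0f;  // assumed: within this, watch the hands
    if (distanceToPerson > maxInterest) return Gaze::Ignore;
    if (distanceToPerson < closeRange)  return Gaze::Hands;
    return Gaze::Face;               // mid-range: look at the face
}
```

Running each tracked person through a policy like this every frame is what lets independent arms produce coordinated, pack-like attention from one shared brain.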

A single PC runs the vision and robot control software, and communicates with the physical industrial robots over Ethernet using ABB’s Externally Guided Motion (EGM) protocol. The team built a UDP server to manage message passing between the real and virtual robots, and to better coordinate the desired versus actual poses of the ten robots.
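As a rough illustration of the real/virtual coordination, a UDP server like the one described needs a compact pose message per arm. The layout below is an assumption for the sketch (EGM itself exchanges Protocol Buffer messages over UDP, which this deliberately simplifies):

```cpp
#include <array>
#include <cstdint>
#include <cstring>

// Hypothetical pose datagram for one of the ten arms; the field
// layout is invented for this sketch, not the project's wire format.
struct PoseMsg {
    uint32_t robotId;    // which of the ten arms
    float    joints[6];  // six joint angles, degrees
};

// Pack a pose into a raw byte buffer sized for one UDP datagram.
std::array<uint8_t, sizeof(PoseMsg)> packPose(const PoseMsg& msg) {
    std::array<uint8_t, sizeof(PoseMsg)> buf{};
    std::memcpy(buf.data(), &msg, sizeof(msg));
    return buf;
}

// Unpack a received datagram back into a pose.
PoseMsg unpackPose(const std::array<uint8_t, sizeof(PoseMsg)>& buf) {
    PoseMsg msg;
    std::memcpy(&msg, buf.data(), sizeof(msg));
    return msg;
}
```

Comparing the desired pose sent to an arm against the actual pose echoed back is how a server like this can keep ten simulated robots and ten physical ones in step.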

Manus was made using openFrameworks, with help from the following contributor addons: ofxCv, ofxEasing, ofxGizmo, and ofxOneEuroFilter.

Project Page | Madeline Gannon

Credits: Madeline Gannon, Julián Sandoval, Kevyn McPhail, Ben Snell // Sponsored by NVIDIA and ABB Ltd.
