
Quipt – Teaching industrial robots spatial behaviours for human interaction


Quipt is gesture-based control software that facilitates new ways to communicate with industrial robots. It was created by Madeline Gannon, a researcher, designer and educator at Carnegie Mellon University’s School of Architecture and a PhD candidate in Computational Design. Using wearable markers and a motion capture system, Quipt gives industrial robots basic spatial behaviours for creating safe and intuitive interaction with people. Wearable markers on the hand, around the neck, or elsewhere on the body let a robot see and respond to users in a shared space, so that user and robot can safely follow, mirror, and avoid one another as they collaborate.
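The three behaviours can be pictured as simple geometric rules on the tracked marker position. The sketch below is purely illustrative, with hypothetical function names and thresholds; it is not code from Quipt or Robo.Op:

```python
# Illustrative follow / mirror / avoid behaviours driven by a tracked marker.
# Positions are 3D coordinates; thresholds are made-up example values.

def follow(robot_pos, marker_pos, standoff=0.5):
    """Move toward the marker but keep a fixed standoff distance."""
    dx = [m - r for m, r in zip(marker_pos, robot_pos)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist <= standoff:
        return list(robot_pos)  # close enough: hold position
    scale = (dist - standoff) / dist
    return [r + d * scale for r, d in zip(robot_pos, dx)]

def mirror(marker_pos, plane_x=0.0):
    """Reflect the marker position across a vertical plane at x = plane_x."""
    x, y, z = marker_pos
    return [2 * plane_x - x, y, z]

def avoid(robot_pos, marker_pos, clearance=0.8):
    """Back away along the marker-to-robot direction if the person is too close."""
    dx = [r - m for r, m in zip(robot_pos, marker_pos)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist == 0 or dist >= clearance:
        return list(robot_pos)  # already clear (or degenerate overlap)
    scale = clearance / dist
    return [m + d * scale for m, d in zip(marker_pos, dx)]
```

Each behaviour maps a marker reading to a new robot target, which is what lets the interaction feel continuous rather than scripted.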

Industrial robots are truly incredible CNC machines – not just for their speed, power, and precision, but because they are also highly adaptable. Unlike other CNC machines, when you put a tool on the end of the robot, you completely transform what it can do: put a sprayer on it, and it becomes a painting robot; put a gripper on it, and it becomes a material handling robot; put a welder on it, and it becomes a spot welding robot.

Quipt augments an ABB IRB 6700 industrial robot by giving it eyes into its environment. Using a Vicon motion capture system, Quipt receives motion capture data and reformats it into corresponding movement commands for the 6700. Movement commands are generated using our open-source library, Robo.Op (see it on GitHub). Quipt also visualizes debugging data in an Android app, so a human collaborator has a mobile, continuous view of what the robot is seeing.
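In outline, that pipeline is: read a marker frame from the mocap system, reformat it into a robot move target, and serialize a debug snapshot for the mobile view. The sketch below only illustrates that data flow; the function names, frame format, and offset are assumptions, not the Robo.Op API:

```python
# Hypothetical sketch of the mocap -> move command -> debug view data flow.
import json

def mocap_to_target(frame, offset_mm=(0.0, 0.0, 300.0)):
    """Turn a tracked marker position (mm) into a robot target offset above it."""
    x, y, z = frame["position"]
    ox, oy, oz = offset_mm
    return {"x": x + ox, "y": y + oy, "z": z + oz}

def debug_snapshot(frame, target):
    """Serialize what the robot 'sees' for a mobile debug view."""
    return json.dumps({
        "marker": frame["position"],
        "target": [target["x"], target["y"], target["z"]],
    })

# Example: one incoming marker frame becomes one move target plus a debug message.
frame = {"position": [100.0, -50.0, 0.0]}
target = mocap_to_target(frame)
message = debug_snapshot(frame, target)
```

Keeping the reformatting step separate from command generation is what lets the same tracking data drive different behaviours on the robot side.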

Learn more about Quipt in the Instructable here.

Project collaborators include Julián Sandoval, Zack Jacobson-Weaver, Deren Guler, Eric Brockmeyer, Jakob Marsico and Mauricio Contreras.

Project Page | MADLAB | Pier 9’s Artist in Residence program

Previously on CAN: Chronomorphologic Modeling by Madeline Gannon

/via Golan