But suddenly those eyes dart towards something in the room. Moments later, a limb shoots out towards the object with a whirr of joints and motors.
Robots have been part of the production line for decades, but until recently they’ve been kept apart from humans for safety reasons – either inside physical cages or behind invisible fences that act as a kill-switch when crossed.
There have still been accidents. In the US, 33 people have been killed by robots since 1979. In June 2015 in Germany, a 22-year-old worker at a Volkswagen plant north of Frankfurt was killed when a robot he was assembling moved with no warning and crushed him against a metal plate.
Baxter’s cartoon eyes might sound like a gimmick, but that’s exactly the kind of tragic incident they’re designed to prevent. The robot is part of a new generation of collaborative technology, designed to break down the walls between people and machines.
These ‘cobots’ are cheaper, smaller and more versatile than traditional manufacturing robots. They can help human workers to lift heavy parts, or take care of the boring, repetitive aspects of a task such as lifting objects off a conveyor belt or putting them in boxes. They can be set up in a matter of hours, compared to weeks of specialised programming, and can be switched between tasks in minutes. Workers can teach a cobot new skills by moving its arms in the desired pattern.
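This teach-by-demonstration workflow can be sketched in a few lines of code: while the worker guides the arm by hand, the controller samples the joint angles at a fixed rate, then replays the recorded trajectory. The class below is a hypothetical illustration, not the actual API of any cobot manufacturer.

```python
# Hypothetical sketch of kinesthetic teaching ("programming by
# demonstration"): the operator moves the arm by hand, the controller
# records joint angles at a fixed rate, and the robot later replays them.

from dataclasses import dataclass, field


@dataclass
class TeachRecorder:
    rate_hz: float = 50.0                       # sampling rate while teaching
    waypoints: list = field(default_factory=list)

    def record(self, joint_angles):
        """Store one snapshot of the arm's joint angles (radians)."""
        self.waypoints.append(tuple(joint_angles))

    def playback(self):
        """Yield the recorded waypoints in order for the motion controller."""
        yield from self.waypoints


# Usage: while the worker guides the arm, record() is called each control tick.
rec = TeachRecorder()
for pose in [(0.0, 0.5, 1.2), (0.1, 0.6, 1.1), (0.2, 0.7, 1.0)]:
    rec.record(pose)
trajectory = list(rec.playback())
```

The point of the design is that no programming expertise is needed: the demonstration itself is the program, which is why a cobot can be redeployed to a new task in minutes rather than weeks.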
“They’re essentially PCs with arms,” says Jim Lawton, chief product and marketing officer at Rethink Robotics, which makes Baxter and its one-armed brother Sawyer.
Albrecht Hoene also draws an analogy with the early days of personal computing. He is the director of human-robot collaboration at KUKA, a German company that makes the LBR iiwa collaborative robot. “It’s not a replacement, it’s a support,” he says.
Currently, the market for cobots is small. They made up only 5% of the 240,000 manufacturing robots sold in 2015. But, with an average price of just $24,000, they’re expected to take off in the coming years among smaller companies – firms that lack the funds and programming expertise for traditional robots, or those that want to produce personalised products at a mass-production price point.
According to analysts, the market for cobots will grow by 40% a year for the next five years, and could reach £3 billion by 2020. They’re expected to move beyond the factory floor to start providing support and assistance in public spaces and in our homes. But first, researchers and robotics manufacturers need to make them safer, better and more flexible. To do that, they need to make them more human.
At the Audi factory in Hungary, a pair of robots at the end of the production line works in tandem, taking measurements from the finished cars as they roll past. The workers have given them nicknames. They call them Adam and Eve.
In Belgium, there’s a cobot nicknamed Little Geert, after its human operator, while at other factories cobots have been dressed up in the colours of the local sports teams. Ford released a slick video showing a cobot pouring itself a coffee and joining its team on a refreshment break.
This behaviour is not unusual. There is an adjustment period at first, but the humans soon grow to trust their robot colleagues. Manufacturers go to great lengths to create a comfortable working relationship. That’s why Baxter has a face with a range of emotions. It can, for example, express confusion if it can’t complete a task and needs human input.
But that’s not all. “There’s an aspect of the design we refer to as ‘anticipatory artificial intelligence’,” says Lawton. When Baxter or Sawyer is about to move an arm towards an object, it will look towards it first, just as a human would. The worker knows what to expect, and that builds trust. “If I move my arm in an unpredictable, aggressive manner, that would be startling, which is what a robot can do if you don’t infuse it with some of these kinds of characteristics,” he says.
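The behaviour Lawton describes boils down to a simple ordering of actions: signal the move, then make it. A toy sketch, with invented function names standing in for the robot's real controls:

```python
# Illustrative sketch of "anticipatory" behaviour: the robot telegraphs
# its next action before executing it, so a nearby worker isn't startled.
# The callables here are invented placeholders for real robot commands.

def pick(object_position, look_at, move_arm_to, grasp):
    look_at(object_position)       # signal intent first...
    move_arm_to(object_position)   # ...then move, as a human would
    grasp()


# A stub run that records the order of events instead of moving hardware.
events = []
pick((0.4, 0.1),
     look_at=lambda p: events.append(("look", p)),
     move_arm_to=lambda p: events.append(("move", p)),
     grasp=lambda: events.append(("grasp",)))
```

The "gaze" carries no mechanical function at all; its only job is to make the robot's next move legible to the person standing beside it.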
For robots without faces, it’s a bit harder to build trust. At KUKA they do it with different colours of LED lights, while at Arizona State University assistant professor Heni Ben Amor is working on a way for robots to literally project their intentions onto the environment.
His lab has created an augmented-reality approach, where the robot uses a projector to highlight objects it’s about to reach for, or to illuminate the route it’s going to take. “The environment becomes a canvas for the robot to communicate its intent to the human partner,” says Ben Amor.
The way the robots move is also important. KUKA’s robotic arm is programmed not to make movements that are fast and startling, and, although its ‘elbow’ is capable of rotating through 180°, it generally limits its range of movement to what a human could do. “Of course it can do more,” says Hoene. “But we should take care of what a human would expect.” This kind of technology is building trust between man and machine on the factory floor. But it’s only the start of a long learning curve.
High-fives and handshakes
In a London lab that smells of sweat and solder, Petar Kormushev and his students at Imperial College are putting Robot DE NIRO through its paces. The cobot lacks the emotional range of its Hollywood counterpart, but it is quickly learning how to interact with humans.
DE NIRO (which stands for Design Engineering’s Natural Interaction Robot) consists of a Baxter unit mounted on a wheeled platform. There are several other modifications, including infrared sensors, a torso that can be raised and lowered, and a shiny red hat.
The robot’s been learning to complete various tasks, including helping a human to carry a table, hitting a puck with a hockey stick, and completing a buzz-wire task, where it has to pass a loop of wire around a winding course without touching the sides.
In the first task, DE NIRO holds up one end of a sheet of wood while I hold the other, and responds to my movements thanks to force sensors built into its arms. If I push the table backwards, the robot will move backwards, and, if I pull it towards me, the robot moves too.
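One standard way to implement this kind of force-following is admittance control, in which the force measured at the arm is mapped to a commanded velocity for the base. The sketch below is a minimal single-axis version with illustrative gains; it is not the controller actually running on DE NIRO.

```python
# A minimal sketch of admittance control: sensed force becomes commanded
# velocity. Gains and the deadband threshold are illustrative values,
# not those of any real cobot.

def admittance_velocity(force_newtons, gain=0.02, deadband=2.0):
    """Map a sensed force along one axis to a commanded velocity (m/s).

    Forces inside the deadband are ignored so sensor noise doesn't make
    the robot drift; beyond it, velocity scales linearly with force.
    """
    if abs(force_newtons) <= deadband:
        return 0.0
    sign = 1.0 if force_newtons > 0 else -1.0
    return gain * (force_newtons - deadband * sign)


# Pushing the table away (positive force) drives the robot backwards;
# pulling it (negative force) draws the robot towards the worker;
# a small jostle inside the deadband leaves it where it is.
push = admittance_velocity(12.0)
pull = admittance_velocity(-12.0)
noise = admittance_velocity(1.0)
```

The deadband is the important design choice: without it, the constant small forces from sensor noise and the table's own weight would make the robot wander.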
Teaching robots to understand this kind of physical interaction is complicated, particularly if you want them to be able to transfer the skills they’ve learned to other tasks. “You need algorithms that are able to generalise,” says Kormushev. “General artificial intelligence.”
You can use simulators to speed up the learning process, although at Imperial they prefer to teach the robots using real-world experience. Linking cobots together is another way of speeding up the learning process. The Million Object Challenge is already using Baxter units all over the world to build a shared database of how to grasp and use different items.
“These robots will start to learn from their own experience,” says Lawton. “They’ll share that information in the cloud so that other robots can learn from the experience to allow them all to get better.”
At Arizona State University, Ben Amor is working on robots that can learn from watching two humans interact – they’re equipped with the movement-tracking Xbox Kinect camera. Similar work at the University of California, Los Angeles has taught a Baxter to respond to handshakes, high-fives and other interactions that rely on subtle variations in the body position of its human partner. There is a world of subtle cues from body language that can differ across cultures and between individuals, and which robots need to master. “We need robots that learn the individual preferences of the human partner,” says Ben Amor.
There’s an exciting future ahead. Rethink Robotics is already tinkering with the voice-recognition capabilities of the Amazon Echo personal assistant, for example. According to Lawton, the next generation of cobots will be able to use their cameras to distinguish between people, or even detect human emotion.
Artificial intelligence and machine learning will allow the robots to start making their own decisions. “The analytics are going to allow the robots to learn things that people couldn’t,” says Lawton.
Such developments could take cobots beyond the factory. As cobots get smarter, cheaper and more like humans, they could be used for anything from food preparation in fast-food restaurants to providing assistance for the elderly and disabled. “There’s no doubt in my mind that, relatively soon, collaborative robots will be in every manufacturing environment in the world,” says Lawton. “And it may take a little bit longer, but collaborative robots are also going to be in all of our homes.”
Changing hands – how cobots see the world
Like humans, collaborative robots need to sense and interact with the world around them. Canadian company Robotiq makes a range of ‘end effectors’, which attach to the arms of robots and allow them to do that. These are customisable, but a typical set-up has three main components. The first is the gripper, a robot hand made for manufacturing. Some companies make them with five fingers, but most factories use two or three.
The second is the wrist camera, which allows the robot to recognise objects in its work area. And, finally, the force-torque sensors give the robot a sense of touch, allowing it to detect when it’s grabbed an object. In some cobots, the arms are coated with soft padding in case of collision with a human worker.
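Grasp detection from those sensors can be as simple as a threshold check: if the fingers stall against measurable resistance before closing fully, something solid is in the gripper. A hypothetical sketch, with made-up threshold values:

```python
# Hedged sketch of grasp detection from force and finger-position readings.
# A gripper closing on empty air reaches its target width with little
# resistance; closing on an object stalls early with a reaction force.
# The thresholds below are invented for illustration.

def grasp_detected(grip_force_n, finger_gap_mm,
                   force_threshold=5.0, min_gap_mm=1.0):
    """Return True if the fingers stopped on something solid.

    The gap check guards against a false positive when the fingers
    close completely and press against their own mechanical stop.
    """
    return grip_force_n >= force_threshold and finger_gap_mm > min_gap_mm
```

A real controller would also watch how the force builds over time, but even this one-line test is enough to tell "object in hand" from "closed on air".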
In the future, efforts to increase trust between humans and robots could see cobots built with a soft skin like that of humans.