Astronauts on the International Space Station live with a robot co-worker.
The humanoid Robonaut 2 has been on board since 2011. While it can help humans with tasks such as removing air filters, it struggles to move around on its own – because GPS doesn’t work well 400km above Earth.
Computer scientist Dezhen Song at Texas A&M University is working with Nasa to help the robot get from A to B without getting lost. The robot has arms and hands to move and pass tools, and is intended to free up time for the space station crew. As Robonaut doesn’t need life support for spacewalks, it is also ideal for carrying out inspection and maintenance of the station’s exterior.
However, it will not be able to take on this kind of advanced duty until it can build a mental map of the space station and use it to work out where it is – there is no point telling Robonaut to fix an air filter in the bathroom if it doesn’t know how to find it.
This isn’t an easy task, particularly because the 136kg robot must move carefully around the narrow corridors of the space station and work closely with human astronauts. “We have little room for error in localisation and obstacle avoidance,” says Song.
At the moment Robonaut has quite a low level of autonomy because it cannot work out where it is, but Song stresses that it is still a “sophisticated humanoid robot”. “Our colleagues at Rice University in Houston, Texas, Professor Lydia Kavraki and Dr Mark Moll, have made striking progress on its motion planning,” says Song.
Robonaut can already move its limbs without colliding with obstacles, provided it is in a familiar, stationary environment, and can perform jobs much like an industrial robot in a work cell. Song’s task is to increase its mobility by providing localisation and mapping using onboard sensors.
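That limb-level competence rests on collision checking, the test at the heart of sampling-based motion planners of the kind the Rice group is known for. Below is a minimal Python sketch of the idea – a toy 2-D stand-in for Robonaut’s joint space with an invented obstacle layout, not the actual flight software: a candidate motion is accepted only if every intermediate configuration stays clear of known, static obstacles, which is exactly why a familiar, stationary environment is enough for arm motions but not for roaming the station.

```python
# Toy collision-checked motion test (assumed 2-D world and obstacle
# positions, for illustration only).
Point = tuple[float, float]

# Obstacles as (centre, radius) circles – an invented layout.
OBSTACLES = [((1.0, 1.0), 0.5), ((2.5, 0.8), 0.4)]

def in_collision(p: Point) -> bool:
    return any((p[0] - cx) ** 2 + (p[1] - cy) ** 2 <= r * r
               for (cx, cy), r in OBSTACLES)

def motion_is_clear(a: Point, b: Point, steps: int = 50) -> bool:
    """Densely sample the straight segment a->b and reject it if any
    sample touches an obstacle. Planners chain many such checks."""
    for i in range(steps + 1):
        t = i / steps
        p = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
        if in_collision(p):
            return False
    return True

print(motion_is_clear((0.0, 0.0), (3.0, 2.0)))  # False: cuts through an obstacle
print(motion_is_clear((0.0, 0.0), (0.0, 2.0)))  # True: stays in free space
```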
Song’s solution for helping Robonaut build a mental map of the places it visits is simultaneous localisation and mapping (Slam). Providing a reliable, low-cost Slam system has been an obstacle for many robotic applications. Cameras can be used instead of laser range finders, but a camera won’t work well in low light and cannot easily stitch together information gathered from more than one location. Cameras also struggle to measure distance accurately.
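The idea behind Slam can be shown in a few lines. The Python sketch below is a deliberately stripped-down, one-dimensional illustration with invented landmark names and noise levels, not Nasa’s code: the robot estimates its own position and the positions of landmarks at the same time, and every re-sighting of a known landmark corrects both the pose and the map.

```python
import random

random.seed(0)

# Invented corridor layout, in metres.
TRUE_LANDMARKS = {"hatch": 2.0, "filter_rack": 5.0, "window": 8.0}

true_pos = 0.0   # where the robot actually is
est_pos = 0.0    # where it thinks it is
est_map = {}     # its growing "mental map" of landmark positions

for step in range(30):
    # 1. Move. Odometry is noisy, so the pose estimate slowly drifts.
    true_pos += 0.3
    est_pos += 0.3 + random.gauss(0, 0.05)

    # 2. Sense the nearest landmark (signed offset along the corridor).
    name, lm = min(TRUE_LANDMARKS.items(), key=lambda kv: abs(kv[1] - true_pos))
    measured = (lm - true_pos) + random.gauss(0, 0.02)

    if name not in est_map:
        # First sighting: place the landmark on the map relative to our pose.
        est_map[name] = est_pos + measured
    else:
        # Re-sighting: disagreement between the map and the measurement
        # corrects BOTH the pose estimate and the map, sharing the blame.
        residual = measured - (est_map[name] - est_pos)
        est_pos -= 0.5 * residual
        est_map[name] += 0.5 * residual

print(f"true position {true_pos:.2f} m, estimated {est_pos:.2f} m")
print({k: round(v, 2) for k, v in est_map.items()})
```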
One idea Song and his team have come up with is to use two cameras mounted a known distance apart, so that depth can be triangulated from the small differences between their images – an approach known as stereo vision. Fixed cameras give a limited view, but Robonaut can move its head from side to side to scan and build up a larger, more detailed field of view. Neck encoder readings let the team track where Robonaut’s head is pointing as it moves.
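The geometry behind stereo vision is compact enough to write down. This sketch uses an assumed focal length and camera separation rather than Robonaut’s real calibration: a point’s depth follows directly from how far its image shifts between the two cameras.

```python
# Hypothetical calibration values, not Robonaut's.
FOCAL_LENGTH_PX = 600.0   # focal length expressed in pixels
BASELINE_M = 0.07         # separation between the two cameras, in metres

def depth_from_disparity(x_left_px: float, x_right_px: float) -> float:
    """Standard pinhole stereo: a point at depth Z appears shifted by
    d = f * B / Z pixels between the two images, so Z = f * B / d."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("a visible point sits further left in the left image")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity

# A handle 1.5m away would shift by 600 * 0.07 / 1.5 = 28 pixels:
print(f"{depth_from_disparity(340.0, 312.0):.2f} m")  # -> 1.50 m
```

The closer the point, the larger the shift, which is why stereo accuracy degrades with distance – and why the wide, scanned view from Robonaut’s moving head helps.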
An inertial measurement unit also provides data about Robonaut’s body movement as it travels around. But the challenge still lies in combining all this data with the various camera images to provide trustworthy Slam data.
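One common way to frame that combination – a deliberately simplified illustration with invented numbers, not the team’s actual algorithm – is a complementary-filter-style blend: trust the smooth but drifting IMU integration most of the time, and pull the estimate towards each absolute visual fix so that drift cannot accumulate without bound.

```python
def fuse(imu_estimate: float, camera_fix: float, camera_trust: float = 0.2) -> float:
    """Blend a drifting dead-reckoned position with a camera fix.
    camera_trust is an assumed weighting, not a tuned value."""
    return (1.0 - camera_trust) * imu_estimate + camera_trust * camera_fix

position = 0.0
imu_velocities = [0.10, 0.11, 0.09, 0.10, 0.12]  # metres per tick, noisy
camera_fixes = {2: 0.31, 4: 0.52}                # sparse visual position fixes

for t, v in enumerate(imu_velocities):
    position += v                                    # IMU dead reckoning
    if t in camera_fixes:
        position = fuse(position, camera_fixes[t])   # visual correction
    print(f"t={t}  estimate={position:.3f} m")
```

Real systems do this probabilistically, weighting each source by its estimated uncertainty, but the principle is the same.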
If Song’s team’s algorithm works on their ground-testing unit, a twin of the Robonaut in space, he says there is “no reason for holding this back”. He adds that loading the code on to the twin in orbit will be much like copying files between PCs.
Song says that developing robotics for space is in some ways simpler than building a robot to do chores on Earth, because our planet presents far more unexpected environments and scenarios. “A robot living in a space station can recognise everything in its surroundings given today’s recognition capabilities and the limited number of items around it,” says Song.
The team hopes to develop the Slam technology for more complicated scenarios on Earth, such as helping vehicles in areas without GPS coverage to find their way. Song is in talks with a couple of firms to optimise Slam for Google Tango, an augmented-reality computing platform. General Motors has also expressed interest in the technology.