The system, called TurboTrack, could also be applied in manufacturing to enable greater collaboration and precision between robots working on packaging and assembly.
Created by researchers at the Massachusetts Institute of Technology, TurboTrack uses low-cost RFID tags, which can be applied to any object. A reader sends a wireless signal that reflects off RFID tags and other nearby objects, rebounding to the reader. An algorithm sifts through the reflected signals to isolate the RFID tag's response, then final computations use the tag's movement to refine its localisation. The researchers showed that robots using the system could locate tagged objects within an average of 7.5 milliseconds, with an error of less than 1cm.
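The article does not detail TurboTrack's algorithm, but the general idea of using a tag's movement across successive readings to sharpen noisy position fixes can be illustrated with a standard motion-model filter. The sketch below is purely illustrative, not MIT's actual method: a 1-D constant-velocity Kalman filter in Python, with made-up reading rates and noise figures.

```python
import random
import statistics

def kalman_track(measurements, dt, meas_var=4.0, accel_var=50.0):
    """1-D constant-velocity Kalman filter over noisy position fixes (cm).

    Each step predicts the tag's next position from its estimated
    velocity, then corrects the prediction with the new measurement.
    meas_var is the measurement noise variance; accel_var is the
    assumed process (acceleration) noise. Both values are illustrative.
    """
    x, v = measurements[0], 0.0          # state: position, velocity
    P = [[meas_var, 0.0], [0.0, 100.0]]  # state covariance
    estimates = []
    q = accel_var
    for z in measurements:
        # Predict: x' = F x, P' = F P F^T + Q (constant-velocity model).
        x += dt * v
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt**2 * P[1][1] + 0.25 * dt**4 * q,
              P[0][1] + dt * P[1][1] + 0.5 * dt**3 * q],
             [P[1][0] + dt * P[1][1] + 0.5 * dt**3 * q,
              P[1][1] + dt**2 * q]]
        # Update with the new position fix z (Kalman gain K = P H^T / S).
        S = P[0][0] + meas_var
        K0, K1 = P[0][0] / S, P[1][0] / S
        y = z - x
        x, v = x + K0 * y, v + K1 * y
        P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
             [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]
        estimates.append(x)
    return estimates

# Toy demo: a tag drifting at 12 cm/s, read every 7.5 ms with ~2 cm noise.
random.seed(0)
dt = 0.0075
true_pos = [10.0 + 12.0 * dt * i for i in range(400)]
raw_fixes = [p + random.gauss(0.0, 2.0) for p in true_pos]
smoothed = kalman_track(raw_fixes, dt)
raw_rmse = statistics.mean((m - t) ** 2 for m, t in zip(raw_fixes, true_pos)) ** 0.5
kf_rmse = statistics.mean((e - t) ** 2 for e, t in zip(smoothed, true_pos)) ** 0.5
```

Because successive readings are consistent with the tag's motion, the filtered track's error comes in below the raw per-reading noise, which is the intuition behind refining a tag's localisation from its movement.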
To validate the system, the team attached one RFID tag to a bottle cap and another to a bottle. A robotic arm located the cap and placed it onto the bottle, held by another robotic arm. In another demonstration, the researchers tracked RFID-equipped ‘nanodrones’ during docking, manoeuvring and flying. In both tasks, the system was reportedly as accurate and fast as traditional computer-vision systems, while working in scenarios where computer vision fails.
"If you use RF signals for tasks typically done using computer vision, not only do you enable robots to do human things, but you can also enable them to do superhuman things," said Fadel Adib, an assistant professor and principal investigator in the MIT Media Lab, and founding director of the Signal Kinetics Research Group. "And you can do it in a scalable way, because these RFID tags are only three cents each."
Nanodrones currently use computer vision and other methods to stitch together images and find objects or other drones. However, the researchers said they often get confused in chaotic areas, lose each other behind walls, and cannot uniquely identify each other – limiting their ability to spread out over an area and collaborate to search for a missing person, for example. Using the researchers' system, nanodrones in swarms could better locate each other, for greater control and collaboration.
"You could enable a swarm of nanodrones to form in certain ways, fly into cluttered environments, and even environments hidden from sight, with great precision," said first author of the research paper, Zhihong Luo.
The researchers said the system could replace computer vision for some robotic tasks. As with its human counterpart, computer vision is limited by what it can see, and it can fail to notice objects in cluttered environments. Radio frequency signals have no such restrictions – they can identify targets without visualisation, within clutter and through walls. In manufacturing, this could enable robot arms to be more precise and versatile in picking up, assembling and packaging items along an assembly line.
The research will be presented next week at the USENIX Symposium on Networked Systems Design and Implementation in Boston, Massachusetts.
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.