A team from the Camera Culture group at MIT’s Media Lab has set out a new approach to “time-of-flight” imaging, which gauges distance by measuring how long projected light takes to bounce back to a sensor.
At a range of 2m, existing time-of-flight systems have a depth resolution of about a centimetre, the group said, meaning the sensor “knows” an object’s distance to within an accuracy of 1cm. This resolution is good enough for assisted parking and collision detection systems on today’s cars but not suitable for self-driving vehicles, the team said.
“As you increase the range, your resolution goes down exponentially,” said electrical engineer and first author Achuta Kadambi. “Let's say you have a long-range scenario, and you want your car to detect an object further away so it can make a fast update decision. You may have started at [a resolution of] 1cm, but now you're back down to a foot or even 5ft. And if you make a mistake, it could lead to loss of life.”
In contrast, the researchers said their new technique improves depth resolution 1,000-fold. At a distance of 2m, the team said its system achieved a depth resolution of three microns, or millionths of a metre. In optical-fibre tests at a range of 500m, the system reportedly achieved a resolution of 1cm.
The technique combines aspects of Lidar – light detection and ranging – and interferometry with principles from acoustics. Lidar measures distance by timing how long a beam of light takes to return to a sensor, while interferometry fires one beam and keeps another bouncing around within the system. When the fired beam returns, the two waves are combined and the system analyses any difference in their alignment, giving a precise measurement of how far the beam travelled – and therefore the distance of cars or other objects on the road ahead.
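The phase comparison described above can be sketched in a few lines. This is an illustrative simplification, not the researchers’ implementation: it assumes a single gigahertz modulation frequency imprinted on the beam, and converts the measured phase shift between the fired and reference waves back into a distance.

```python
import numpy as np

C = 3.0e8      # speed of light, m/s
F_MOD = 1.0e9  # assumed gigahertz modulation frequency, Hz

def distance_from_phase(phase_shift_rad: float) -> float:
    """Convert a measured phase shift into one-way distance.

    The round-trip travel time delays the returning wave relative to
    the reference wave; that delay appears as a phase shift of the
    modulation. (A real system must also resolve the 2*pi wrapping
    ambiguity, which this sketch ignores.)
    """
    delay = phase_shift_rad / (2 * np.pi * F_MOD)  # seconds
    return C * delay / 2                           # halve the round trip

# Simulate a target 2m away: the round trip imprints a phase shift
# on the modulation envelope, which we then invert.
true_distance = 2.0
round_trip_delay = 2 * true_distance / C
phase_shift = 2 * np.pi * F_MOD * round_trip_delay

print(distance_from_phase(phase_shift))  # recovers ~2.0
```

Because a phase shift can be measured to a tiny fraction of a modulation period, this kind of comparison resolves distance far more finely than directly timing the pulse’s return.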
The gigahertz-frequency pulses of light used could also help overcome the problem of seeing through fog, a key obstacle that could limit the roll-out of self-driving cars. Described as a “mortal enemy” of traditional Lidar by electrical engineer Alex Wyglinski in Forbes, fog deflects returning light signals so that they arrive late or at odd angles.
At high frequencies, the deflected signals are more likely to cancel each other out, leaving the true signal intact. Work done at the University of Wisconsin and Columbia University suggests the cancellation would allow sensors to identify the true signal much more easily, the team said.
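The cancellation effect can be illustrated with a toy simulation (an assumption for illustration, not the cited studies’ method): fog-scattered returns arrive with effectively random phases, so summing many of them as phasors tends toward zero, while the direct return stays coherent at full strength.

```python
import numpy as np

rng = np.random.default_rng(0)

n_scattered = 10_000
true_phase = 0.7  # phase of the direct (true) return, arbitrary

# Scattered returns take different paths through the fog, so their
# phases are modelled as uniformly random.
scattered_phases = rng.uniform(0, 2 * np.pi, n_scattered)

# Represent each return as a unit phasor and average what the sensor sees.
true_signal = np.exp(1j * true_phase)
scatter_mean = np.exp(1j * scattered_phases).mean()

print(abs(true_signal))   # 1.0 -- the coherent signal keeps its strength
print(abs(scatter_mean))  # near zero -- random-phase clutter cancels
```

The random-phase average shrinks roughly as 1/√N, so with many scattered paths the fog contribution fades while the direct return remains easy to pick out.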
The research was published in IEEE Access.
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.