As reported by IEEE Spectrum:
Super-accurate GPS may soon solve three robocar bugbears: bad weather, blurred lane markings, and over-the-horizon blind spots. These are things that cameras and lidar can’t always see through, and radar can’t always see around.
A group led by Todd Humphreys, an aerospace engineer at the University of Texas at Austin, has just tested a software-based system that can run on the processors in today’s cars, using data from scattered ground stations, to locate a car to within 10 centimeters (4 inches). That’s good enough to keep you smack in the middle of your lane all the time, even in a blizzard.
“When there’s a standard deviation of 10 cm, the probability of slipping into the next lane is low enough—meaning 1 part in a million,” he said. Today’s unaided GPS gives meter-plus accuracy, which raises the chances that the car will veer outside of its lane to maybe 1 part in 10, or even more, he adds.
Those aren’t desirable odds, particularly if you’re driving a semi. Lane-keeping discipline is non-negotiable for a robocar.
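To see how the two error levels compare, here is a minimal sketch of the underlying arithmetic. It treats the car's lateral position error as zero-mean Gaussian and computes the chance of drifting past an assumed lane margin; the 0.5 m margin is an illustrative assumption, not a figure from Humphreys. With that assumption, the 10-cm case comes out on the order of one in a million, while the meter-level case is far worse.

```python
# Back-of-envelope check of the lane-departure odds quoted above.
# Assumptions (not from the article): lateral error is zero-mean Gaussian,
# and the car has roughly 0.5 m of margin before it crosses into the next lane.
from math import erf, sqrt

def p_outside_lane(sigma_m, margin_m=0.5):
    """Probability that a Gaussian lateral error exceeds the lane margin (either side)."""
    # Two-sided tail probability: P(|error| > margin)
    return 1.0 - erf(margin_m / (sigma_m * sqrt(2.0)))

for sigma in (0.10, 1.0):   # 10 cm augmented GPS vs. roughly 1 m unaided GPS
    print(f"sigma = {sigma:4.2f} m -> P(outside lane) = {p_outside_lane(sigma):.2e}")
```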
The Texas team, which was backed by Samsung, began with the idea of giving smartphones super-GPS-positioning power. But though that idea worked, it was limited by handsets’ inadequate antennas, which neither Samsung nor any other vendor is likely to improve unless some killer app should come along to justify the extra cost.
“We pivoted then, to cars,” Humphreys says.
Humphreys works on many aspects of GPS; just last month, he wrote for IEEE Spectrum on how to protect the system from malicious attack. He continues to do basic research, but he also serves as the scientific adviser to Radiosense, a firm his students recently founded. Ford has recently contacted them, as has Amazon, which may be interested in using the positioning service in its planned fleet of cargo-carrying drones. Radiosense is already working with its own drones—“dinner-plate-size quadcopters,” Humphreys says.
Augmented GPS has been around since the 1980s, when it finally gave civilians the kind of accuracy that the military had jealously reserved for itself. Now the military uses it too, for instance to land drones on aircraft carriers. It works by using not just satellites’ data signals, but also the carrier signals on which the data are encoded. And, to estimate distances to satellites without being misled by the multiple pathways a signal may take, these systems use a range of sightings—say, taken as a satellite moves across the sky. They then use algorithms to locate the receiver on a map.
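The article doesn't detail the algorithm, but the core step of any such system, turning many satellite range measurements into one position estimate, can be sketched as a nonlinear least-squares solve. The sketch below is an illustrative assumption, not Radiosense's code: it ignores the receiver clock bias, the carrier-phase ambiguities, and the multipath handling that the article alludes to, and the toy satellite geometry and function names are invented for the example.

```python
# A minimal sketch (assumed, not the Texas team's method): estimate a receiver
# position by nonlinear least squares from range measurements to satellites at
# known positions, accumulated over several "sightings."
import numpy as np

def solve_position(sat_positions, measured_ranges, x0=None, iterations=10):
    """Gauss-Newton fix: sat_positions is (N, 3) in meters, measured_ranges is (N,)."""
    x = np.zeros(3) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iterations):
        diffs = x - sat_positions                   # vectors from satellites to receiver
        predicted = np.linalg.norm(diffs, axis=1)   # ranges implied by the current guess
        residuals = measured_ranges - predicted
        J = diffs / predicted[:, None]              # d(range)/d(position), one row per satellite
        dx, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x = x + dx
    return x

# Toy geometry: four satellites and noiseless ranges, so the solver should
# recover the "true" position almost exactly.
sats = np.array([[20.2e6, 0.0, 7.0e6], [0.0, 20.2e6, 7.0e6],
                 [-20.2e6, 0.0, 7.0e6], [0.0, -20.2e6, 7.0e6]])
truth = np.array([1.2e6, -2.3e6, 0.5e6])
ranges = np.linalg.norm(truth - sats, axis=1)
print(solve_position(sats, ranges))   # converges to `truth`
```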
But until now, it only worked if you had elaborate antennas, powerful processing, and quite a bit of time. It could take one to five minutes for the algorithm to “converge,” as the jargon has it, onto an estimate.
“That’s not good, I think,” Humphreys says. “My vision of the modern driver is one who’s impatient, who wants to snap into 10-cm-or-better accuracy and push the ‘autonomy’ button. Now that does require that the receiver be up and running. But once it’s on, when you exit a tunnel, boom, you’re back in.” And in your own lane.
Another drawback of existing systems is cost. “I spoke with Google,” says Humphreys. “They gave me a ride in Mountain View, Calif., in November, and I asked them at what price point this would be worth it to them. They originally had this Trimble [PDF] thing—$60,000 a car—but they shed it, thinking that that was exorbitant. They [said they] want a $10,000 [total] sensor package.”
The Texas student team keeps the materials cost of the receiver system at just $35 per car, running their software-defined system entirely on a $5 Raspberry Pi processor. Of course, the software could piggyback, almost unnoticed, on the powerful robocar processors that are coming down the pike from companies like Nvidia and NXP.
Just as important as the receivers is the ground network of base stations, which the Texas team has shown must be spaced within 20 kilometers (12 miles) for full accuracy. And, because the students’ solar-powered, cellphone-network-connected base stations cost only about $1000 to build, it wouldn’t be too hard to pepper an entire region with them.
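As a rough illustration of how cheaply a region could be blanketed at that spacing and price, here is a back-of-envelope estimate. The square-grid placement and the example region size are my assumptions for the sake of arithmetic, not figures from the article.

```python
# Rough coverage estimate (assumptions: square-grid placement at the 20 km
# spacing and the quoted ~$1,000 per station; the example area is arbitrary).
import math

def stations_needed(area_km2, spacing_km=20.0, cost_per_station=1_000):
    """Estimate station count and cost to cover an area with a square grid."""
    cells = math.ceil(area_km2 / spacing_km**2)   # one station per spacing x spacing cell
    return cells, cells * cost_per_station

# Example: an area roughly the size of Texas (~695,000 square kilometers)
count, cost = stations_needed(695_000)
print(f"~{count} stations, ~${cost:,}")
```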
You’d need more stations per unit of territory where satellite signals get bounced around or obscured, as is the case in the heavily built-up parts of most cities. It’s tough, Humphreys admits, in the urban canyons of Manhattan. Conveniently, though, it is in just such boxed-in places that the robocar’s cameras, radar, and lidar work best, thanks to the many easily recognized buildings there that can serve as landmarks.
“Uber’s engineers hate bridges, because there are not a lot of visual features,” Humphreys notes. “They would do well to have 10-cm precise positioning; it can turn any roadway into a virtual railway.”
So, what’s next, after cars? Humphreys is still looking for the killer app to justify super-accurate GPS in handheld systems.
“We’re looking into outdoor virtual reality,” Humphreys says. “You could put on a visor and go on your favorite running trail, and it would represent it to you in a centimeter-accurate way, but artistically enhanced—maybe you’d always have a blue sky. You could craft the world to your own liking.” While staying on the path, of course.