Building good mobile navigation is super hard. So why is Uber trying it?


Redesigning in-app navigation in hopes drivers might actually use it.

Manik Gupta, the head of Uber’s maps product team, describes the current navigation options in its iOS app as “very, very primitive,” while the Android app lacks navigation completely. The coming update, however, will seriously bolster in-app navigation features for Uber drivers with some help from industry veteran TomTom. Some industry analysts also see this as a possible next step toward a self-driving fleet.

“When we started thinking about building this navigation experience for drivers,” he says, “we realized a big difference between how people traditionally navigate from point A to point B, versus what drivers do on Uber.” Just think about the complexities of uberPool, for example, where a driver is grabbing multiple passengers while trying to get them all to their destinations in an efficient way.

Uber touts new navigation features like a gentle-on-the-eyes night mode, a system designed to be seen from a phone that’s 3 feet away and mounted on a dashboard, and a preview of the very first turn a driver has to make after pulling away from a curb.

They have “redesigned everything from the ground up,” Gupta says; part of that includes the integration of TomTom’s navigation data. It’s the beginning of a longer process, and Gupta points out that drivers can still use whatever navigational app they want.

Their goal is to “build an experience that is really, really integrated in the app,” Gupta says, custom-made for the complex situations Uber drivers are in. “Obviously we have a long way to go.”

But mapping is complicated. (Just ask Apple, whose first version of Apple Maps was heavily criticized.) It’s a point that Gupta, who used to work at Google Maps, appreciates.

Christopher Mertz, the principal project scientist at Carnegie Mellon’s Robotics Institute, and a cofounder of a company called Roadbotics, has a hunch that there might be more to the story.

“Right now, they’re doing it for navigation,” he says. “But I think even more importantly it will be for their autonomous driving.”

Mertz focuses his research on the perception systems of autonomous cars. He says that map-makers need to keep in mind factors like scale and accuracy, and deal with incorporating data from multiple sources. Self-driving cars, he points out, need to have very fine-grained maps, down to the inch or smaller.

Autonomous driving—which Uber has already piloted in Pittsburgh—requires dealing with variables like, say, whether vegetation is obstructing a view around a corner, Mertz points out. As Uber takes more control over the navigation process, it could help them learn more, he says. “Maps are very important for autonomous driving.”

Part of the broader context is that Waymo (a part of Google parent company Alphabet) is suing Uber over technology relating to self-driving vehicles, although an Uber spokesperson strongly emphasized that the company’s mapping and navigation initiative is completely unrelated to that ongoing litigation. On another point, Gupta says that Google’s tech was never part of Uber’s driver-facing navigation features in its app, and still is not.

As for self-driving cars, Gupta says they have “nothing to announce on that right now,” emphasizing that this update is really about giving the drivers a better experience. “From my perspective, that is what we are focused on right now.”

REF: popsci
