Will Nissan Beat Google and Uber to Self-Driving Taxis?

Japanese automaker plans fleet of prototype autonomous cabs within two years

These Nissan taxis are conventional gasoline-powered vehicles, driven by humans. But Nissan is collaborating with NASA to develop a fleet of electric autonomous cabs.
Photo: Nissan

Who will build the first robot taxis? Google has a working prototype but no experience in manufacturing cars. Uber, meanwhile, knows the transportation business but has only just started working on autonomous vehicles with Carnegie Mellon University.

Documents obtained by IEEE Spectrum suggest the first cab capable of driving itself (and that you won’t feel obliged to tip) might be made by Nissan. In January, the Japanese automaker announced that it would be working with NASA to “demonstrate proof-of-concept remote operation of autonomous vehicles for the transport of . . . goods . . . and people.” Using a California Public Records Act request, Spectrum has uncovered more details about the technologies Nissan and NASA plan to share and, more important, learned that the main goal of their collaboration appears to be the development of a fleet of remotely supervised autonomous taxis.

The documents reveal that Nissan has set an aggressive schedule for the project, planning to have prototype cars operating within two years. Google and Uber are both rumored to be pursuing the same goal, so a race for the mythical robo-taxi is on. (A European consortium is in the race too.)

Nissan has long been a champion of self-driving cars, with its chairman and CEO Carlos Ghosn claiming the company would introduce autonomous vehicles by 2018. The company also makes taxis, such as the NV200 “Taxi of Tomorrow,” which is about to replace New York City’s traditional Ford Crown Victoria cabs. An all-electric model, the e-NV200, is already on sale in Europe.

The agreement between the Nissan Research Center Silicon Valley and NASA’s Ames Research Center calls for the space agency to “provide expertise to develop and test supervisory control of multiple autonomous vehicles for transport service.” If this sounds futuristic, it’s because it is: the project will include modifying NASA’s software for operating planetary rovers, visualizing their surroundings, and controlling them with telerobotic interfaces. NASA will also assist Nissan in the design, development, testing, and assessment of prototypes.

An all-electric Nissan Leaf equipped with autonomous drive systems parked at NASA’s Ames Research Center in Silicon Valley. Photo: NASA/Ames/Dominic Hart

That assessment will happen right at NASA Ames, a complex of buildings, labs, and hangars in the heart of Silicon Valley. A number of streets and parking lots will act as a practical test bed for Nissan’s autonomous electric vehicles (Leafs at first), potentially alongside Google’s self-driving cars, which are also due to begin testing there this year. During testing, those parking lots will be closed to all other vehicles and to pedestrians. In return, Nissan will pay NASA an undisclosed sum. Nissan declined to comment on this story.

One of the technologies that NASA will modify is its Robot Application Programming Interface Delegate (RAPID), open source software that simplifies communications between robots and their command-and-control systems. RAPID has been used with walking and flying robots, as well as in an experiment involving a wheeled rover on Earth controlled from the International Space Station. Nissan will also make use of NASA’s Vision Workbench, an image processing and computer vision library, and algorithms from NASA’s rover software for robotic exploration.
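The documents don’t spell out RAPID’s message formats, but the basic pattern of a delegate layer shuttling telemetry up to a command-and-control station and commands back down can be sketched roughly as follows. The Python below is purely illustrative, with hypothetical names and fields; it is not RAPID’s actual API.

```python
# Purely illustrative sketch of a delegate layer relaying telemetry and
# commands between a vehicle and a supervisory station. Names, fields,
# and logic are hypothetical, not RAPID's actual API.
from dataclasses import dataclass
from queue import Queue

@dataclass
class Telemetry:
    vehicle_id: str
    position: tuple          # (latitude, longitude), illustrative
    speed_mps: float
    status: str              # e.g. "nominal" or "needs_assistance"

@dataclass
class Command:
    vehicle_id: str
    action: str              # e.g. "continue" or "hold"

telemetry_bus: Queue = Queue()   # vehicle -> supervisor
command_bus: Queue = Queue()     # supervisor -> vehicle

def vehicle_reports(t: Telemetry) -> None:
    """A vehicle publishes its current state to the supervisory station."""
    telemetry_bus.put(t)

def supervisor_step() -> None:
    """The supervisor reads pending telemetry and delegates a command back."""
    while not telemetry_bus.empty():
        t = telemetry_bus.get()
        action = "hold" if t.status == "needs_assistance" else "continue"
        command_bus.put(Command(t.vehicle_id, action))

vehicle_reports(Telemetry("leaf-01", (37.41, -122.06), 4.2, "nominal"))
supervisor_step()
print(command_bus.get())   # Command(vehicle_id='leaf-01', action='continue')
```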

Particularly useful for Nissan is likely to be NASA’s Virtual Environment for Remote Visual Exploration (VERVE). This interactive 3D visualization tool was designed to incorporate multiple data feeds from Mars rovers, including stereo video cameras, LIDAR systems, digital compasses, and inertial measurement units—some of the same sensors found on self-driving vehicles today.
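To make the idea concrete, here is a rough sketch of what bundling those feeds into a single timestamped frame for a 3D viewer might look like. The field names and values are assumptions for illustration, not VERVE’s data model.

```python
# Illustrative bundle of the kinds of sensor feeds a tool like VERVE
# visualizes. Field names and values are hypothetical, not VERVE's data model.
from dataclasses import dataclass
from typing import List, Tuple
import time

@dataclass
class SensorFrame:
    timestamp: float
    stereo_depth: List[float]                         # stand-in for a depth image
    lidar_points: List[Tuple[float, float, float]]    # (x, y, z) returns
    heading_deg: float                                # digital compass
    imu_accel: Tuple[float, float, float]             # (ax, ay, az)

def collect_frame() -> SensorFrame:
    """Gather one synchronized snapshot for the viewer to render."""
    return SensorFrame(
        timestamp=time.time(),
        stereo_depth=[1.8, 2.1, 2.4],                        # dummy values
        lidar_points=[(1.0, 0.2, 0.0), (3.5, -0.8, 0.1)],
        heading_deg=92.5,
        imu_accel=(0.01, 0.00, 9.81),
    )

frame = collect_frame()
print(f"frame at {frame.timestamp:.0f}, {len(frame.lidar_points)} lidar returns")
```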

“Typically we work with rovers to explore terrain that has not been well mapped,” says a document written by researchers at NASA’s Intelligent Robotics Group, which developed VERVE. “As the rovers traverse an area, they build more accurate maps.” This fits well with Nissan’s autonomous vehicle technology. Unlike Google’s self-driving cars, which rely on ultra-detailed maps accurate to centimeters, Nissan uses what it calls “sparse maps,” created with third-party data that offers much less detail. If effective, this approach promises to be easier to scale than Google’s expensive and data-intensive maps, which even now cover little more than the company’s hometown of Mountain View.
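A back-of-the-envelope comparison shows why the sparse approach appeals: a centimeter-scale grid stores orders of magnitude more data per square kilometer than a handful of lane edges, signs, and waypoints. The numbers in this toy calculation are assumptions, not figures from Nissan or Google.

```python
# Toy comparison of dense vs. sparse map storage; both numbers are
# assumed for illustration, not actual figures from Nissan or Google.
dense_cells_per_km2 = int((1000 / 0.05) ** 2)    # 5 cm grid over 1 km^2
sparse_features_per_km2 = 2000                    # assumed lane edges, signs, waypoints

print(f"dense grid cells per km^2: {dense_cells_per_km2:,}")
print(f"sparse features per km^2:  {sparse_features_per_km2:,}")
print(f"roughly {dense_cells_per_km2 // sparse_features_per_km2:,}x fewer items to store")
```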

According to the Nissan-NASA agreement, the taxi demonstration will integrate a service-oriented software architecture, road map, and database. Software will include a “telerobotic user interface,” “real-time performance monitoring,” and “automatic event summarization,” the documents say. Any lessons learned are intended to “enable Nissan North America to better plan for development and commercialization of autonomous vehicles and applications.”
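What “real-time performance monitoring” and “automatic event summarization” might boil down to in practice is a loop that scans the fleet’s telemetry and rolls notable events up into a report. The sketch below is a guess at that pattern; the event types and thresholds are invented for illustration and do not come from the agreement.

```python
# Hypothetical sketch of performance monitoring with event summarization.
# Event types and thresholds are invented; they are not from the agreement.
from collections import Counter

def summarize_events(telemetry_log):
    """Scan a list of status records and tally notable events."""
    events = []
    for rec in telemetry_log:
        if rec["speed_mps"] > 11.0:                 # assumed campus speed cap (~25 mph)
            events.append("overspeed")
        if rec["status"] == "needs_assistance":
            events.append("operator_assist")
    return Counter(events)

log = [
    {"vehicle_id": "leaf-01", "speed_mps": 6.0,  "status": "nominal"},
    {"vehicle_id": "leaf-02", "speed_mps": 12.3, "status": "nominal"},
    {"vehicle_id": "leaf-01", "speed_mps": 3.2,  "status": "needs_assistance"},
]
print(summarize_events(log))   # e.g. Counter({'overspeed': 1, 'operator_assist': 1})
```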

That commercialization won’t start at Ames, however, as workers at the base are forbidden from using the prototypes as their own personal taxis. The agreement stipulates: “Ames personnel will not participate in testing involving the transport of humans in conjunction with their daily official duties or their personal activities (ie transiting to meetings, lunch etc).” To avoid robotic traffic jams, NASA also reserves the right to limit the number of Nissan vehicles on the Ames campus at different times of day.

Nissan already has a thriving taxi business and some of the most accomplished autonomous vehicle prototypes in the world. If NASA can swiftly adapt its rover technologies into the supervisory and control systems Nissan needs, sleekly space-age robot taxis might be pulling up to the curb sooner than many people think.
