Toyota today revealed some of the inner workings of an automation package meant to help drivers rather than replace them. The company also said that if that package had been in operation, it could have prevented or mitigated a recent three-car accident in California.
The announcement came at CES 2019, which takes place this week in Las Vegas. [Disclosure: Toyota paid for the author’s expenses to attend CES.]
Toyota has often spoken of its two-stage research project for self-driving cars. In the long run, it plans to offer a truly driverless technology called Chauffeur. In the meantime, it will exploit that technology in a driver-assistance feature, called Guardian. Chauffeur is years from commercialization, but Guardian will be installed in some Toyota models “soon,” says Gill Pratt, who leads self-driving research at the Toyota Research Institute.
Today’s reveal is Guardian’s use of a technique pioneered in the automation of fighter jets, in which the pilot moves a stick that sends commands not directly to the plane but to a computer system. That system draws on various data streams to fine-tune the pilot’s commands millisecond by millisecond, thus keeping the plane within its aerodynamic limits—its “flight envelope.” Some modern military jets couldn’t stay airborne for long without a computer as an intermediary.
Toyota refers to its automotive version of this idea as blended envelope control, emphasizing what it calls the smooth, “near-seamless blend” of the efforts of man and machine. And though staying airborne isn’t on the agenda, the company maintains that envelope control is much harder to sustain in a car than in a fighter jet. A car must notice what’s going on in the immediate environment—say, a truck passing on the left—and predict what’s about to happen—say, a pedestrian who may or may not cross the road.
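The basic idea can be sketched in a few lines of code. This is a minimal illustration of blending, not Toyota’s algorithm: all names, numbers, and the lane-keeping scenario are invented for the example.

```python
# Illustrative sketch of "blended envelope control": the driver's steering
# command is not passed straight to the wheels. Instead it is mixed with a
# corrective command, weighted by how close the car is to the edge of its
# safe envelope. Everything here is an assumption made for illustration.

def blend_steering(driver_cmd, lateral_offset, lane_half_width=1.8):
    """Return the steering command actually applied to the wheels.

    driver_cmd      -- driver's requested steering angle (radians, + = left)
    lateral_offset  -- car's offset from lane center (meters, + = left)
    lane_half_width -- distance from lane center to lane edge (meters)
    """
    # Urgency rises smoothly from 0 (centered in lane) to 1 (at the edge).
    urgency = min(abs(lateral_offset) / lane_half_width, 1.0)

    # Corrective command steers back toward the lane center.
    correction_gain = 0.5  # radians per meter of offset (invented value)
    corrective_cmd = -correction_gain * lateral_offset

    # Near the center, the driver is in full control; near the edge, the
    # machine's correction dominates -- a smooth, "near-seamless blend."
    return (1.0 - urgency) * driver_cmd + urgency * corrective_cmd
```

When the car is centered, the output is exactly the driver’s command; as it drifts toward the lane edge, the correction gradually takes over, so the handoff is continuous rather than an abrupt override.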
Besides serving as backup to a human driver, Guardian could also operate with another autonomous driving system, either from Toyota or some other company. Pratt said that though the system was called Toyota Guardian, the company envisaged providing it as a universal backup system: “Guardian for All,” he called it.
He wouldn’t name a company that had agreed to use Guardian as a backup for its own cars, either the conventional or the self-driving kind. Toyota does, however, have a partnership with Uber that involves installing Guardian on a Toyota Sienna test vehicle. Moreover, Toyota has explicitly targeted the technology for the “mobility as a service” market, which today centers on ride-hailing companies, such as Uber.
He showed a video in which a driver swerved repeatedly to stay within a curving path lined by plastic cones, knocking over the odd one. Then the driver re-ran the course with Guardian on and stayed within the lines without fail. The driver might almost forget the help he’s getting, and attribute the success to his own powers.
The point is to provide safety through redundancy, a key feature of most mission-critical systems, such as those in spacecraft. But putting a machine between the driver and the car places a heavy burden on the software. If a glitch in the code arises, you could find yourself in a heap of trouble. And if the machine takes over too much of the work, the driver might grow complacent or lose their seat-of-the-pants flying (or driving) skills. Pilots call this the paradox of automation, and it appears to have played a part in the recent crash near Jakarta of a Boeing 737 operated by Lion Air.
To ensure safety, early fly-by-wire airplanes paired the electronic controls with legacy mechanical linkages to actuators, such as control surfaces. But that meant renouncing the weight-saving advantages of flying “by wire.”
Another way of staying safe is by using the computer very sparingly at first. Engineers can then gradually ramp up the automation without risking much. That slow-and-steady approach would seem to be Toyota’s plan with Guardian.
Pratt says that the hard part about training robocars is in gathering enough of the right kind of experiences, particularly the dangerous-but-rare events known as corner cases. What makes such cases so hard to anticipate is their combination of several interacting problems.
Pratt recounted a recent corner case that involved a Toyota test car that had no self-driving system running when it was gathering highway data in the San Francisco Bay Area. The car found itself “in the thick” of a three-car accident, one that produced no injuries. Afterward, Toyota engineers were able to reconstruct the event in silico, then re-enact it on a test track using cars equipped with Guardian. Pratt says that the tests determined that Guardian could have mitigated or even prevented the accident.
Toyota’s own test car will soon be of a sleeker stamp than usual: a modified Lexus LS 500h that houses all the sensors in a single rooftop package. That cleaner line contrasts with today’s typical robocar, which sprouts more antennas than Sputnik. And the computing hardware will fit neatly into the car, unlike the junk-pile trunks of today’s testers. Among the sensors are a new radar designed to provide better near-range detection and additional cameras to cover potential blind spots.
This story was updated on 7 January and on 12 January.
Philip E. Ross is a senior editor at IEEE Spectrum. His interests include transportation, energy storage, AI, and the economic aspects of technology. He has a master's degree in international affairs from Columbia University and another, in journalism, from the University of Michigan.