How We Gave Sight to the Mercedes Robotic Car
Radar is the key to Mercedes-Benz's autonomous car
It is August 2013, and we are sitting in what looks like a standard S-Class Mercedes, nosing through traffic in a small town in southern Germany. The streets are narrow and jam-packed with cars, and pedestrians are everywhere. Yet nobody has a hand on the wheel, and nobody has a foot anywhere near the pedals. Still, you can’t fault the driving: This car is in charge of itself.
Or herself. We and our colleagues at Daimler call her Bertha, after the wife of Mercedes-Benz founder Karl Benz, who exactly 125 years earlier became the first joyrider in history when she took her two sons for a 100-kilometer jaunt in her husband’s car, from Mannheim to Pforzheim. When the leather brake pads wore out, she found a shoemaker. When the fuel ran out, she bought more from a pharmacist (who marketed it as a cleaning fluid).
Her point was to prove that her husband’s internal-combustion engine was ready for general service. Our point is to retrace her famous route and thus prove that autonomous driving is also a reality in the making. You can see bits and pieces of that future already. Without input from the driver, cars can now park, space themselves out on the highway, hold the center of the lane, and even stop when a crash is imminent. These building blocks of autonomous driving have already saved lives. Putting them together to make the perfect robot chauffeur, however, is still a work in progress.
On its way to Pforzheim, our car had to deal autonomously with a number of highly complex situations, including encounters with roundabouts, crossings, traffic lights, pedestrians, cyclists, and trams. As if steered by an invisible hand, Bertha negotiated heavy traffic and narrow streets; it knew just where to turn, when to change lanes, when to stop, and when to start driving again. How could it see enough to perform these feats?
Look at this S-Class from the outside and you will notice nothing out of the ordinary. Get inside, though, and the first secret appears: Behind the windshield hangs a pair of cameras two hands' breadth apart. Like your eyes, they provide depth perception. On either side of the windshield there are more cameras that operate independently and cover a very wide field of view. Their job is to recognize traffic signs. Add to that eight state-of-the-art radar sensors, invisible from the outside, which provide close to 360-degree coverage around the vehicle, sensing objects from a few centimeters to as much as 200 meters away.
Most present-day automotive radars represent cars, pedestrians, and other moving targets as points on a plane, each with an arrow indicating the target’s speed and direction of motion. That’s not enough information to make Bertha see, though. We had to get the car’s radars to provide all the information a human driver would want.
His hands are off the wheel, but the driver still serves as a backup. Photo: Mercedes-Benz
That was tough. But in the end, we taught the radar to track pedestrians, cyclists, and other vehicles moving through junctions and roundabouts, for example. We coaxed it to provide adequate coverage for making lane changes. And we enabled it to determine the boundaries of the lane the car was in up to 140 meters ahead. We also introduced the first algorithms capable of deducing the dimensions of other vehicles or stationary objects. Even more important, those algorithms can tell the difference between a pedestrian and a fence post.
The main challenge in autonomous driving has always been how to teach the vehicle to know where it is, recognize what it sees, and react appropriately. Just as people recognize objects by taking into account their movement, color, shape, and size, autonomous vehicles are at their best when using many different types of sensors. Those so far include ultrasonic and infrared sensors, optical video systems, laser scanners, and radar.
Google’s celebrated autonomous car employs an elaborate set of sensors. For detecting objects, it uses a laser scanner mounted on the roof and long-range radars affixed to the front of the car. For recognizing traffic signs and signals, it uses a high-resolution video camera. The data from all this equipment are then superimposed on a stored digital map. This multilayered approach allows the car to manage inner-city traffic all by itself. It’s a great technical accomplishment, one that has energized the entire auto industry. However, until the cost of the equipment comes down, this strategy is perhaps not so practical or economical.
Bertha’s design is based on a very different approach, one that relies on compact optical cameras—and radar.
To most drivers, the word radar conjures police radar guns, which detect the speed of a targeted car. Unlike optical systems, radar operates well no matter what the weather, working as it does with microwaves. It measures the speed of one object relative to that of another by means of the Doppler effect, most commonly heard in the changing frequency of a train’s whistle as it approaches and then retreats from you.
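The relationship between speed and frequency shift is simple enough to write down: a target closing at radial speed v shifts the reflected carrier by twice v times the carrier frequency divided by the speed of light (twice, because the wave travels out and back). The sketch below uses the 76-gigahertz automotive band discussed later in this article; the code itself is purely illustrative.

```python
# Radial speed from the Doppler shift of a radar return.
# The reflected signal is shifted by f_d = 2 * v * f_c / c,
# where the factor of 2 accounts for the out-and-back path.
C = 299_792_458.0   # speed of light, m/s
F_CARRIER = 76e9    # 76 GHz automotive radar band

def doppler_shift(radial_speed_mps: float) -> float:
    """Doppler shift (Hz) produced by a target closing at the given speed."""
    return 2.0 * radial_speed_mps * F_CARRIER / C

def radial_speed(doppler_hz: float) -> float:
    """Invert the relation: recover radial speed (m/s) from a measured shift."""
    return doppler_hz * C / (2.0 * F_CARRIER)
```

At 76 GHz even modest speeds produce comfortably measurable shifts: a car closing at 30 meters per second shifts the return by roughly 15 kilohertz.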
You might think that radar would be far easier to use in cars than in airplanes, which after all have been using radar for generations. Cars are slower; they monitor two dimensions, not three; and they look ahead just 200 meters, not kilometers. But don’t forget, the sky is largely empty, reflections are few, and those objects that do appear on a radar screen will certainly be of interest. Here on Earth, though, radar must see through a thousand distractions: Every manhole, every tree, every patch of grass produces reflections.
Solutions that make sense for planes and ships often do not work in cars. The radar sets at airports and on ships typically provide the necessary 360 degrees of coverage by rotating the antenna. But that’s just not practical on a car—the rotating dish would be big and conspicuous, and its moving parts might not last for very long. Then there are the radars that fly high above Earth, in satellites or in airplanes, exploiting synthetic-aperture techniques to provide imagelike representations of stationary elements down below. But that’s no good for a car either, because it doesn’t work up close, and it doesn’t produce images fast enough.
Radar manufacturers thus have had to be very inventive to meet automotive requirements. What has aided them most is the development of compound semiconductors, such as indium gallium arsenide and silicon germanium, which can reach frequencies of 76 gigahertz or higher, making possible sensors that are small enough to fit behind the bumper and yet can distinguish a pedestrian from a car from 100 meters away. What’s more, these frequencies see through rain and snow, provide good resolution over a wide field, and can be updated every 40 to 60 milliseconds, fast enough to keep a close eye on a changing traffic situation.
Illustrations: Andrew Zbihlyj
Research on automotive radar dates back many decades. One of the first examples was the Eureka Prometheus project, which began in 1987 and ran until 1995. It was a collaboration between the University of Munich and a number of car companies, including our own. In 1994, the project’s vehicles traveled around 1,000 km in normal traffic, mainly autonomously, on a multilane motorway near Paris. Then, in the project’s finale, they drove from Munich to Copenhagen. It was nearly a decade before autonomous-vehicle research got its next big boost, with the U.S. Defense Advanced Research Projects Agency’s Grand Challenge competitions in 2004 and 2005, during which autonomous cars faced off in the desert. Then, in 2007, came the DARPA Urban Challenge competition, followed in 2012 by the introduction of Google’s autonomous car.
The first commercial application to come out of the Prometheus project was Daimler’s Distronic adaptive cruise control, which went into production in the Mercedes S-Class in 1998. (Toyota had introduced the first commercial adaptive cruise control system the year before.) The system used one long-range radar and two shorter-range units, all of them mounted in the front of the vehicle. Daimler then developed a succession of driver-assistance systems capable of detecting hazardous situations, issuing an alert, and more recently, automatically intervening to avoid an accident.
For instance, a system Daimler calls Speed Limit Assist, which went into production in 2005, warns the driver about going too fast. Another, dubbed Pre-Safe Brake, introduced the following year, automatically applies the brakes if it determines that there’s a risk of colliding with the vehicle in front.
The next step was to extend such protection from the system’s initial sphere of application, the highway, to urban environments. That’s where the two short-range radars came in, in some Mercedes-Benz cars, in 2009. If a collision threatens, the radars of the Pre-Safe Brake system prime the brakes for immediate use. If the short-range radar determines that a crash simply cannot be avoided, the system applies the brakes some 100 ms before impact, substantially reducing damage to the car and its occupants.
Finally, in 2013, the new S-Class boasted an electronic safety “cocoon” spanning nearly 360 degrees, with both short-range radars and a stereo camera. It’s enough to protect city drivers from just about any threat, even that of a rear collision. In all these systems, drivers are always in the loop: They can overrule the system, at least up to the last fraction of a second before a crash. To make the next step to full autonomy, we have tried to do as much as possible with the kinds of radars, video cameras, and other sensors already on our S-Class cars. Such sensors, by the way, are also carried in standard production vehicles from Audi, BMW, Ford, Lexus, and Volvo.
We increased the number of sensors and improved their arrangement to achieve 360-degree coverage and to see objects in greater detail. We installed two slightly modified long-range radars from Continental Automotive Group at the sides of the front bumpers to provide early detection of vehicles coming from the left or right at intersections. One additional long-range radar monitors the traffic to the rear. Finally, we mounted four short-range radars at the corners of the vehicle. These units are entirely new, having been developed in collaboration with Delphi Automotive to provide improved coverage of the car’s immediate surroundings in crowded settings.
The car must get a precise fix on the location and direction of every object that might collide with it, particularly in dense traffic, in narrow streets, and when facing oncoming traffic. Without such information, Bertha would have hung back timidly at the entrances of roundabouts, waiting perhaps for hours to get a chance to enter.
The beams from these automotive radars are each steered electronically, so they require no rotating antennas or any other moving parts. This way, the system can point the radar in different directions and focus the beam accordingly on objects of interest. We also took advantage of today’s higher radar frequencies, which allow for finer resolution of both range and speed. We were able to put it all together to map the environment with the help of algorithms originally devised for use in laser scanners or for image processing.
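One common way to turn individual reflections into such a map is an occupancy grid, in which repeated detections accumulate evidence that a given cell contains something solid. The sketch below is a much-simplified illustration of the idea; the grid size, resolution, and log-odds update rule are our assumptions for demonstration, not the algorithms actually running in Bertha.

```python
import numpy as np

# A minimal occupancy-grid sketch: each radar reflection raises the
# log-odds that its grid cell is occupied. (A full mapper would also
# lower the odds of cells the beam passed through unobstructed.)
RES = 0.5                     # meters per cell (illustrative)
GRID = np.zeros((400, 400))   # log-odds, ~200 m x 200 m around the car
HIT = 0.6                     # log-odds increment per detection

def register_detection(x_m: float, y_m: float) -> None:
    """Fold one reflection (car-centered coordinates) into the grid."""
    i = int(y_m / RES) + GRID.shape[0] // 2
    j = int(x_m / RES) + GRID.shape[1] // 2
    if 0 <= i < GRID.shape[0] and 0 <= j < GRID.shape[1]:
        GRID[i, j] += HIT

def occupied(threshold: float = 1.5) -> np.ndarray:
    """Boolean map of cells whose accumulated evidence exceeds the threshold."""
    return GRID > threshold

# Repeated detections of, say, a guard rail at (10 m, 2 m) build up evidence:
for _ in range(3):
    register_detection(10.0, 2.0)
```

Accumulating evidence over several radar cycles, rather than trusting any single reflection, is what lets a mapper ignore the manholes and grass patches mentioned earlier while retaining persistent obstacles.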
To help the radar system distinguish people from lampposts, we took two steps. First, we increased the Doppler sensitivity of the device to the point where our system can determine which of a pedestrian’s two feet is advancing and which is stationary. Second, we improved the Kalman filtering, a method often used to interpret noisy data collected over a period of time. These two refinements are what let Bertha confidently conclude, yes, that’s a pedestrian, moving in this direction and at that speed.
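To illustrate the second refinement, here is a textbook constant-velocity Kalman filter in one dimension, of the general kind used to smooth noisy radar tracks. The state model, the noise values, and the 50-millisecond cycle are illustrative assumptions on our part, not Daimler's production parameters.

```python
import numpy as np

# Constant-velocity Kalman filter, state = [position, velocity].
# The radar measures both range and (via Doppler) radial speed,
# so the measurement matrix H is simply the identity.
DT = 0.05                                  # 50 ms update cycle (assumed)
F = np.array([[1.0, DT], [0.0, 1.0]])      # state transition
H = np.eye(2)                              # measure position and speed
Q = np.diag([0.01, 0.1])                   # process noise (illustrative)
R = np.diag([0.25, 0.04])                  # measurement noise (illustrative)

def kalman_step(x, P, z):
    """One predict/update cycle; returns the new state estimate and covariance."""
    # Predict: propagate the state and grow the uncertainty.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend in the measurement z, weighted by the Kalman gain.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P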
Of course, optical systems can usefully complement the radar. But radar alone works under all weather conditions, provides a full 360 degrees of coverage, and sees up to 200 meters ahead. The stereoscopic cameras in the front, by contrast, span only 56 degrees and can see only 40 meters ahead. Thus, radar must serve as the ultimate backup.
One case where optics work better is in keeping track of the edges of a traffic lane, so that the system can predict the car’s trajectory and keep the car within its lane. Video systems are the first choice, but when the light is glaring or the snow is blinding, the radar system must step in. Radars are color-blind, but they can get reflections from objects at the side of the road, like guard rails or even just loose pieces of gravel. To exploit these reflections and to predict the course of winding roads, we developed algorithms that match clothoids, the special curves used in many roads to avoid sudden lateral accelerations.
The most valuable lessons are those that road testing reveals. Of course, such trials have helped us solve a lot of everyday problems, but more important, they have shown us what still needs to be solved.
One example: Before Bertha's test drive, all our interest was focused on tracking moving objects, but now we know we must also deal with things that stay put. Parked cars often hide the approach of a pedestrian at the side of the road, which means the sensors don't have enough time to predict the pedestrian's movement and adjust the car's driving accordingly. Another area for improvement is extending the car's zone of awareness to include any other autonomous car that may be in front of it. When many cars drive themselves, each of them stands to benefit from what the others are detecting (and planning). Data from all sensors will be shared. Our group has already begun working out how to do that for radar.
The beauty of this project is that although full autonomy may lie well in the future, many of the steps toward that goal produce immediate payoffs. For instance, electronic stability control systems, which Mercedes-Benz introduced back in the 1990s, cut U.S. accident rates by 27 percent for cars and by 67 percent for sport-utility vehicles, according to an analysis by the U.S. National Highway Traffic Safety Administration. Sensors that help drivers react to danger are useful now, not just years from now, when they will be integrated into fully robotic cars.
For the time being, we’re not trying to supplant drivers so much as to relieve them of tedium and protect them against their all-too-human blind spots. Should the car one day become a faithful chauffeur, so much the better. Some of us, however, will always love driving and strive to do it ourselves—perhaps with a little help from time to time.
This article originally appeared in print as “Making Bertha See.”
About the Authors
Jürgen Dickmann, Nils Appenrodt, and Carsten Brenk work at Daimler, where they develop radar systems for road vehicles.