Echodyne Shows Off Its Cognitive Radar for Self-Driving Cars

Echodyne's next-generation cognitive radar system is inspired by human vision

The EchoDrive cognitive radar system (front of roof rack) is shown here installed on Echodyne's test vehicle.
Photo: Mark Harris

As a transportation technology journalist, I’ve ridden in a lot of self-driving cars, both with and without safety drivers. A key part of the experience has always been a laptop or screen showing a visualization of other road users and pedestrians, using data from one or more laser-ranging lidar sensors.

Ghostly three-dimensional shapes made of shimmering point clouds appear at the edge of the screen, and are often immediately recognizable as cars, trucks, and people.

At first glance, the screen in Echodyne’s Ford Flex SUV looks like a lidar visualization gone wrong. As we explore the suburban streets of Kirkland, Washington, blurry points and smeary lines move across the display, changing color as they go. They bear little resemblance to the vehicles and cyclists I can see out of the window.

That’s because this car is not using lidar to build up a picture of its surroundings, but a new cognitive radar system called EchoDrive, developed by Echodyne, a Bill Gates-funded startup. Ironically, I can't immediately interpret the visualization precisely because Echodyne’s radar functions more like human vision—jumping around to focus on what’s important—than like a lidar, with its global view.

Echodyne’s EchoDrive radar system visualization.
Photo: Mark Harris

Time for some Sensors 101. Lidars work by using mirrors to direct laser pulses over a wide field of view, allowing a vehicle to detect hazards through a full 360 degrees. As laser light has very short wavelengths, the spatial resolution of lidar is also excellent. However, lidar performance degrades in rain, fog, or snow, and the best units are still very expensive.

Radars are cheap, largely unaffected by weather, and able to work over long distances. But they suffer from two big problems. With their longer wavelengths, radars can struggle to resolve small features, especially at long range. And traditional radars are not easy to direct over wide scenes without bulky mechanical antennas, like the spinning radars on ships.
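
To put rough numbers on that wavelength penalty, a back-of-the-envelope diffraction estimate (beamwidth is roughly wavelength divided by aperture) makes the gap concrete. The 905-nanometer lidar, 77-gigahertz radar, and 10-centimeter apertures below are common industry values assumed for illustration, not Echodyne figures:

```python
import math

def beamwidth_deg(wavelength_m: float, aperture_m: float) -> float:
    """Diffraction-limited beamwidth: theta ~ wavelength / aperture."""
    return math.degrees(wavelength_m / aperture_m)

# A 905 nm lidar and a 77 GHz radar (wavelength ~3.9 mm), each with an
# assumed ~10 cm aperture -- illustrative values, not EchoDrive specs.
lidar_deg = beamwidth_deg(905e-9, 0.10)      # ~0.0005 degrees
radar_deg = beamwidth_deg(3e8 / 77e9, 0.10)  # ~2.2 degrees

print(f"lidar ~{lidar_deg:.4f} deg, radar ~{radar_deg:.1f} deg")
```

That gap of more than three orders of magnitude in beamwidth is why a radar return smears where a lidar point cloud stays crisp.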

Thus, existing automotive radars generally have narrow, fixed fields of view and little ability to discriminate among the objects they detect. (Radar has been implicated in several crashes involving Tesla’s Autopilot system.)

Echodyne’s innovation is to use scanning arrays, based on metamaterials whose refractive index can be tuned electronically. The radar beam can then be steered across a wide field of view to scan for large obstacles in the road ahead, or focused on a small area to help identify what it has detected. Just being able to steer and task the radar can give an order of magnitude more sensitivity at long range than existing systems, according to Tom Driscoll, Echodyne’s founder and CTO.
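
Echodyne has not published how its metamaterial elements are tuned, but the underlying beam-pointing math is standard array theory: steering the main lobe off boresight means applying a progressive phase shift across the aperture. Here is a minimal sketch under that generic model; the 24-gigahertz carrier and 16-element array are assumptions for illustration:

```python
import math

def steering_phases(n_elements: int, spacing_m: float,
                    wavelength_m: float, steer_deg: float) -> list:
    """Per-element phase (radians) that points a linear array's main
    beam steer_deg off boresight. Textbook array theory, standing in
    for Echodyne's proprietary metamaterial tuning."""
    k = 2 * math.pi / wavelength_m  # wavenumber
    sin_t = math.sin(math.radians(steer_deg))
    # Element n needs phase -k * d * n * sin(theta) so that every
    # element's wavefront adds constructively in the steered direction.
    return [-k * spacing_m * n * sin_t for n in range(n_elements)]

wl = 3e8 / 24e9                                  # assumed 24 GHz carrier
phases = steering_phases(16, wl / 2, wl, -30.0)  # look 30 degrees left
```

Because the phases are set electronically, the beam can jump anywhere in the field of view between pulses, with no moving parts.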

“The basic concept is that by pushing and pulling a four-dimensional data cube of azimuth, elevation, range, and Doppler around, you can allocate the overall bound resources of the radar where and when you need them,” he says.
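
Echodyne has not described its scheduler, but the resource juggling in Driscoll's quote can be sketched as a time budget split across interrogation tasks. Everything below (the Task fields, the task names, and the priority weighting) is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str        # what the beam is being asked to do
    az_deg: tuple    # azimuth window to interrogate, (min, max)
    el_deg: tuple    # elevation window, (min, max)
    range_m: float   # how far out to look
    priority: float  # higher priority earns more dwell time

def allocate_dwell(tasks, frame_ms):
    """Split one radar frame's time budget across tasks by priority."""
    total = sum(t.priority for t in tasks)
    return {t.name: frame_ms * t.priority / total for t in tasks}

budget = allocate_dwell(
    [Task("wide_search", (-60, 60), (-5, 15), 200.0, priority=1.0),
     Task("track_lead_car", (-3, 3), (-1, 2), 150.0, priority=2.0)],
    frame_ms=50.0,
)  # {'wide_search': 16.7, 'track_lead_car': 33.3} in milliseconds
```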

DARPA calls this concept cognitive radar, and has been working on developing it for at least a decade, mostly using high-performance (and extremely expensive) phased-array radars.

Driscoll makes an analogy with human vision. Although we feel as though we see sharply across a wide field of view, we actually have good resolution only at the very center of the eye, and the brain rapidly redirects that narrow window of sharp focus around the scene as needed.

Similarly, the EchoDrive radar can be tasked by the car’s computer to interrogate different parts of the scene. “On an open road, we might have beams tracking identified cars in front of us,” says Driscoll. “Then you could imagine coming to a T-intersection and spending more time looking left, just like a human driver does. When you reach a crosswalk, you’ll use interrogation modes specifically designed to check whether there are pedestrians.”
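
In code, that scenario-driven tasking could reduce to little more than a lookup from driving context to beam tasks. The contexts and task names below are invented to mirror Driscoll's examples, not an actual EchoDrive interface:

```python
def tasks_for_context(context: str) -> list:
    """Map a driving situation to radar interrogation tasks,
    echoing Driscoll's examples. All names are hypothetical."""
    if context == "open_road":
        return ["track_vehicles_ahead", "long_range_search"]
    if context == "t_intersection":
        return ["dwell_left_cross_traffic", "dwell_right_cross_traffic"]
    if context == "crosswalk":
        return ["pedestrian_scan"]
    return ["wide_search"]  # fall back to a general sweep
```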

EchoDrive has another trick up its sleeve. As well as reporting the azimuth (horizontal angle), elevation, and range of an object like a lidar does, radars can also detect its relative velocity, because the frequencies of returning signals are Doppler-shifted. Radar pulses might not have the spatial resolution of lidar, but micro-Doppler radar spectra can identify objects by revealing distinctive features like a runner’s moving arms, or the spinning of a bicycle wheel.
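
The velocity arithmetic itself is simple: because the signal makes a round trip, the radar relation is radial velocity equals Doppler shift times half the wavelength. A quick sketch (the 24-gigahertz carrier is an assumed figure, not a published EchoDrive parameter):

```python
def radial_velocity_mps(doppler_hz: float, carrier_hz: float = 24e9) -> float:
    """Two-way radar Doppler relation: v = f_d * wavelength / 2.
    The 24 GHz default carrier is an assumption, not an EchoDrive spec."""
    wavelength_m = 3e8 / carrier_hz
    return doppler_hz * wavelength_m / 2

# A 1,600 Hz Doppler shift at 24 GHz is about 10 m/s of closing speed.
print(radial_velocity_mps(1600.0))  # ~10.0
```

Micro-Doppler features then appear as small, time-varying sidebands around that main return, which is what makes moving arms or spinning spokes stand out.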

As our drive (which was under human control) progressed, I could begin to make out features from the visualization. Because it showed only narrow vertical slices of the scene ahead, a bridge might appear as dots (support pillars), a band (the bridge roadway), or nothing at all (for a slice looking too far up). Pedestrians and even construction cones were visible, and everything was color-coded—red for receding, blue for approaching.

A typical self-driving car might have between three and six such cognitive radar units, working alongside traditional long-range radars, cameras, and lidars, and drawing on a database of high-definition maps. Although cognitive radar might be superior to lidar in some situations, Driscoll sees it as an addition rather than a replacement for the laser units.

“My view is that these vehicles at Level 4 and 5 [capable of driving without human oversight] should have every high-end sensor you can possibly rationalize, at least until the problem’s solved,” he says. “Sensor fusion is about having enough overlap between sensors that you can stitch them together.”

Echodyne has already supplied prototype EchoDrive radars to several companies working on self-driving technologies, and Driscoll believes that a production version should sell for less than US $1,000. That’s more expensive than today’s clumsy collision radars but much cheaper than state-of-the-art lidars.

Laser companies are also experimenting with cheaper lidars enabled by metamaterials, likely destined for Level 2 and 3 driver assistance systems. Even if practical and reliable self-driving cars remain science fiction for years to come, smarter, cheaper sensors like these could make driving safer for everyone in the meantime.
