Can Israeli Start-up Oryx Oust Lidar From Self-Driving Cars?

Will a technology that splits the difference between radar and lidar provide the best range and the highest resolution at the cheapest price?

Lidar has been the best thing to happen to self-driving cars—and also the worst. Installing a bank of lasers on the roof means a car can capture millions of points of information every second, rapidly building up a 3D image of the world around it.

The problem is that until recently, lidar units were more expensive than the cars that carried them. High-performance units on Google’s early vehicles cost $70,000; devices with shorter range and a narrower field of view, from lidar pioneer Velodyne, can still cost thousands.

There is always the option of going without lidar altogether, as Tesla has done. Its cars rely on cheaper radar, video, and ultrasonic sensors. But the fatal crash of a Model S this summer while its Autopilot system was engaged shows that this solution is far from ideal.

What autonomous car makers really want is a dirt cheap and utterly reliable sensor that complements radar and video cameras. And Israeli start-up Oryx Vision thinks it might have just what they’re looking for.

Oryx’s technology, coherent optical radar, splits the difference between radar and lidar. Like lidar, it uses a laser to illuminate the road ahead; like radar, it treats the reflected signal as a wave rather than as a stream of particles.

The laser in question is a long-wave infrared laser, also called a terahertz laser because of the frequency at which it operates. Because the human eye cannot focus light at this wavelength onto the retina, Oryx can safely use higher power levels than lidar designers can. Long-wave infrared light is also poorly absorbed by water and represents only a tiny fraction of the solar radiation that reaches Earth. This means the system should not be blinded by fog or direct sunlight, as lidar systems and cameras can be.
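To see why a long-wave infrared laser earns the "terahertz" label, it helps to convert wavelength to frequency. The article does not give Oryx's exact wavelength, so the 10-micrometer figure below is an assumption — a typical value in the long-wave infrared band — used purely for illustration:

```python
# Wavelength-to-frequency conversion for a long-wave infrared laser.
# The 10 µm wavelength is an assumed, typical long-wave IR value;
# Oryx's actual operating wavelength is not stated in the article.
C = 299_792_458  # speed of light, m/s

wavelength_m = 10e-6                 # assumed: 10 micrometers
frequency_hz = C / wavelength_m
print(f"{frequency_hz / 1e12:.0f} THz")  # ~30 THz, i.e. terahertz range
```

At roughly 30 THz, the laser sits about two orders of magnitude below visible light in frequency, which is why the eye cannot focus it onto the retina.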

One of the potential cost savings of Oryx’s technology comes from the fact that its laser does not need to be steered with mechanical mirrors or a phased array in order to capture a scene. Simple optics spread the laser beam to illuminate a wide swathe in front of the vehicle. (Oryx would not say what the system’s field of view will be, but if it is not 360 degrees like rooftop lidars, a car will need multiple Oryx units facing in different directions.)

The clever bit—and what has prevented anyone from building a terahertz lidar before now—is what happens when the light bounces back to the sensor. A second set of optics directs the incoming light onto a large number of microscopic rectifying nanoantennas. These are what Oryx’s co-founder, David Ben-Bassat, has spent the past six years developing.

Incoming light creates an AC response in the antenna that is rectified—in other words, converted into a DC signal. Rani Wellingstein, Oryx’s other founder, says the system has a million times the sensitivity of traditional lidar. Because the antennas treat incoming light as a wave, they can also detect Doppler shift—the change in frequency due to the relative motion of whatever it bounced off—and thus determine the velocities of other objects in or near the roadway.
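Because the sensor is coherent, it can recover a target's radial speed directly from the Doppler shift of the returned wave. For light reflected off an object moving at radial speed v (with v much smaller than c), the round-trip shift is approximately 2v/λ. The sketch below shows the relationship using the same assumed 10-micrometer wavelength; the numbers are illustrative, not Oryx specifications:

```python
# Sketch: inferring target velocity from the Doppler shift of a
# reflected coherent signal. Round-trip shift ≈ 2 * v / wavelength
# (valid for v << c). Wavelength is an assumed long-wave IR value.
wavelength_m = 10e-6  # assumed: 10 micrometers

def doppler_shift_hz(radial_speed_m_s: float) -> float:
    """Round-trip Doppler shift for a target closing at the given speed."""
    return 2 * radial_speed_m_s / wavelength_m

def radial_speed_m_s(shift_hz: float) -> float:
    """Invert the measured shift to recover the target's radial speed."""
    return shift_hz * wavelength_m / 2

# A vehicle closing at 30 m/s (~108 km/h):
shift = doppler_shift_hz(30.0)
print(f"{shift / 1e6:.1f} MHz")              # 6.0 MHz shift
print(f"{radial_speed_m_s(shift):.1f} m/s")  # 30.0 m/s recovered
```

An incoherent detector, which measures only intensity, has no access to this frequency information — which is why treating the return as a wave buys Oryx velocity measurement for free.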

Each nanoantenna covers just 5 square micrometers; they will ultimately be fabricated directly onto integrated circuits using a thin-film chip manufacturing process. This will make it fairly simple for the signals to be fed into a machine learning system that can classify objects in the scene.

While Oryx has already built many millions of experimental nanoantennas, it has yet to fabricate them into an integrated circuit. Within a year, it intends to build a 300-pixel demonstration unit, then a 10,000-pixel chip containing a quarter of a million nanoantennas, and finally a 100,000-pixel, multi-million-nanoantenna device suitable for in-car use.
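The roadmap figures above are internally consistent: a 10,000-pixel chip carrying a quarter-million nanoantennas implies a fixed number of antennas per pixel, which in turn accounts for the "multi-million" count quoted for the final 100,000-pixel device. A quick back-of-envelope check:

```python
# Back-of-envelope check on the roadmap numbers quoted in the article:
# 10,000 pixels sharing 250,000 nanoantennas gives the per-pixel ratio,
# which scales to the 100,000-pixel production device.
antennas_per_pixel = 250_000 // 10_000      # 25 nanoantennas per pixel
full_chip_antennas = 100_000 * antennas_per_pixel
print(antennas_per_pixel)   # 25
print(full_chip_antennas)   # 2,500,000 — hence "multi-million"
```

Grouping many antennas per pixel is a plausible way to boost sensitivity, though the article does not say whether Oryx's ratio stays constant across the three generations.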

“Today, radars can see to 150 or 200 meters, but they don’t have enough resolution. Lidar provides great resolution but is limited in range to about 60 meters, and to as little as 30 meters in direct sunlight,” says Wellingstein. He expects Oryx’s coherent optical radar to accurately locate debris in the road at 60 meters, pedestrians at 100 meters, and motorcycles at 150 meters—significantly better than the performance of today’s sensor systems.

The Oryx setup could be cheap, too. If the company can get its (still unproven) fabrication process to scale, it thinks making millions of nanoantennas will be little harder than producing conventional semiconductor chips. The company recently announced a $17 million Series A funding round and has already been talking with autonomous vehicle companies, including Nutonomy, which recently launched a pilot self-driving taxi service in Singapore.

“The idea seems reasonable, but I'd be skeptical until they have a working prototype,” says Joe Funke, an engineer who has worked on autonomous systems at several vehicle start-ups, including Lucid Motors. “There will always be use cases for cheaper, high range sensors,” says Funke. “If nothing else, longer range enables higher operating speeds.”

But don’t write off lidar just yet. Following a $150 million investment by Ford and Chinese search giant Baidu this summer, Velodyne expects an “exponential increase in lidar sensor deployments.” The cost of the technology is still dropping, with Velodyne estimating that one of its newer small-form lidars will cost just $500 when made at scale.

Solid-state lidars are also on the horizon, with manufacturers including Quanergy, Innoluce, and another Israeli start-up, Innoviz, hinting at sub-$100 devices. Osram recently said that it could have a solid-state automotive lidar on the market as soon as 2018, with an eventual target price of under $50. Its credit card–size lidar may be able to detect pedestrians at distances up to 70 meters.

That’s not quite as good as Oryx promises, but is probably fine for all but the very fastest highway driving. The window for disrupting lidar’s grip on autonomous vehicles is closing fast.

This story was corrected to give proper details of Oryx nanoantenna integration plans.


Cars That Think

IEEE Spectrum’s blog about the sensors, software, and systems that are making cars smarter, more entertaining, and ultimately, autonomous.
