
Will Camera Startup Light Give Autonomous Vehicles Better Vision than Lidar?

Multiview-camera creator Light pivots from smart phones to smart cars


Photo: Light — A mid-sized white-and-gray car with an equipment rack mounted on top and the Light logo painted on the side.

In 2013, Rajiv Laroia and Dave Grannan started Light, a company that aimed to disrupt the tiny camera market. The ultimate goal was to provide designs, circuitry, and software to mobile device manufacturers so that smartphones could capture images of a quality rivaling those taken with bulky, expensive, professional camera lenses. But it turns out the best use of Light’s technology might not be taking better snapshots, but helping cars see better.

The technology is built around an array of inexpensive lenses of varying focal lengths combined with advanced digital signal processing. Light showcased its approach by releasing a standalone camera—the 16-lens L16—in 2017, and sold out of its initial production run; the number of units was never made public.

Part of the magic of Light’s images is the ability to select or change the point of focus after the fact. Because the multiple camera modules are set slightly apart, Light cameras can also determine a depth value for each pixel in the scene, allowing the software to create a three-dimensional map of the objects in the image.
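Light has not published its algorithms, but the depth-per-pixel claim rests on the same triangulation geometry that underlies any multiview camera: a point seen by two modules a known distance apart shifts between the images by a disparity that shrinks with distance. Here is a minimal sketch in Python with purely illustrative numbers; the function and parameters are hypothetical, not Light’s API.

```python
# Illustrative only: Light's actual multiview algorithms are unpublished.
# Classic two-camera triangulation: depth Z = f * B / d, where f is the focal
# length (in pixels), B is the baseline between camera modules (in meters),
# and d is the disparity (in pixels) of the same point in the two images.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the estimated depth in meters for one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: a 2,000-pixel focal length and a 10-centimeter baseline,
# roughly the scale of a smartphone-sized module array.
print(depth_from_disparity(focal_px=2000, baseline_m=0.10, disparity_px=4.0))  # -> 50.0 meters
```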

This ability to create a depth map meant that the consumer-targeted L16 got attention from businesses interested in things other than pretty pictures. A rental car company in Germany set up an array of cameras to inspect cars being dropped off for damage; Wayfair experimented with Light’s technology to place furniture within images of rooms. And Light CEO Grannan says the company always hoped to use computational imaging for machine vision as well as consumer cameras.

Nokia 9 PureView. Photo: Nokia

Still, Light continued to focus on consumers looking to take better pictures with small devices. And in 2019, the first cellphone camera using Light’s technology hit the market, the five-module Nokia 9 PureView. It didn’t exactly take the world by storm.

“I think our timing was bad,” Grannan says. “In 2019 smartphone sales started to shrink, the product became commoditized, and there was a shift from differentiating on quality and features to competing on price. We had a premium solution, involving extra cameras and an ASIC, and that was not going to work in that environment.”

Fortunately, thanks to SoftBank CEO Masayoshi Son, another door had recently opened.

In 2018, looking for additional capital, Light had approached the SoftBank Vision Fund about joining its Series D round of investment, a US $121 million infusion that brought total investment in the company to more than $185 million. Son suggested that, in his view, because Light’s technology could effectively allow cameras to see in three dimensions, it could challenge lidar—which uses pulses of laser light to gauge distances—in the autonomous car market.

“We were intrigued by that,” Grannan says. “We had envisioned multiple vertical markets, but auto wasn’t one of them.”

But, Grannan says, although co-founder and CTO Laroia realized it was theoretically possible to scale the company’s depth-mapping technology to the hundreds of meters of range needed by autonomous vehicles, he wasn’t sure it was practical, given the vibration and other challenges a vehicle in motion encounters. Laroia spent about three months convincing himself that the system could be calibrated and the algorithms would work when cameras were separated far enough to make the longer range possible.
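Why wider camera separation matters for range: in triangulation, a fixed pixel-matching error produces a depth error that grows roughly with the square of distance and shrinks in proportion to the baseline between cameras. The back-of-the-envelope sketch below uses illustrative numbers only, not Light’s specifications.

```python
# Illustrative back-of-the-envelope, not Light's actual design numbers.
# For triangulation, depth error scales roughly as Z^2 * dd / (f * B):
# doubling the baseline B halves the depth error at a given range Z.

def depth_error_m(range_m: float, focal_px: float, baseline_m: float,
                  disparity_error_px: float = 0.25) -> float:
    """Approximate depth error for a given pixel-matching error."""
    return (range_m ** 2) * disparity_error_px / (focal_px * baseline_m)

# Hypothetical: 4,000-pixel focal length, quarter-pixel matching error, 200-meter range.
for baseline_m in (0.1, 1.0):  # 10 cm (phone-scale) versus 1 m (vehicle-scale) separation
    print(baseline_m, depth_error_m(200.0, 4000, baseline_m))  # ~25 m vs. ~2.5 m of error
```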

With that question settled, Light began an R&D program in early 2019 to further refine its algorithms for use in autonomous vehicles. In mid-2019, with the consumer phone and camera market looking sour, the company announced that it was getting out of that business, and pivoted the entire company to focus on sensing systems for autonomous vehicles.

“We can cover a longer range—up to 1000 meters, compared with 200 or so for lidar,” says Grannan. “The systems we are building can cost a few thousand dollars instead of tens of thousands of dollars. And our systems use less power, a key feature for electric vehicles.”

At the moment, Light’s engineers are testing the first complete prototypes of the system, using an unmarked white van that they are driving around the San Francisco Bay Area. The tests involve different numbers of cameras in the array, with a variety of focal lengths, as the engineers optimize the design. Computation happens on a field-programmable gate array; Light’s engineers expect to have moved the circuitry to an ASIC by early 2021.

The prototype is still in stealth. Grannan says the company expects to unveil it and announce partnerships with autonomous vehicle developers later this year.
