Semiconductors

MIT Spinoff Building New Solid-State Lidar-on-a-Chip System

Our team at Kyber Photonics is developing a lidar chip to enable the next generation of autonomous robots and vehicles

Photonic integrated circuits for a lens-based beam-steering lidar architecture.
Image: MIT Lincoln Laboratory

To enable safe and affordable autonomous vehicles, the automotive industry needs lidar systems that are around the size of a wallet, cost one hundred dollars, and can see targets at long distances with high resolution. With the support of DARPA, our team at Kyber Photonics, in Lexington, Mass., is advancing the next generation of lidar sensors by developing a new solid-state, lidar-on-a-chip architecture that was recently demonstrated at MIT. The technology has an extremely wide field of view, a simplified control approach compared with state-of-the-art designs, and the promise to scale to millions of units via the wafer-scale fabrication methods of the integrated photonics industry.

Light detection and ranging (lidar) sensors hold great promise for allowing autonomous machines to see and navigate the world with very high precision. But current technology suffers from several drawbacks that need to be addressed before widespread adoption can occur. Lidar sensors provide spatial information by scanning an optical beam, typically in the wavelength range between 850 and 1550 nm, and using the reflected optical signals to build a three-dimensional map of an area of interest. They complement cameras and radar by providing high resolution and unambiguous ranging and velocity information under both daytime and nighttime conditions.
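At its core, lidar ranging converts a measured optical delay into distance. A minimal illustration of that round-trip arithmetic, with assumed example numbers:

```python
# Lidar ranging at its simplest: distance from the round-trip time of a
# reflected signal (illustrative numbers, not a specific sensor design).
C = 3e8  # speed of light, m/s

def range_from_round_trip(delay_s: float) -> float:
    # The light travels to the target and back, so halve the total path.
    return C * delay_s / 2

print(range_from_round_trip(1.33e-6))  # ~199.5 m: a ~1.3 µs echo means ~200 m
```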

This type of information is critical because it allows an autonomous vehicle to detect and safely navigate around other vehicles, cyclists, pedestrians, and any potentially dangerous obstacles on the road. However, before lidar can be widely used by autonomous vehicles and robots, lidar units need to be manufacturable at large volumes, deliver high performance, and cost two orders of magnitude less than current commercial systems, which run several thousand dollars apiece. Although the industry has made major progress over the past 10 years, no lidar sensor yet meets all of these needs simultaneously.

The challenges facing the lidar industry

The application space of fully autonomous vehicles highlights the performance and development challenges facing the lidar industry. These requirements include cost on the order of US $100 per unit, ranging greater than 200 meters at 10 percent object reflectivity, a minimum field of view (FOV) of 120° horizontal by 20° vertical, 0.1° angular resolution, at least 10 frames per second with approximately 100,000 pixels per frame, and scalable manufacturing (millions of units per year).
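Taken together, the frame rate and pixel count alone imply a demanding measurement throughput. A quick sanity check using only the figures above (illustrative, not a Kyber Photonics design budget):

```python
# Point rate implied by the stated spec: 10 frames/s at ~100,000 pixels/frame.
frames_per_second = 10
pixels_per_frame = 100_000

points_per_second = frames_per_second * pixels_per_frame
print(f"{points_per_second:,} points/s")              # 1,000,000 points/s

# On average, each measurement must complete in about a microsecond.
print(f"{1e6 / points_per_second:.1f} µs per point")  # 1.0 µs
```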

Any viable lidar solution must also demonstrate multi-year reliability under the guidelines established by the Automotive Electronics Council, the global coalition of automotive electronics suppliers. While many commercial lidar systems can meet some of these criteria, none have met all of the requirements to date. Our team at Kyber Photonics, which we founded in early 2020, aims to solve these challenges with a new integrated photonic design. We anticipate that this design will meet the performance targets while leveraging the manufacturing infrastructure built for the integrated circuits industry to satisfy the volume and cost requirements.

To understand the benefits of our approach, it is first useful to survey currently available technology. To date, autonomous vehicles and systems have relied primarily on lidar that uses moving components to steer an optical beam. The most commonly used lidar designs comprise an array of lasers, optics, electronics, and detectors on a mechanically rotating stage. The need to assemble and align all of these parts leads to high costs and relatively low manufacturing volumes, and the wear and tear on the mechanical components raises questions about long-term reliability.

While these lidar systems, produced on scales of tens of thousands of units, have allowed the field of autonomous vehicles to make headway, they are not suitable for ubiquitous deployment of lidar. For these reasons, there is a strong push to eliminate mechanical components and move toward compact designs that are more reliable and can be manufactured at larger volumes and a lower unit cost.

Figure 1. Photonic integrated circuits for lidar fabricated using MIT Lincoln Laboratory’s 90-nm silicon fabrication toolset. Each die has different test designs of the planar lens solid-state beam steering technology. Image: MIT Lincoln Laboratory

Seeking a solid-state lidar-on-a-chip design

There are two concepts in lidar design that address the reliability and scalability challenges for this technology. First, “solid-state” describes a lidar sensor that has eliminated the mechanical modes of failure by using no moving parts. Second, “lidar-on-a-chip” describes a system with the lasers, electronics, detectors, and optical beam-steering mechanism all integrated onto semiconductor chips. Lidar-on-a-chip systems go beyond solid-state because these designs attempt to further reduce the size, weight, and power requirements by integrating all of the key components onto the smallest footprint possible.

This system miniaturization reduces the complexity of the assembly and enables high-volume throughput that can result in a 10- to 100-fold cost reduction at scale. This is all possible because lidar-on-a-chip architectures can fully leverage CMOS-compatible materials and wafer-scale fabrication methods established by the semiconductor industry. Once a final solution is proven, we anticipate that, just like the integrated circuits inside computers and smartphones, hundreds of millions of lidar units could be manufactured every year.

To realize the vision of ubiquitous lidar, there have been several emerging approaches to solid-state lidar and lidar-on-a-chip design. Some of the key approaches include:

Microelectromechanical Systems (MEMS)

  • This approach uses an assembly of optics and a millimeter-scale deflecting mirror for free-space beam steering of a laser or laser array. Although MEMS mirrors are fabricated using semiconductor wafer manufacturing, the architecture itself is not a fully integrated lidar-on-a-chip system, since optical alignment is still needed between the lasers, MEMS mirrors, and optics.
  • One of the trade-offs in this design is between scanning speed and maximum steering angle for a given aperture (mirror) size, which directly relates to detection range. This trade-off exists because a MEMS mirror acts like a mass on a spring: as its size and mass increase, its resonant frequency decreases, which limits the scanning speed (see the sketch below).
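A minimal sketch of that spring-mass intuition, using assumed stiffness and mass values rather than measured MEMS parameters:

```python
import math

def resonant_frequency_hz(stiffness_n_per_m: float, mass_kg: float) -> float:
    # Simple harmonic oscillator: f0 = (1/2π)·sqrt(k/m)
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Illustrative values only: at constant thickness, doubling a mirror's
# diameter roughly quadruples its mass, which halves f0 and thus the
# achievable scan rate.
k = 1.0  # effective stiffness, N/m (assumed)
for scale in (1, 2, 4):
    m = 1e-9 * scale**2  # mass grows ~ diameter^2 at constant thickness
    print(f"diameter x{scale}: f0 ≈ {resonant_frequency_hz(k, m) / 1e3:.1f} kHz")
```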

Liquid-Crystal Metasurfaces (LCM)

  • This architecture is similar to a MEMS system, but with the MEMS mirror replaced by a liquid-crystal metasurface (LCM). LCMs use an array of subwavelength scatterers mixed with liquid crystals to impart a tunable phase front onto a reflected laser beam, which allows for beam steering.
  • LCM designs have large optical apertures and a wide FOV, but current modules are limited to 150-meter ranging. Like MEMS lidar systems, the LCM steering mechanism is built using a semiconductor wafer manufacturing process, but the system is not fully on-chip, since the laser input needs to be aligned at an angle to the surface of the LCM.

Optical Phased Arrays (OPA)

  • This approach is compatible with low-cost, high-volume manufacturing of the optical portion of the system. It uses an array of optical antennas, each requiring electronic phase control, to generate constructive interference patterns that form a steerable optical beam. However, as the size of the emitting aperture increases, the electronic complexity increases and limits scaling, since thousands of active phase shifters must be controlled simultaneously.
  • The size of optical waveguides places practical fabrication constraints on achieving the ideal antenna spacing of half a wavelength; the wider spacing degrades the optical beam through an effect called aliasing (the appearance of grating lobes), which limits the usable horizontal FOV (a rough estimate of this limit is sketched below). More recent designs use aperiodic waveguide spacing and different waveguide cross-sections to solve these issues, but each creates trade-offs, such as limitations on vertical scanning or a reduction of optical power in the main beam.
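The textbook grating-lobe estimate below illustrates how antenna pitch constrains an OPA's usable FOV; the pitch values are assumptions for illustration, not figures from any specific OPA design:

```python
import math

def alias_free_fov_deg(wavelength_um: float, pitch_um: float) -> float:
    # Grating-lobe-free field of view of a uniformly spaced phased array,
    # approximately 2·asin(λ / 2d); a pitch d ≤ λ/2 avoids lobes entirely.
    s = wavelength_um / (2 * pitch_um)
    return 180.0 if s >= 1 else 2 * math.degrees(math.asin(s))

wavelength = 1.55  # µm, within the lidar band mentioned above
for pitch in (0.775, 1.0, 2.0):  # µm; 0.775 µm is exactly λ/2
    print(f"pitch {pitch:.3f} µm -> alias-free FOV ≈ "
          f"{alias_free_fov_deg(wavelength, pitch):.1f}°")
```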

Each one of these approaches has shown an improvement over conventional mechanical-based lidar in various metrics such as range, reliability, and cost, but it is still unclear whether they can hit all of the requirements for fully autonomous vehicles. 

A novel beam-steering lidar architecture

Recognizing these challenges, the Photonic and Modern Electro-Magnetics Group at MIT and MIT Lincoln Laboratory researchers collaborated to develop an alternative solution for solid-state optical beam steering that could solve the remaining challenges for lidar. A point of inspiration was the Rotman lens, invented in the 1960s to enable passive beam-forming networks in the microwave regime without the need for active electronic phase control. The Rotman lens receives microwave radiation from multiple inputs along an outer focal arc of the lens. The lens spreads the microwaves along an inner focal arc, where an array of delay lines imparts a phase shift to the microwaves to form a collimated beam.

While much of the intuition from the microwave regime translates to the optical and near-infrared domain, the length scales and available materials require a different lens design and system architecture. Over the past three years, the MIT-MIT Lincoln Laboratory team has designed, fabricated, and experimentally demonstrated a new solid-state beam steering architecture that works in the near-infrared.

Figure 2. Schematic of the planar lens-based beam steering architecture: A near-infrared laser is fiber-coupled onto the chip. The laser light is routed through an optical switch matrix formed by a tree of Mach-Zehnder Interferometer (MZI) switches. The light is then fed into a planar lens, which both collimates and steers the light before it is scattered out of the plane of the chip by a grating. Image: MIT, Dr. Scott Skirlo

This novel beam steering architecture includes an on-chip planar lens for steering in the horizontal direction and a grating for scanning in the vertical direction, as shown in Figure 2. The laser light is edge-coupled into an optical waveguide, which guides and confines light along a desired optical path. This initial waveguide feeds into a switch matrix that expands into a tree pattern. The switch matrix can comprise hundreds of photonic waveguides, yet requires only about 10 active phase shifters to route the light into the desired waveguide. Each waveguide provides a distinct path that feeds into the planar lens and corresponds to one beam in free space. The planar lens consists of a stack of CMOS-compatible materials tens of nanometers thick and is designed to collimate the light fed in by each waveguide. Collimation creates a flat wavefront that minimizes spreading and therefore reduces loss as the beam propagates over long distances.
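To see why a tree of switches needs so few active elements, note that routing to one of N outputs requires setting just one switch per tree level, or log₂(N) settings in total. A minimal sketch of that routing logic (binary bar/cross states are an idealization; real MZI switches are driven with calibrated phase-shifter voltages):

```python
import math

def switch_settings(port: int, num_ports: int) -> list[int]:
    # One 1x2 switch per level lies on the active path; the bits of the
    # port index (MSB first) choose left/right at each level of the tree.
    depth = math.ceil(math.log2(num_ports))
    return [(port >> (depth - 1 - level)) & 1 for level in range(depth)]

# Routing to port 701 of a 1,024-port tree touches only 10 switches,
# one per level -- the "about 10 active phase shifters" noted above.
print(switch_settings(701, 1024))  # [1, 0, 1, 0, 1, 1, 1, 1, 0, 1]
```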

Since each waveguide is designed to feed the lens at a unique angle, selecting a particular waveguide makes the light enter the lens and exit at a specific angle, as seen in Figure 3. The system is designed such that the beams emitted from adjacent waveguides overlap in the far field. In this way, the entire horizontal field of view is covered by beams emitted from the discrete set of waveguides. For instance, a 100° FOV lidar system with 0.1° resolution can be designed using 1,000 equally spaced waveguides along the focal arc of the lens and a sufficiently large aperture to generate the required beam width and overlap. This approach allows the design to meet the FOV and resolution requirements of lidar applications.
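The sizing arithmetic is straightforward; here is a small helper reproducing the 100°/0.1° example from the text (illustrative sizing, not the dimensions of the fabricated chips):

```python
import math

def lens_sizing(fov_deg: float, resolution_deg: float) -> tuple[int, int]:
    # Waveguides needed to tile the horizontal FOV at the given resolution,
    # plus the depth of the binary switch tree that addresses them.
    num_waveguides = round(fov_deg / resolution_deg)
    tree_depth = math.ceil(math.log2(num_waveguides))
    return num_waveguides, tree_depth

print(lens_sizing(100.0, 0.1))  # (1000, 10)
```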

Figure 3. Electromagnetic simulations of the planar lens-based beam steering design. The simulations are two-dimensional cross-sections in the plane of the chip. (a) A waveguide feeds the lens at normal incidence, forming a collimated beam that is emitted on-axis. (b) A waveguide feeds the lens off-axis, forming a collimated beam that is steered at an angle. Image: MIT, Dr. Scott Skirlo

To control the vertical angle of the emitted beam, our design uses a wavelength selective component called a grating coupler. The grating is a periodic structure composed of straight or curved rulings that repeat every few hundred nanometers. The interface of each repeated ruling causes light to diffract and thus scatter out of the plane of the chip. Since the emission angle of the grating depends on the wavelength of light, tuning the laser frequency controls the angle at which the beam is emitted into free space. As a result, selecting the waveguide via the switch matrix steers the beam in the plane of the chip (horizontally), and tuning the wavelength of the laser steers the beam out of the plane of the chip (vertically).
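For intuition, the first-order grating equation for emission into air, sin(θ) = n_eff − λ/Λ, captures this wavelength dependence. The effective index and grating period below are assumed illustrative values, not the fabricated design:

```python
import math

def emission_angle_deg(wavelength_um: float, n_eff: float, period_um: float) -> float:
    # First-order grating coupler emitting into air:
    # sin(θ) = n_eff − λ/Λ, with θ measured from the chip normal.
    return math.degrees(math.asin(n_eff - wavelength_um / period_um))

n_eff, period = 1.9, 1.0  # effective index and grating period Λ (assumed)
for wl_um in (1.50, 1.55, 1.60):  # µm, spanning a typical tuning range
    print(f"λ = {wl_um:.2f} µm -> θ ≈ {emission_angle_deg(wl_um, n_eff, period):.1f}°")
# ~6° of vertical steering per 100 nm of tuning, for these assumed values
```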

An alternative to time-of-flight detection

How does our planar lens architecture address the key remaining challenges of optical beam steering for lidar applications? Like optical phased arrays (OPAs), the architecture takes advantage of CMOS-compatible fabrication techniques and materials used in recently established photonic foundries. Compared with OPAs, our lens-based design can have up to twice the horizontal FOV and reduces the electronic complexity by two orders of magnitude, because we do not need hundreds of simultaneously active phase shifters.

Since the number of active phase shifters scales as the binary logarithm (log₂) of the number of waveguides, our system requires approximately 10 phase shifters to select the waveguide input and steer the optical beam. To enable a full lidar system, the lens-based approach can also take advantage of lasers and detection schemes similar to those demonstrated with OPAs and other solid-state beam steering approaches. Combining all of these benefits enables a lidar-on-a-chip design that can lead to a low-cost sensor producible at high manufacturing volumes.
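A one-line comparison makes the two-orders-of-magnitude claim concrete (the array sizes here are illustrative):

```python
import math

# Active elements needed to address N beams: roughly one phase shifter per
# antenna for an OPA versus one active switch per tree level for our design.
for n in (256, 1024):
    print(f"N = {n}: OPA ~{n} shifters, switch tree ~{math.ceil(math.log2(n))}")
```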

To address the range requirement of 200 meters, we are designing our system for a coherent detection scheme called frequency modulated continuous wave (FMCW) rather than the more traditionally used time-of-flight (ToF) detection. FMCW has advantages in terms of the instantaneous velocity information it can provide and the lower susceptibility to interference from other light sources. Note that FMCW has been successfully demonstrated by several lidar companies with ranges beyond 300 meters, though those companies have yet to fully prove that their systems can be produced at low cost and high volumes.

Rather than emitting pulses of light, FMCW lidars emit a continuous stream of light from a frequency-modulated laser. Before being transmitted, a small fraction of the beam is split off, kept on the chip, and recombined with the main beam after it has been transmitted, reflected by an object, and received. Because the laser light is coherent, these two beams constructively and destructively interfere at the detector, generating a beat-frequency waveform that is detected electronically and contains both ranging and velocity information. Since a beat signal is generated only by coherent light sources, FMCW lidars are less sensitive to interference from sunlight and other lidars. Moreover, the detection process also suppresses noise from other frequency bands, giving FMCW a sensitivity advantage over ToF.
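The textbook arithmetic for a triangular (up/down) chirp shows how one measurement yields both range and velocity; the chirp slope and target values below are assumptions for illustration, not Kyber Photonics system parameters:

```python
C = 3e8                # speed of light, m/s
WAVELENGTH = 1.55e-6   # laser wavelength, m
CHIRP_SLOPE = 1e14     # Hz/s (e.g., a 1-GHz sweep over 10 µs; assumed)

def beat_frequencies(range_m: float, velocity_mps: float) -> tuple[float, float]:
    # Up- and down-chirp beat frequencies for a target at range_m,
    # closing at velocity_mps (positive = approaching).
    f_range = 2 * range_m * CHIRP_SLOPE / C    # beat from round-trip delay
    f_doppler = 2 * velocity_mps / WAVELENGTH  # Doppler shift
    return f_range - f_doppler, f_range + f_doppler

def solve(f_up: float, f_down: float) -> tuple[float, float]:
    # Recover range and velocity from the two measured beats.
    f_range = (f_up + f_down) / 2
    f_doppler = (f_down - f_up) / 2
    return f_range * C / (2 * CHIRP_SLOPE), f_doppler * WAVELENGTH / 2

f_up, f_down = beat_frequencies(200.0, 30.0)  # 200-m target closing at 30 m/s
print(solve(f_up, f_down))                    # -> (200.0, 30.0)
```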

Figure 4. Optical images of the lidar-on-a-chip elements: (a) optical switches, (b) lens and grating, and (c) 32-port beam-steering design with a switch tree, lens, and grating. Image: MIT Lincoln Laboratory

Proof of concept and next steps

To demonstrate the viability of the optical beam steering, a proof-of-concept chip was fabricated using the silicon nitride integrated photonic platform at MIT Lincoln Laboratory. Optical microscope images of the fabricated chips are shown in Figure 4. The planar lens beam steering was tested using a fiber-coupled tunable laser with a wavelength range of 1500 to 1600 nm. This testing confirmed that the chip was capable of providing a 40° (horizontal) by 12° (vertical) FOV, as seen in Figure 5. Recently, we have improved our designs to enable a 160° (horizontal) by 20° (vertical) FOV so that we can better address the needs of lidar applications. These fabricated chips are undergoing testing to verify the expected beam-steering performance.

Figure 5. Animation of compiled infrared images showing the operation of the planar lens beam-steering chip: (a) beam steering via port selection and (b) beam steering via wavelength tuning. Captured with an infrared camera. Images: MIT Lincoln Laboratory, Josué J. López

As Kyber Photonics moves forward with development, we are expanding from optical beam-steering demonstrations into the engineering of a complete lidar-on-a-chip system. The planar lens architecture will be tested in combination with the necessary frequency-modulated laser, detectors, and electronics to demonstrate ranging using FMCW detection.

Kyber Photonics will start by building a compact demonstration that uses a separately packaged laser that is fiber-coupled onto our chip and then move towards an on-chip laser solution. Many photonic foundries are adding lasers to their process development kits, so we plan to integrate and refine the on-chip laser solution at different stages of product development.

Subsequent steps will also include integrating the necessary application-specific integrated circuits to handle the photonic drivers, laser modulation, and receive-side signal processing. Our end goal is that within the next two to three years, Kyber Photonics will have a lidar-on-a-chip unit that is the size of a wallet and has a clear production path to meeting the demands of low cost, high reliability, high performance, and scalability for the autonomous vehicle industry and beyond.

About the Authors

Josué J. López is CEO and co-founder of Kyber Photonics and a 2020 Fellow of Activate, a two-year fellowship supported by DARPA that helps scientists bring their innovations to the marketplace.

Thomas Mahony is CTO and co-founder of Kyber Photonics and a 2020 Activate Fellow. 

Samuel Kim is co-founder of Kyber Photonics.

The authors would like to acknowledge key inventors and contributors to the technology described in this article. They include members of the Soljačić Group at MIT: Dr. Scott Skirlo, Jamison Sloan, and Professor Marin Soljačić; and MIT Lincoln Laboratory researchers: Dr. Cheryl Sorace-Agaskar, Dr. Paul Juodawlkis, and members of the Advanced Technology Division at Lincoln Laboratory.