A Little Light Magic

Circuit designers are playing optical tricks to make nanoscale structures on ICs


In the electronics industry, Moore's Law has joined death and taxes as one of the certainties of life. This rule--that the number of transistors that can be put on an integrated circuit will double every couple of years or so--is fundamental to the ubiquity of ICs. They're already responsible for toasting our bread, handling our airplane reservations, and predicting our weather.

If Moore's Law continues to hold, they will someday make our beds, translate our speech on the fly, and guide our surgeons' hands through delicate operations. But maintaining this frenzied pace means that semiconductor manufacturers are constantly in search of new ways to make transistors ever smaller.

Picture this

ICs are made using optical lithography, a process similar to photographic printing, in which the patterns that will become layers of an integrated circuit are exposed on a semiconductor wafer, one layer at a time [see illustration (PDF)]. As early as the 1980s, experts were already predicting the demise of optical lithography because the wavelength of the light used to project the circuit image onto the silicon wafer was too large to resolve the ever-shrinking details of each new generation of ICs.

But semiconductor equipment makers kept optical lithography alive by steadily decreasing the light wavelength from 365 nm in the 1980s down to 193 nm in the most advanced systems today. These systems are producing ICs with hundreds of millions of transistors whose smallest features measure about 100 nm.

Now, at first glance, it appears that optical lithography has hit another dead end. Wavelengths shorter than 193 nm can't be used without a drastic redesign of lithographic systems because the shorter wavelengths are simply absorbed by the quartz lenses that direct the light onto the wafer.

So is this the end? Not quite. There are a few more tricks that IC manufacturers can play. Lumped together, they are called resolution enhancement techniques (RETs), and in one way or another, they all coax the light into resolving shapes much smaller than its wavelength. The main techniques are optical proximity correction, phase-shifting masks, and modified, or "off-axis," illumination.

With these tricks, the existing--and already paid for--optical lithography equipment can create patterns much smaller than the wavelength of the light used to produce them, an accomplishment that would have seemed impossible a scant 10 years ago. In fact, Steve Brueck and his group at the University of New Mexico (Albuquerque) recently showed that these techniques can be easily extended to one-eighth the wavelength of the light--that is, to less than 25 nm for 193-nm light.

Using these techniques, scientists at the Massachusetts Institute of Technology's (MIT's) Lincoln Laboratory, in Lexington, have built prototype transistors with a gate length of only 9 nm--smaller than the smallest virus. Other tricks can come into play as well. For example, by immersing the focusing lenses in a liquid with an index of refraction greater than that of air, optical lithography may be extended indefinitely. As Brueck has succinctly put it, "There is no fundamental limit to optical lithography." Indeed, RETs have been making advanced optical lithography possible ever since the 180-nm generation of devices arrived in 1999.

The shorter-wavelength alternatives

Lithography experts are already pursuing an alternative: reducing the wavelength of the light below 193 nm. A technique that uses 157-nm light is on the International Technology Roadmap for Semiconductors, published in 2001 by the Semiconductor Industry Association (San Jose, Calif.). The 157-nm light is intended as a tool for ICs with 65-nm features and smaller, due to enter manufacturing in 2007. But its future has been thrown into doubt by the announcement in May by Intel Corp. (Santa Clara, Calif.) that the company plans to get to the 45-nm generation not with 157-nm equipment but rather by continuing to use 193 nm in combination with resolution enhancement.

Another approach is to use soft X-rays, also called extreme ultraviolet (EUV), with a wavelength of 13 nm. EUV needs hardly any resolution enhancement. Unfortunately, this approach trades refractive lenses for relatively complex and experimental reflective lenses, and it's at least seven years away from commercial use. So IC manufacturers will likely continue to use 193-nm systems for the foreseeable future, making resolution enhancement techniques absolutely critical to future progress in semiconductor technology.

To understand how resolution enhancement techniques work, step back and take a fresh look at basic optical lithography [see illustration (PDF) again].

Fundamentally, an IC lithography tool is an astoundingly high-quality projector. Light from a source is collected by a set of mirrors and light pipes, called an illuminator, which also shapes the light, giving it a desired spatial coherence and intensity (usually uniform within 2 percent across the beam) over a set range of angles of incidence as it falls on a mask. The mask is a quartz plate onto which a pattern of chrome has been deposited. (As many as 30 masks are used to produce the various layers of an IC.) The pattern creates dark and light regions that correspond to physical elements--typically parts of transistors or connections between them--to be reproduced on each chip on the wafer.

A large lens collects the light transmitted through the mask and focuses it on the wafer to create a scaled-down image, typically one-quarter the size of the mask pattern. The sine of the largest angle incident on the wafer is called the numerical aperture (NA) of the lens. The aperture is similar to the f-stop in photography, in that a larger aperture allows higher resolution but reduces the depth of focus--like the depth of field in photography.

To produce each layer of the IC, the wafer is coated with a photosensitive polymer, called a photoresist--typically a material that becomes soluble when exposed to light. After the exposed, now-soluble photoresist is rinsed away, the remaining polymer delineates the mask pattern on the surface of the wafer. The parts of the wafer still covered by photoresist are protected from the next processing step--which could be etching, dielectric or metal deposition, or ion implantation--used to produce the desired electrical characteristics in the silicon. In this way, the transistors and wires of an integrated circuit are assembled layer by layer.

The limits of optical imaging were described over a century ago by Ernst Abbe, a professor at the University of Jena, Germany. He found that the narrowest lines and spaces of a repetitive pattern that an optical system can reproduce are directly proportional to the wavelength of the light that projects the image and inversely proportional to the system's numerical aperture. Thus, the smaller the wavelength and the larger the NA of the lens, the narrower the lines. Abbe's work was augmented by the English physicist John William Strutt, the third Baron Rayleigh, and the relationship is now known as one of the Rayleigh imaging equations.

The value of the proportionality constant, called k1, is determined by many factors, including the coherence of the illumination, antireflective coatings that may be present above or below the photoresist, and the parameters of the resist itself (such as baking times and diffusion characteristics). A process with a k1 of 0.8 is considered "easy" to accomplish; a process with k1 smaller than 0.5 is very hard, requiring extensive application of RETs.
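The Rayleigh relationship described above--minimum half-pitch equals k1 times the wavelength divided by the numerical aperture--can be sketched numerically. The NA and k1 values below are illustrative assumptions chosen to match the regimes the article describes, not figures from any particular production line.

```python
# Rayleigh imaging equation: minimum half-pitch = k1 * wavelength / NA.
# The NA and k1 values are illustrative assumptions.

def min_half_pitch(wavelength_nm: float, na: float, k1: float) -> float:
    """Smallest printable half-pitch for a given process difficulty factor k1."""
    return k1 * wavelength_nm / na

# An "easy" process (k1 = 0.8) on a 193-nm, NA = 0.63 system:
easy = min_half_pitch(193, 0.63, 0.8)   # about 245 nm

# A hard, RET-heavy process (k1 = 0.4) on a newer NA = 0.85 lens:
hard = min_half_pitch(193, 0.85, 0.4)   # about 91 nm

print(f"easy: {easy:.0f} nm, hard: {hard:.0f} nm")
```

The same 193-nm light thus spans more than a factor of two in printable feature size, depending almost entirely on how far k1 can be pushed down with resolution enhancement.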

The difficulty comes down to two basic metrics: the depth of focus of the lens system and the exposure latitude, which is the maximum variation in the amount of light that strikes the wafer without changing the position of an edge formed in the image. Good exposure latitude occurs when the optical system is able to create images with very high contrast.

A depth of focus greater than 500 nm is highly desired, since wafers aren't perfectly flat and the photoresist usually has a thickness of several hundred nanometers. However, the depth of focus is inversely proportional to the square of the numerical aperture, so increasing the aperture to improve resolution quickly decreases the depth of focus. The combination of focus and exposure allowances is generally called the process window.
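The inverse-square dependence of depth of focus on numerical aperture can be made concrete with the companion Rayleigh expression, DOF = k2 * wavelength / NA^2. The process factor k2 = 1 here is an illustrative assumption.

```python
# Rayleigh depth of focus: DOF = k2 * wavelength / NA^2.
# k2 is process dependent; k2 = 1 is an illustrative assumption.

def depth_of_focus(wavelength_nm: float, na: float, k2: float = 1.0) -> float:
    return k2 * wavelength_nm / na**2

dof_low_na = depth_of_focus(193, 0.63)   # about 486 nm
dof_high_na = depth_of_focus(193, 0.85)  # about 267 nm

print(f"NA 0.63: {dof_low_na:.0f} nm, NA 0.85: {dof_high_na:.0f} nm")
```

Raising the aperture from 0.63 to 0.85 buys resolution but cuts the focus budget nearly in half, which is why aperture increases alone cannot carry the technology forward.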

Traditional strategies to improve resolution in optical lithography have been to change the major variables: make the wavelength smaller, as discussed earlier, and increase the numerical aperture, at the expense of a smaller depth of focus. Some of the earliest optical exposure systems used one of the emission lines of mercury vapor, first at 436 nm and later at 365 nm. Today, typical semiconductor manufacturing processes use 248-nm laser light emitted by a krypton-fluoride laser. More advanced processes use 193-nm light from argon-fluoride lasers, and a laser with fluorine alone can emit 157-nm light.

To increase the numerical aperture for better resolution, lens designers have gone back to the drawing board and are creating lenses with larger diameters and more lens elements to collect more of the light emerging from the mask. When introduced in 1977, the first reduction-system lenses had a numerical aperture of 0.2. Ten years later, the typical value was 0.45, and now 0.63 is common. The newest lithography systems to be announced have apertures of 0.85, and rumors abound that 0.95 may soon be achieved in a commercial system.

A numerical aperture at the theoretical maximum of 1 is possible only with an infinitely large lens, collecting 100 percent of the light from the mask. But lithography companies are now exploiting a technique long used by microscopists: filling the space between the lens and the wafer with a fluid with a high index of refraction. (The index of refraction of a material is the ratio of the speed of light in a vacuum to that in the material.)

The larger refractive index lets developers attain the seemingly impossible: an effective numerical aperture greater than 1, something no air-based system can achieve. Recent demonstrations of immersion lithography at Nikon Corp. (Tokyo) have achieved 65-nm imaging using a 193-nm system with immersion in water (refractive index 1.44 at 193 nm), which gives an air-equivalent aperture of 1.25.
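The immersion arithmetic follows from the definition NA = n * sin(theta), where theta is the marginal-ray half-angle. The angle below is simply back-calculated from the figures quoted above (water index 1.44, effective NA 1.25); it is an illustration, not a measured lens parameter.

```python
import math

# Numerical aperture with immersion: NA = n * sin(theta_max).
# Marginal-ray angle back-calculated from the article's figures.

n_water = 1.44           # refractive index of water at 193 nm
na_effective = 1.25      # air-equivalent aperture quoted for the Nikon demo

sin_theta = na_effective / n_water                # about 0.868
theta_deg = math.degrees(math.asin(sin_theta))    # about 60 degrees

# The same marginal ray in air (n = 1) would give NA < 1:
na_in_air = 1.0 * sin_theta

print(f"half-angle {theta_deg:.1f} deg, NA in air {na_in_air:.2f}")
```

In other words, immersion does not bend light to steeper angles inside the lens; it raises the index of the final medium so the same cone of light carries a larger numerical aperture.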

To achieve smaller wavelengths and higher numerical apertures, the imaging hardware has to be redesigned and the old equipment reengineered. When lithographic tools cost only US $500 000, it was easier to contemplate replacing them. But the current price runs to over $15 million. So there's a strong economic incentive to make the installed equipment do the best it can by reducing k1 using resolution enhancement, while the wavelength and numerical aperture are unchanged.

To see how resolution enhancement techniques can reduce k1, consider the basic properties of any electromagnetic wave: its amplitude, its phase, and its propagation direction. These parameters are three "handles" for manipulating and improving the imaging process. Each of the three main approaches controls one of them at the mask, in a process known as wavefront engineering: optical proximity correction targets the wavefront's amplitude, phase-shift masks its phase, and off-axis illumination its direction.

The natural place to do wavefront engineering in a lithography system is at the mask because it is an element that is changed every time a different layer is printed, while the rest of the lithographic system is kept constant. Complicated software programs alter the IC design data to counteract the distortions of the image as the light passes through the mask and strikes the wafer. The modified data is then used to make the mask.

Optical proximity correction can counteract many of the imaging effects that now distort patterns on the wafer. One typical effect is a variation in the dimensions of identical features depending on whether they are near many other structures or are in a relatively isolated part of the IC. Lines on the wafer may be shorter than those on the mask, because more light strikes the photoresist at the ends of the lines. On the other hand, the inner corners of L-shaped structures become wider, because less light strikes there.

To counteract this effect, the amplitude of the lightwave is adjusted by changing the sizes of the openings in the mask, allowing more or less light through. Small jogs, or local changes, are made along the length of the line and at its ends to increase or reduce the exposure [see ”Squaring the Corners” (PDF)]. Adding subresolution features--very small features whose dimensions are below the resolution of the imaging system, also known as assist features--near an isolated feature can change the shape of the light pattern to match that of densely packed features.
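The simplest form of this correction can be caricatured as a geometric rule: because line ends print short, pre-extend each end on the mask. The rectangle representation and the 20-nm bias below are invented for illustration; production OPC is model based, driven by optical simulation rather than fixed rules.

```python
# Toy rule-based OPC sketch: pre-bias a wire's line ends outward to
# compensate for line-end shortening during exposure.
# Coordinates in nm; the 20-nm bias is an illustrative assumption.

def bias_line_ends(x0: float, y0: float, x1: float, y1: float,
                   end_bias: float = 20.0):
    """Extend a horizontal wire at both ends; returns the mask rectangle."""
    return (x0 - end_bias, y0, x1 + end_bias, y1)

# A 500-nm-long wire is drawn 540 nm long on the mask, so that the
# printed image comes out close to the intended 500 nm.
mask_rect = bias_line_ends(0, 0, 500, 100)
print(mask_rect)
```

Real OPC software applies thousands of such local edits--jogs, hammerheads, serifs, assist features--each sized from a simulation of the exposure rather than a single fixed number.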

Another effect occurs in regions of the wafer that are to be covered by a regular array of lines. The chrome pattern on the mask acts like a diffraction grating--an array of fine, parallel, equally spaced grooves that concentrates reflected or transmitted light in discrete directions. Thus, the light diffracted by one opening in the mask pattern creates a series of images on the wafer that overlap the images from neighboring openings. This effect can be reduced by phase shifting: changing the phase of the light coming from two adjacent lines on the mask so that they're out of phase and cancel each other in regions where no light is wanted.

Phase changes can be created by etching away areas of the quartz so that the mask has different thicknesses in various places [see "Boosting Image Contrast" (PDF)]. If just enough quartz is etched away from one of two neighboring openings in the chrome pattern, light passing through the two openings will be out of phase by 180 degrees. When the images of these openings are projected onto the wafer, dark interference bands occur where the light from one slit is out of phase with the light from the other. The bands can increase the contrast of the final image, making for sharper edges in the photoresist.
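The cancellation at the heart of phase shifting is ordinary two-beam interference, which takes only a few lines to demonstrate. Unit field amplitudes at a point equidistant from the two openings are an illustrative assumption.

```python
import cmath

# Why a 180-degree phase shift darkens the gap between two openings:
# at a point equidistant from both, the two fields add coherently.
# Unit amplitudes are an illustrative assumption.

def midpoint_intensity(phase_shift_deg: float) -> float:
    field_1 = cmath.exp(0j)                                      # opening 1
    field_2 = cmath.exp(1j * cmath.pi * phase_shift_deg / 180)   # opening 2
    return abs(field_1 + field_2) ** 2

conventional = midpoint_intensity(0)     # bright between the openings
phase_shifted = midpoint_intensity(180)  # dark interference band

print(f"conventional: {conventional:.2f}, phase-shifted: {phase_shifted:.2e}")
```

With no phase shift the fields reinforce, washing the two openings together; with a 180-degree shift the midpoint goes dark, and the resist sees two cleanly separated bright features.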

The most common phase-shift mask is the attenuated phase-shift mask, in which some light is allowed to pass through the dark areas with a 180-degree phase shift, altering both the amplitude and the phase. This technique has proven especially effective for forming high-contrast images of contact holes.

The third method of improving resolution, off-axis illumination, also works well for dense areas of repetitive patterns. This approach changes the angle at which light passes through the mask by inserting special holographic optical elements into the illumination system. These elements shape the light into a particular geometric pattern--for example, an annulus (outer ring), dipole (two openings), or quadrupole (four openings) [see inset in "Sharpening Repetitive Patterns" (PDF)]--and allow light to fall on the mask only at particular angles [see main figure (PDF)].

Light that passes through the periodic lines on the circuit mask is diffracted at an angle that depends on the spacing between the lines. When the angle of illumination and the angle of diffraction are well matched, the amount of light diffracted can be enhanced and the contrast of the image improved.
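The matching condition described above comes from the grating equation: at normal incidence, a grating of pitch d diffracts order m at sin(theta) = m * wavelength / d. The 400-nm pitch below is an illustrative assumption.

```python
import math

# Grating equation at normal incidence: sin(theta_m) = m * wavelength / pitch.
# The 400-nm pitch is an illustrative assumption.

def diffraction_angle_deg(wavelength_nm: float, pitch_nm: float,
                          order: int = 1) -> float:
    s = order * wavelength_nm / pitch_nm
    if abs(s) > 1:
        raise ValueError("this order does not propagate for this pitch")
    return math.degrees(math.asin(s))

theta_first = diffraction_angle_deg(193, 400)  # roughly 29 degrees
print(f"first-order diffraction angle: {theta_first:.1f} deg")
```

An off-axis illuminator is designed so that its illumination angles and these pitch-dependent diffraction angles line up, steering more of the diffracted orders back through the lens. That same dependence on pitch is why features at other spacings benefit less, as the article notes next.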


For masks with large repetitive patterns, like those for dynamic RAM, off-axis illumination can be a very effective technique for enhancing contrast in the photoresist. As always, there's a tradeoff. Features with spacings that don't match the diffraction angle of the light can be reproduced less effectively and may have diminished contrast in the image.

Although resolution enhancement techniques were developed individually, combinations are, of course, possible and are used widely, depending on the details of the layout pattern. Optical proximity correction (OPC) with assist features and off-axis illumination has proven to be a particularly powerful combination, while phase-shift masks generally also require some form of OPC to produce the best process window.

Coming into its own

The concepts used in resolution enhancement aren't new. Altering the features on a mask to compensate for process effects has been employed since projection lithography was first introduced, and papers from Bahaa Saleh's group at the University of Wisconsin, Madison, in the early 1980s outlined almost all of the fundamental theory of what we now call OPC.

Likewise, phase-shift masks have been used for holography since the 1960s but were first demonstrated in a lithography system by Marc Levenson at IBM Corp. in the early 1980s. Similarly, off-axis imaging has been common in microscopy since Abbe's time and was first proposed for lithographic enhancement by several research groups in the late 1980s.

But an explosion of research into these topics occurred around the world in the early 1990s, sparked by the fear that the transition from lithography equipment that used 365-nm light to that using 248 nm would be delayed due to the absence of a robust resist process for the shorter wavelength. All the major enhancement techniques were explored as possible solutions to maintain the rate of feature size reduction predicted by Moore's Law.

But then the resist manufacturers solved their problems and the risk of adopting 248-nm technology diminished. Resolution enhancement entered a phase of hibernation.

In 1998, a transition in lithography occurred. Despite a potential wavelength reduction from 248 to 193 nm, feature sizes for the first time became smaller than the wavelength of the exposing light. The transition to subwavelength lithography meant that some form of RET would be needed for all future generations of ICs manufactured with optical lithography.

Resolution enhancement entered mainstream production for one to three mask levels with the 180-nm generation, and increasingly complex combinations of enhancement techniques have been used for each subsequent generation. Some 12 of the roughly 30 masks in a typical 90-nm IC now require some form of resolution enhancement, and the proportion is only expected to grow with each new generation.

These tricks of the trade have found a comfortable home in the software used for the final verification step of the IC design process. Here, the design data is checked for reliability and prepared for mask fabrication. The silicon process simulators that run inside today's verification tools have the power to make the billions of calculations that determine precisely how to modify the design data to achieve the best resolution, making it possible for enhancement techniques to be an integral part of IC design.

The next wave of development may not so much expand on the original three handles of the lithography process--amplitude, phase, and direction--but capitalize on the ever-increasing computing power becoming available for simulation. It would include yield prediction and optimization for multiple variables, even electrical performance, not simply optical imaging fidelity.
