This is part of IEEE Spectrum’s special report: Top 11 Technologies of the Decade
Back in the 20th century, just about the only LED you normally saw was the one that lit up when your stereo was on. By the noughties, tiny light-emitting diodes were also illuminating the display and keypads of your mobile phone. Now they are backlighting your netbook screen, and soon they’ll replace the incandescent and compact fluorescent lightbulbs in your home.
This revolution in lighting comes from the ever-greater bang the LED delivers per buck. In every decade since 1970, when red LEDs hit their stride, they have gotten 20 times as bright and 90 percent cheaper per lumen; the relation is known as Haitz’s Law, and it also applies to yellow and blue LEDs, which were commercialized much later.
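To see what that compounding implies, here is a minimal sketch of the Haitz's Law scaling described above. The 1970 baseline figures are illustrative assumptions, not measured data from the article; only the per-decade factors (20x output, 90 percent cheaper per lumen) come from the text.

```python
def haitz_projection(year, base_year=1970, base_lumens=0.001,
                     base_cost_per_lumen=100.0):
    """Project LED output and cost under Haitz's Law.

    Baseline values are hypothetical placeholders for illustration.
    Per decade: light output x20, cost per lumen x0.1 (90% cheaper).
    """
    decades = (year - base_year) / 10
    lumens = base_lumens * 20 ** decades
    cost_per_lumen = base_cost_per_lumen * 0.1 ** decades
    return lumens, cost_per_lumen

for y in (1970, 1980, 1990, 2000, 2010):
    lm, cost = haitz_projection(y)
    print(f"{y}: ~{lm:.3g} lm per package, ~${cost:.3g} per lumen")
```

Four decades of that exponential is a factor of 160,000 in output and 10,000 in cost per lumen, which is why a curiosity from 1970 can now light a street.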
The forerunners of the white LEDs that are now going into lightbulbs were the chips that backlit handsets starting about a decade ago. Back then, they used tens of milliamps and consumed a watt for every 10 lumens of light they produced. They were also tiny—just 300 micrometers on a side. Since then, the chips have more than tripled in size, to a millimeter square or more, current has shot up to an ampere or so, and efficiency has rocketed to around 100 lm/W. They now have everything they need to dominate lighting, except for a low enough price. But that, too, will soon come.
Even now, white LEDs are competitive wherever replacing a burned-out lamp is inconvenient, such as in the high ceilings and twisty staircases of Buckingham Palace, because LEDs last 25 times as long as Edison’s bulbs. They also last two and a half times as long as compact fluorescent lamps, and unlike CFLs, LEDs contain no toxic mercury. That means it isn’t a pain to dispose of them, and you don’t have to worry that your house has become a hazard zone if one breaks.
Making these white-emitting chips bigger and driving them harder has been quite easy; it was increasing the efficiency that required a radical redesign of the device’s architecture. To produce the first generation of white LEDs, engineers would deposit a stack of carefully chosen gallium nitride and indium gallium nitride layers on a semitransparent substrate to yield blue-emitting devices; then they’d add a yellow-emitting phosphor on top to turn the output white. However, this design traps a lot of light within the chip and sends another fraction in the wrong direction, through the substrate.
To address both weaknesses, engineers coated the nitride film—a combination of GaN and InGaN layers—with a metal that acts as a mirror, then flipped the assembly over, removed the substrate, and roughened the underlying surface. In the resulting chip, because most of the rays impinge on the textured top surface at a shallow enough angle to avoid reflecting back, nearly all the light can get through to the world outside.
Europe’s leading LED manufacturer, Osram Opto Semiconductors, in Germany, and the two U.S. LED giants Cree and Philips Lumileds are all using variations of this approach. Japan’s Nichia, the world’s biggest LED manufacturer, has a different way of doing things. Its engineers also roughen the top surface, but they do this by etching a hexagonal pattern into the substrate, which they do not subsequently remove from the gallium nitride film.
These second-generation white LEDs hit the market three or four years ago. Since then interest has rocketed: “If you go to any [lighting] show now, they might as well be called the LED show,” says Rick Hamburger, director of segment lighting at Philips Lumileds.
Commercial success followed. White LEDs now illuminate parking lots, streets, and civic buildings. Exactly when they make it into most homes will depend on the price. I just bought a really high-quality, warm-white LED bulb in the United Kingdom from Philips for about US $55; my lamp consumes just 7 W while emitting as much light as a 40-W incandescent. (Products that give off a harsher, blue-tinged light go for as little as $10.) I calculate that if I use it for 4 or 5 hours a day, it should pay for itself in about five years.
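That five-year figure is easy to check. Here is the arithmetic behind it, using the numbers from the text; the electricity price is an assumption (roughly $0.20 per kWh, in the ballpark of UK rates), not a figure from the article.

```python
# Payback estimate for the bulb described above.
bulb_price = 55.0          # USD, warm-white LED bulb
led_watts = 7.0
incandescent_watts = 40.0
hours_per_day = 4.5        # midpoint of "4 or 5 hours a day"
price_per_kwh = 0.20       # USD, assumed electricity price

saved_kwh_per_year = ((incandescent_watts - led_watts) / 1000
                      * hours_per_day * 365)
saved_dollars_per_year = saved_kwh_per_year * price_per_kwh
payback_years = bulb_price / saved_dollars_per_year
print(f"~{saved_kwh_per_year:.0f} kWh saved per year, "
      f"payback in ~{payback_years:.1f} years")
```

About 54 kWh saved a year, or roughly $11, puts the break-even point right around five years, and that ignores the incandescent replacements the LED outlives along the way.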
The manufacturing cost should fall as production yields rise and substrates grow. “At the moment, one-third of the LEDs in the world are made on 2-inch wafers,” says Mark McClear, who directs new business development at Cree. Toolmakers are now offering equipment for 3-inch, 4-inch, and even 6-inch substrates, and Cree plans to start using the largest of these platforms in the next 18 months.
Another way to drive down the cost is to increase the light output at a particular current. That’s one of the goals of the U.S. Department of Energy’s 2010 solid-state lighting road map, which calls for more than doubling the lumens per watt in commercial LED products by 2015. Or engineers could try to build better packages for handling additional heat so they could crank the current up higher and get more light out of each LED.
Of course, LEDs give us much more than just a more efficient, longer-lasting bulb. They’re small, cool in operation, and easy to place in walls, automobiles, appliances, even the heels of children’s shoes. When designers fully exploit their potential, LEDs will light up places we’d never thought to use them, thus changing the look of our world.
With price the only remaining hurdle, and falling all the time, it’s clear that this technology will be a winner in the long run. The one potential casualty that no one is talking about: jokes about changing the lightbulb, which may be going the way of the dodo.
For all of IEEE Spectrum’s Top 11 Technologies of the Decade, visit the special report.
About the Author
Richard Stevenson, who holds a Ph.D. in physics and is based in Britain, has written many pieces for IEEE Spectrum on the incremental improvement of light-emitting diodes. But he says it was only this month’s story, “LED Lighting,” that allowed him to draw the big picture. He traces how the humble diode graduated from being the “on” button in your stereo to replacing the lightbulb itself through new techniques that squeeze ever more light out of chips.