After a reign of more than 130 years, the incandescent lightbulb has earned a spot in the pantheon of the most successful technologies humankind has ever produced. It casts its glow in homes, businesses, and societies that would be almost unrecognizable to those who first delighted in its rays, like tiny suns in graceful glass. Along with the landline telephone and the internal combustion engine, it underpinned and helped usher in our modern world. And now, it's going.
Brazil and Venezuela began abandoning incandescent lighting in 2005. Europe started taking traditional incandescent lightbulbs off store shelves in 2009. In the United States, California set standards that eliminated the traditional 100-watt incandescent this year, and the rest of the country will follow next year, with less-powerful incandescents falling like dominoes as stricter energy standards phase in.
We love the warm spectrum of the incandescent, but we're not so happy with its energy use. A typical 60-W bulb has an overall luminous efficacy of about 15, meaning that it radiates roughly 15 lumens per electrical watt consumed. In contrast, a comparable compact fluorescent would score about 65, and white light-emitting diodes (LEDs) approach 100. Even if you're not particularly worried about global climate change, the needless electricity expense should give you pause.
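Those efficacy figures translate directly into wattage. A quick sketch of the arithmetic, using the rough lumens-per-watt numbers above (the ~900-lumen target for a 60-W incandescent is an illustrative assumption, not a figure from the article):

```python
# How many watts does each technology need to match the light output
# of a classic 60 W incandescent? Efficacies are the rough figures
# quoted in the text; the 900-lumen target is an assumption.

TARGET_LUMENS = 900  # assumed output of a typical 60 W incandescent

efficacies_lm_per_w = {
    "incandescent": 15,
    "compact fluorescent": 65,
    "white LED": 100,
}

for tech, efficacy in efficacies_lm_per_w.items():
    watts = TARGET_LUMENS / efficacy
    print(f"{tech:>20}: {watts:5.1f} W")
```

At these numbers, the fluorescent needs roughly 14 W and the LED about 9 W to do the job of the 60-W incandescent, which is where the "needless electricity expense" comes from.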
Surprisingly, though, many folks cling to the mellow yellowish glow, refusing to go gently into an LED-lit night. In Germany, Austria, and a few other European countries, a curious phenomenon, incandescent-bulb hoarding, began in 2009. Californians, too, raced to grab the last of the available 100-W incandescents off the shelves earlier this year.
These people are among the last holdouts in a love affair with incandescent lighting that goes back a century and a half. For them, and for all of us who have spent years basking in incandescence, we offer a fond farewell to a brilliant notion.
The incandescent lamp originated in experiments in the early years of the 19th century, when researchers heated thin wires by passing electric current through them. Their goal was to produce light by making a wire white hot, but it wasn't easy.
William Grove, a professor of chemistry at the Royal Institution in London, published his experiments in Philosophical Magazine in 1844. He reported that he had heated "a coil of platinum wire as near to the point of fusion as was practicable" and that it gave enough light to read by. He used the rare and expensive metal platinum because it was the only material that could be made white hot in air without melting or bursting into flame.
Happily enough, the material eventually adopted for the filament, as we now call it, was carbon, not platinum. Put carbon in a vacuum and you can heat it to a higher temperature than any metal without burning. But creating that vacuum was problematic. Finally, around 1870, Hermann Sprengel, a German chemist working in London, developed his vacuum pump. Sprengel connected a vessel to a narrow vertical tube; drops of mercury falling down the tube sucked air out of the vessel, pushing it down to the bottom of the tube where it escaped.
With an effective vacuum pump available, several inventors soon produced working lamps. In August 1881, four of them—Thomas Edison and Hiram Maxim from the United States and Joseph Swan and St. George Lane Fox-Pitt from England—displayed their bulbs in Paris at the International Exposition of Electricity, which was attended by almost a million people. The lamps differed mainly in the choice of the material that was charred in a furnace to create the carbon filament. Edison began with bamboo fiber, Swan with cotton, Maxim with paper, and Fox-Pitt with grass. The Exposition Jury measured the efficiency of the various lamps, expressing that efficiency as candlepower—the light generated by a typical wax candle—per horsepower required to produce it. Edison's lamps ranked as the most efficient, giving 196 candlepower of light per horsepower applied to the generator. Maxim's gave 151, and the other two scored in between. The jury didn't consider how long each lamp lasted, and at that time it wasn't generally appreciated that a filament lamp's life and its efficiency trade off against each other: run the filament hotter and the lamp is more efficient, but it burns out sooner.
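The jury's units can be translated into modern ones. A rough conversion sketch, assuming one candlepower behaves like an isotropic 1-candela source (a simplification) and noting that the figure includes generator losses, so it understates the lamps' own efficacy:

```python
# Convert the 1881 jury's figures (candlepower per horsepower supplied
# to the generator) into modern lumens per watt. Treating candlepower
# as an isotropic 1-candela source is a simplifying assumption, and
# generator losses are included in the denominator.
import math

LUMENS_PER_CANDLEPOWER = 4 * math.pi  # total flux of 1 cd radiated in all directions
WATTS_PER_HORSEPOWER = 745.7

def lumens_per_watt(cp_per_hp):
    return cp_per_hp * LUMENS_PER_CANDLEPOWER / WATTS_PER_HORSEPOWER

print(f"Edison (196 cp/hp): {lumens_per_watt(196):.1f} lm/W")
print(f"Maxim  (151 cp/hp): {lumens_per_watt(151):.1f} lm/W")
```

Under those assumptions Edison's system works out to roughly 3 lumens per watt, a fifth of a modern incandescent's efficacy and a tiny fraction of an LED's.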
By this time the general public was eager for electric light in the home. People were already familiar with the brilliant light of the electric arc, which was created by passing a current between two electrodes. The arc lit up very large buildings and streets but was far too bright for home use. Filament lamps were more suitable; they were smaller and lit a room comfortably.
But all electric lights need an electricity supply, and this was not readily available at first. Cities that were planning street lighting, along with a few wealthy people planning to light their homes, installed their own generating plants. But ordinary middle-class folks had to wait for a public supply with electricity generated in a central station and distributed by wires to customers. Public power wasn't available anywhere until the early 1880s, but from then on, the demand for electric lighting in homes drove very rapid growth of the power industry in the United States, the United Kingdom, and most of Europe.
The first public supply was set up in Godalming, in southern England, in late 1881. It was built by two men named Calder and Barrett, about whom very little is known—not even their first names. A generator built by Edison began operating in London early in 1882, and another at Pearl Street, New York City, later that year. Several companies and some local governments then established generating stations on both sides of the Atlantic.
Before it could flourish, electric lighting had to defeat an entrenched competitor in many towns—gas. In the mid- and late 19th century, gaslight simply meant a bare gas flame, so the electric lamp, which produced no smoke, was a clear winner. But the gas industry didn't give up easily; it developed the gas mantle during the 1880s, then greatly improved it in the 1890s. This fine mesh, made mainly of thorium oxide, became incandescent when heated by a gas flame smaller than what had been used in previous generations of gas lamps. This technology also allowed the industry to change the composition of its gas so that the flames produced more heat and less light—and less smoke. The gas mantle turned out to be a cheaper source of light than the carbon filament lamp. Score round one to gas.
Starting around 1899, electricity answered the gas mantle with the metal filament, which could be operated at a hotter temperature, and therefore more efficiently, than a carbon filament. Developers tried several different filament materials. Osmium, tantalum, and tungsten have the highest melting points in the metals family but differ in their malleability. Initially, lamp manufacturers used osmium, also seen in the tips of fountain pens and in some heavy-duty electrical contacts, and tantalum, which was first isolated in 1902. Tungsten was attractive because it has the highest melting point of all metals—just over 3400 °C. But its brittleness stymied developers who were trying to draw it into a thin wire. Then Alexander Just and Franz Hanaman, working in Vienna and Budapest, found that they could make tungsten filaments by mixing tungsten powder with a binder and then drawing that mixture into a wire and sintering it—that is, heating it until the particles adhere but do not melt. Hugo Hirst, of the (British) General Electric Co., working with Just and Hanaman, began producing tungsten lamps in 1909, in a factory in West London. William Coolidge, of the (U.S.) General Electric Co., found that if he compressed tungsten powder and hammered it, he could draw it into a wire without using any binder, which was a simpler process. (There was no connection between the two General Electric companies.) Thus was born, in 1911, the drawn-tungsten filament incandescent lamp. It continues to be the standard in incandescent bulbs to this day, 100 years later.
In early tungsten lamps, the filaments sat in near vacuums, but it turned out that a little nitrogen or argon reduced the evaporation of the metal and prolonged the filament's life. The problem was that the gas also cooled the filament, making the lamp less efficient. Winding the filament in a coil reduced the cooling, and winding the coil itself into a coil, a technique developed in the early 1930s, worked even better. And that coiled-coil filament design has never been superseded.
In 1959, General Electric (U.S.) refined the filament lamp one more time. Its researchers sealed a tungsten filament into a compact bulb containing an inert gas and a small amount of a halogen, usually iodine or bromine. (The halogens are a group of elements that react very readily and energetically with other substances.)
In a halogen bulb, the halogen gas combines with the minute particles of tungsten that evaporate from the filament, which in ordinary incandescent lamps are deposited mostly on the inner surface of the bulb and over time gradually dim the light output. The tungsten halide that forms moves around as a gas and then, when it nears the hot filament, breaks down, redepositing the tungsten back onto the filament and releasing the halogen to repeat the process.
This halogen cycle keeps the bulb clean and the light output almost constant over the life of the bulb. The bulb temperature must be higher than in conventional incandescent lamps, too high for glass at the time, so the bulb was initially made of quartz. Because the first halogen lamps used iodine as the halogen, they were known as "quartz iodine" lamps. Later, bromine replaced iodine, higher-melting-point glass replaced the expensive quartz, and the lamps became "tungsten halogen" lamps. The bulbs soon caught on for spotlights and projectors and eventually for general lighting. Right now, because they are somewhat more efficient than the standard incandescent lamp, they are not on the chopping block in any country.
Alternatives to the ordinary tungsten incandescent bulb have long been available but found few takers for residential use until recently. People like the warm and brilliant glow of a tungsten incandescent, which in any case seems inexpensive compared with the other options. Of course, the familiar bulb is cheap only initially; in the long run, its inefficiency means a much higher operating cost and also more harm to the environment than the alternatives.
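The "cheap only initially" point is easy to make concrete with a back-of-the-envelope total-cost comparison. All of the prices, wattages, and lifetimes below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope total cost of keeping one light socket lit for
# 10,000 hours. Bulb prices, wattages, lifetimes, and the electricity
# price are all illustrative assumptions.

def total_cost(bulb_price, watts, lifetime_hours,
               hours_needed=10_000, price_per_kwh=0.12):
    """Purchase cost of all bulbs needed plus electricity cost, in dollars."""
    bulbs_needed = -(-hours_needed // lifetime_hours)  # ceiling division
    energy_cost = watts * hours_needed / 1000 * price_per_kwh
    return bulbs_needed * bulb_price + energy_cost

print(f"incandescent: ${total_cost(0.50, 60, 1_000):.2f}")   # cheap bulb, costly to run
print(f"LED:          ${total_cost(5.00, 9, 25_000):.2f}")   # dear bulb, cheap to run
```

Even with these generously round numbers, the electricity dwarfs the purchase price: the familiar bulb's ten replacements and 600 kilowatt-hours cost several times what the single LED and its 90 kilowatt-hours do.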
Commercial and industrial buildings have relied on fluorescent lamps since the 1940s. Discharging electricity in a long glass tube filled with a mixture of argon and mercury vapor produces ultraviolet light; a fluorescent coating on the inside of the tube turns the UV rays into visible light. Decades later, research into fluorescent materials and developments in control circuitry led to the compact fluorescent lamp. Launched by Philips simultaneously in Europe and the United States in 1980, these lamps fit into standard incandescent fixtures and used only a quarter of the electricity for a given amount of light.
Nevertheless, the heir apparent to the incandescent lightbulb isn't the compact fluorescent but rather the LED, which seems poised to dominate home lighting in the next decade. This semiconductor device came into commercial use in the 1960s, but the dim red or yellow LEDs available in those days weren't good for much besides indicating whether an electronic gizmo was on or off. Today, however, advanced and high-power LEDs throw off a brighter white light more efficiently than any other source. But their initial cost is higher, and the colors available don't exactly match the familiar incandescent glow. Other alternatives, all experimental at this stage, are also in the offing. Ironically, one of them is an attempt to adapt for home lighting the technology that is now largely obsolete in television sets: the cathode ray tube—an electron beam hitting a phosphor.
Technology marches on, as it is wont to do. Soon, it will leave behind one of its most storied and successful creations. We'll miss the incandescent lamp, while wishing its successors an equally brilliant tenure.
This article originally appeared in print as "Lights Out".
About the Author
Brian Bowers is an electrical engineer who worked as an examiner in the British Patent Office, then became a curator in London's Science Museum. There he focused on the electrical engineering and lighting collections. For Bowers, lighting is something everyone can relate to at some level of complexity, from simple oil lamps and candles to the abstruse semiconductor physics of the LED.