The Runners-up: More Earthshaking Chips

These 13 great little chips didn't make our list, mainly because we ran out of space in print. And, well, one isn't even a chip.


This is part of IEEE Spectrum's Special Report: 25 Microchips That Shook the World.


Cray Research Cray-1 CPU (1976)

Seymour Cray liked to build powerful computers. He also liked to build beautiful computers. The Cray-1, with its stylized chassis arranged in the shape of a C (as in Cray—get it?), came out in 1976. It might as well have come from the future: wired by hand, cooled with Freon, and ringed by a padded seat circling the chassis. In its guts resided tight piles of circuit boards, each crammed with up to 144 chips—memory chips and high-speed logic chips. The Cray-1 had no microprocessors; instead, the whole machine acted as one 5-metric-ton processor. With 64-bit registers, it ran at 80 megahertz and crunched numbers using a technique called vector processing: the logic chips fetched long strings of numbers (vectors) from memory and operated on them all at once. The C shape helped keep wires short and was thus a critical aspect of the high-speed design. The first Cray-1—80 megaflops, 8 megabytes of main memory—went to Los Alamos National Laboratory. It cost US $8.8 million, at a time when a million bucks was real money. Cray pioneered the field of supercomputing, but as a business the company enjoyed mixed results. By the 1990s, the high-performance computing market was dominated by massively parallel machines, which were less expensive and more flexible than Cray's. They just didn't look as good as a Cray.
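The idea behind vector processing can be sketched in a few lines. This is only an illustrative software model (the function names are invented for this example): where a scalar machine issues one instruction per element, a vector machine like the Cray-1 loaded a whole vector into registers and streamed it through a pipelined functional unit as a single operation.

```python
def scalar_saxpy(a, x, y):
    # Scalar style: one element at a time -- fetch, compute, store, repeat.
    result = []
    for i in range(len(x)):
        result.append(a * x[i] + y[i])
    return result

def vector_saxpy(a, x, y):
    # Vector style: conceptually ONE operation applied to every element
    # of the vector registers at once (modeled here as a comprehension).
    return [a * xi + yi for xi, yi in zip(x, y)]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
print(scalar_saxpy(2.0, x, y))  # [12.0, 24.0, 36.0, 48.0]
print(vector_saxpy(2.0, x, y))  # [12.0, 24.0, 36.0, 48.0]
```

Both produce the same answer; the hardware win comes from eliminating the per-element instruction overhead of the scalar loop.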


Intel 1702 2048-bit EPROM Chip (1971)

Ultraviolet radiation is an effective killer of bacteria. But for a time in the 1970s and 1980s it was also used to kill data. Erasable programmable read-only memory, or EPROM, chips had little quartz windows through which you’d shine a UV beam to wipe out the data and thus be able to reprogram the device. The idea of using ultraviolet to erase a memory chip sounds preposterous today. But at the 1971 IEEE International Solid-State Circuits Conference (ISSCC), when an Intel engineer named Dov Frohman-Bentchkowsky introduced the concept, it became all the rage. Back then, memory chips were either reprogrammable but volatile (losing their data when power was removed) or simply not reprogrammable at all (ROM chips received data at the time of fabrication, and that was that). In his ISSCC paper, “A Fully Decoded 2048-Bit Electrically Programmable MOS ROM,” Frohman-Bentchkowsky showed how to construct a special transistor that worked as a nonvolatile memory element. The trick was to trap electrons in the transistor’s floating gate by applying a voltage. The electrons would remain trapped even when the voltage was removed, and the only way to free them was by shining UV light (or X-rays) onto the chip—hence the quartz window. Intel’s first EPROM chip was the 1702, and it could store 2 kilobits. Others soon followed. Engineers, tantalized by the possibility of reprogramming a memory, rushed to get hold of EPROM chips—and some UV lamps as well.


IBM/Sony/Toshiba Cell Processor (2004)

Take a flake of silicon. Etch in a PowerPC processor. Add eight very fast number-crunching coprocessors. Connect everything through a 300-gigabit-per-second data bus. The result is one of the most radical microprocessors in semiconductor history—the Cell, developed by IBM, Sony, and Toshiba. It took them US $400 million, 400 people, and four years to get the chip out the door. Designed to feed on a high-flop diet of MPEG streams, three-dimensional game polygons, and Fourier transforms, the Cell is used in the Sony PlayStation 3 and high-powered servers and supercomputers. With an asymmetric architecture—two different kinds of processors—the Cell served notice that programming would never be the same. Computer scientists and programmers scrambled to find ways of exploiting its full potential. Five years later, they’re still at it.


Philips Semiconductors TDA93xx TV Processor (1999)

The Philips marketing people called it the “ultimate one-chip TV processor.” The Philips engineers called it the TDA93xx. It was a single chip integrating a TV signal processor, a closed-caption decoder, and a microcontroller core. The company pulled it off by combining conventional complementary metal-oxide-semiconductor (CMOS) circuitry with analog BiMOS, a combination of bipolar and CMOS technology. The chip supported all three major international TV standards (PAL, NTSC, and SECAM), and by adding a microcontroller, Philips eliminated the need for several other components. TV manufacturers loved the chip, which worked well even in regions like Africa and China, whose broadcast environments are known for their quirks. Philips’s semiconductor division—now known as NXP—estimates that it has shipped over 850 million of the chips. It plans to ship millions more until analog TV is finally retired worldwide.

Lucent WaveLAN IEEE Wi-Fi Chip Set (1998)

To surf the Web wirelessly while sipping a latte is a great thing—few would dispute that. But it turns out that creating the wireless technology that made this staple of modernity possible involved heated debates among the world’s most powerful tech companies. Before Wi-Fi came along, a zoo of wireless standards roamed the spectrum; each company had its own technology. It took a Dutch engineer named Vic Hayes—equipped with UN-level diplomatic skills—to get all the companies on the same page (and even in the same room). As the chair of the IEEE 802.11 Working Group, Hayes shepherded the wireless network standard that would become pervasive in homes, offices, schools, airports, and, of course, coffee shops. In 1997, the group released the first version of the 802.11 standard. Hayes then took it to the engineering department of his employer, Lucent Technologies, and his counterparts in the working group did the same at their own companies. Of course, drafting a standard was one thing; implementing it in silicon in a profitable product was another. Lucent was among the first to create an 802.11 chip set, which it introduced in 1998 as WaveLAN IEEE, sold as part of wireless routers and modems. The chip set was soon updated to support the 802.11b revision, which could transmit data at an unprecedented 11 megabits per second and would popularize Wi-Fi technology. Lucent wouldn’t profit much from the WaveLAN IEEE line; it spun off its semiconductor division as Agere Systems in 2000 and rebranded the chip set as Orinoco, a product line that was later acquired by Proxim. By then, Wi-Fi had gained a life of its own, and now Wi-Fi chips by Atheros, Broadcom, Cisco, Intel, Marvell, and others are part of every computer sold today. Another latte?


DEC Alpha 21064 Microprocessor (1992)

After the success of its VAX computers and MIPS-based workstations, Digital Equipment Corp. entered the 1990s searching for its next big thing. The thing, DEC managers decided, would be a custom-designed CMOS 64-bit processor that they called the Alpha. The chip, based on a reduced-instruction-set computing (RISC) architecture, was touted as a piece of technical genius, with a slim instruction set and superfast operation. It quickly gained the admiration of the chip community. But in the end, it may have killed DEC. Kenneth H. Olsen, Digital’s cofounder and president, faced the quintessential innovator’s conundrum: Should the company promote the new chip at the expense of the VAX, which had made DEC’s fortune in the 1980s? Olsen clung to the VAX, and the Alpha never really amounted to much commercially. Enticed by the Alpha chip, among other things, Compaq bought DEC in 1998. But four years later, before the company could do anything with the chip, Compaq was swallowed by HP, which never did anything with it either. Still, the Alpha family of processors is remembered fondly by a great many chip cognoscenti.


STMicroelectronics STA2056 GPS Receiver (2004)

A time-honored design stunt in the world of chipmaking is the kill-two-chips-with-one-chip move. Several years ago, STMicroelectronics did it with GPS receivers. The European chip giant smashed together a GPS radio front end (which usually made up one chip) with a signal correlator, microprocessor, and memory (usually on the other chip). Although handheld GPS systems had already been marketed, the STA2056 set a new standard for size and power consumption. And at US $8, the chip was cheap, driving the cost of GPS devices down and helping open up a mass market for them. Fiat has used the chip in several Alfa Romeo models, and GPS vendor Becker put it in its handsets. And of course the two-chips-into-one trick remains a favorite of chipmakers everywhere.


Advanced Micro Devices Opteron Processor (2003)

Since its founding in 1969, Advanced Micro Devices has delivered its share of innovative integrated circuits. There were the logic chips that got the company started, then the AMD 9080 microprocessor (a clone of Intel’s 8-bit 8080), and let’s not forget AMD’s Am2900 family—4-bit chips that could be grouped to create 8- or 16-bit controllers, a method known as bit slicing. But most recently, the AMD chip that really stands out—one of many painful pebbles AMD has planted in Intel’s shoes—is the 64-bit Opteron processor. This CPU extended the 32-bit x86 instruction set to a 64-bit architecture. What’s more, it incorporated an embedded memory controller and a high-speed chip-to-chip interconnect. The Opteron was aimed at servers, but AMD soon transplanted its innovations to the Athlon 64, targeted at consumer PCs. The two powerful AMD processors allowed users to take on computing tasks previously reserved for expensive RISC systems. The chips also forced Intel to add similar capabilities to its own x86 products.
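The bit-slicing trick behind the Am2900 family can be sketched in software. In this simplified model (the function names are invented for illustration, and real slices also handled logic operations, shifts, and microcode control), each 4-bit slice computes a partial sum and hands a carry to the next slice, so four slices cascade into a 16-bit adder:

```python
def slice_add(a4, b4, carry_in):
    """One hypothetical 4-bit ALU slice: adds two 4-bit values plus an
    incoming carry, returning (4-bit sum, carry out)."""
    total = a4 + b4 + carry_in
    return total & 0xF, total >> 4

def sliced_add16(a, b):
    """Cascade four 4-bit slices into a 16-bit adder, the way bit-slice
    parts were chained to build wider datapaths."""
    result, carry = 0, 0
    for shift in range(0, 16, 4):      # low slice first, carry ripples up
        a4 = (a >> shift) & 0xF
        b4 = (b >> shift) & 0xF
        s, carry = slice_add(a4, b4, carry)
        result |= s << shift
    return result & 0xFFFF

print(hex(sliced_add16(0x1234, 0x0FFF)))  # 0x2233
```

The appeal in the 1970s was economic as much as architectural: a designer could buy identical 4-bit parts and wire up whatever word width the product needed.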


Philips TDA7000 FM Receiver (1983)

Engineers rarely forget their first encounter with a superheterodyne receiver. Ah, the satisfaction of building a neat FM radio from scratch! When it works, that is. No surprise, then, that hobbyists enthusiastically received the TDA7000, a remarkable-for-its-day complete FM radio on a chip, crammed with an RF input stage, a mixer, a local oscillator, an intermediate-frequency amplifier, and a demodulator. It let do-it-yourselfers build decent FM receivers without having to deal with hard-to-adjust components like intermediate-frequency transformers. Designed as a compact, cheap chip for mono portable radios—which themselves needed to be compact and cheap—the TDA7000 offered incredible performance. Surprisingly, though, it was never used in a major commercial product. Sold by Tandy, it remained a huge hit with kit builders and hobbyists for two decades, until Philips’s semiconductors division, now NXP, stopped making the part in December 2003. But variants of the chip are still around.

Texas Instruments SN7400 Logic Chips (1966)

In the prehistoric days of digital computers—they were called minis and mainframes back then—engineers sought to replace discrete transistors with integrated circuits that could perform basic logic operations. In these premicroprocessor days, engineers were working with simpler fare—chips like the Texas Instruments SN7400, an IC with four two-input NAND logic gates. This chip was just one in a very big family—known as the 7400 series—that did very well for TI. Unlike earlier logic chips, which used diodes and transistors (called diode-transistor logic, or DTL), the 7400 chips were based on all-transistor logic circuits (transistor-transistor logic, or TTL). Texas Instruments’ TTL Data Book for Design Engineers became, as “The Silicon Engine” exhibit at the Computer History Museum puts it, “the indispensable data manual of the late 1970s.” Today the 7400 family is still in production, sold by TI and many others. Among the hundreds of 7400 siblings are simple chips like the SN7404, which contains six inverters (hence “hex inverter”), and more complex ones like the SN74ALVC162334, a 16-bit universal bus driver with three-state outputs. Surely more than a few engineers still keep the TTL data book within arm’s reach.
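Part of what made a chip of four NAND gates so useful is that the two-input NAND is logically universal: every other basic gate can be built from it. A sketch of that idea (in Python rather than 5-volt hardware):

```python
def nand(a, b):
    # SN7400-style two-input NAND: output is 0 only when both inputs are 1.
    return 0 if (a and b) else 1

def not_(a):
    # An inverter is a NAND with its inputs tied together.
    return nand(a, a)

def and_(a, b):
    # AND is a NAND followed by an inverter.
    return not_(nand(a, b))

def or_(a, b):
    # OR via De Morgan's law: a OR b = NAND(NOT a, NOT b).
    return nand(not_(a), not_(b))

# Verify all three derived gates over the full truth table.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
    assert not_(a) == (1 - a)
print("all gates check out")  # prints "all gates check out"
```

This universality is exactly why a board stuffed with 7400s could, gate by gate, implement any digital logic a designer needed.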

IBM POWER Processor (1990)

You’ve heard of the IBM PC. But have you heard of the IBM RT? It was a Unix workstation that never made much money for IBM but was nevertheless a major milestone for Big Blue. That’s because the RT used a pioneering processor based on the concept of reduced-instruction-set computing, or RISC (RT stands for RISC Technology). From there, IBM went on to develop two other seminal RISC chips: the POWER and the PowerPC. (IBM really loves acronyms: POWER stands for Performance Optimized With Enhanced RISC.) These two families of 32-bit CPUs are at the heart of many of IBM’s flagship servers, workstations, and supercomputers. The first server and workstation product line to use the POWER processor was the RISC System/6000, or just RS/6000, introduced in 1990. Four years later, the RS/6000 received an upgrade with the newly released PowerPC, which IBM developed jointly with Apple and Motorola. A host of processors followed, including radiation-hardened space CPUs, versions for game systems like the Nintendo Wii and the Microsoft Xbox 360, and the multicore Cell processor.

Silicon Laboratories Si4905 GSM Cellphone Chip (2005)

Announced by Silicon Laboratories in 2005, the Si4905 put an entire GSM cellphone—including the digital baseband, RF circuitry, power management, and battery interface—into about one square centimeter of silicon. Normally, in those days, such a phone required over 200 electronic components. The tricky part was integrating the RF and digital subsystems, which are normally separated and shielded. But after exhaustive tests and redesigns, the company pulled it off. Despite offering manufacturers an easy way to produce entry-level phones at a much lower cost, Silicon Labs was simply too small to compete against behemoths like Texas Instruments and Infineon Technologies. It never got a customer. In 2007, Silicon Labs sold the GSM chip line to NXP—which now supplies the chip to Samsung, the world’s No. 2 handset maker.

Communications Services Corp. RFID (1973)

Mario Cardullo invented the first radio-frequency identification (RFID) device in 1969. It wasn’t a chip.* It was a circuit the size of a cigarette pack, with a receiver, a transmitter, and 16 bits of nonvolatile memory, which he built using ferrite cores. And it wasn’t called an RFID. His patent—filed in 1970 and issued in 1973—called the contraption a “transponder apparatus and system.” Cardullo founded Communications Services Corp. to market the system. He thought it could be used for electronic toll collection, among other things. In 1971, he gave a demonstration to the Port Authority of New York & New Jersey, which snubbed him. “They said people would never carry these things in their cars,” says Cardullo, now a technology consultant in Alexandria, Va. He eventually moved on to other things, and Communications Services never sold the system. In 1990 the patent expired. But Cardullo wasn’t the only person to think of identification devices that could be read at a distance. Engineers working on that very idea—at Los Alamos National Laboratory, Raytheon, and other places—took RFID technology to the market. Today RFID is used in smart cards, entry cards, merchandise tags, and passports. The technology is used to track pets, cattle, inventory, and, well, people—there are RFID chips encased in tiny glass capsules that can be implanted in the human body. And of course, RFID is used for electronic toll collection in New York and elsewhere. “The other day my E-ZPass wasn’t working,” Cardullo says of his toll pass, “and, sure enough, they sent me a ticket.”

*If it’s not a chip, why did we include it here? Er…you got us. We just thought we needed RFID on the list—it’s a multibillion-dollar industry, and these chips are everywhere these days. Also, Mario is a cool guy.

For more articles, go to Special Report: 25 Microchips That Shook the World.
