
Chip Hall of Fame: Intel 8088 Microprocessor

The “castrated” processor that birthed the IBM PC

2 min read
8088 Microprocessor chip
Image: Intel

8088 Microprocessor

Manufacturer: Intel

Category: Processors

Year: 1979

Was there any one chip that propelled Intel into the Fortune 500? Intel says there was: the 8088. This was the 16-bit CPU that IBM chose for its original line of PCs, which went on to dominate the desktop computer market.

In an odd twist of fate, the chip that established what would become known as the x86 architecture didn’t have a name appended with an “86.” The 8088 was basically a slightly modified 8086, Intel’s first 16-bit CPU. Or as Intel engineer and 8086 designer Stephen Morse once put it, the 8088 was “a castrated version of the 8086.” That’s because the new chip’s main innovation wasn’t exactly a step forward in technical terms: The 8088 processed data internally in 16-bit chunks, but it used an 8-bit external data bus.
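That tradeoff is easy to sketch: halving the bus width means every 16-bit word must cross the bus in two cycles instead of one. The snippet below is an illustration of that general idea, not a model of Intel's actual bus-interface logic:

```python
def write_word_8bit_bus(word):
    """Move a 16-bit word over an 8-bit bus: two transfers, low byte first.
    (Illustrative only; the real 8088 bus-interface unit is more involved.)"""
    low = word & 0xFF
    high = (word >> 8) & 0xFF
    return [low, high]  # two bus cycles, where a 16-bit bus needs one

def read_word_8bit_bus(cycles):
    """Reassemble the word on the far side of the bus."""
    low, high = cycles
    return low | (high << 8)

cycles = write_word_8bit_bus(0xBEEF)
assert len(cycles) == 2                     # twice the cycles of the 8086's bus
assert read_word_8bit_bus(cycles) == 0xBEEF  # but the same 16-bit result
```

The payoff for that extra cycle was economic: an 8-bit bus let the 8088 plug into the cheap, plentiful 8-bit support chips of the day.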

Intel managers kept the 8088 project under wraps until the 8086 design was mostly complete. “Management didn’t want to delay the 8086 by even a day by even telling us they had the 8088 variant in mind,” says Peter Stoll, a lead engineer for the 8086 project who did some work on the 8088.


Photo: Intel
With some favorable early press, Intel’s PR department was sure it had a winner.

It was only after the first functional 8086 came out that Intel shipped the 8086 artwork and documentation to a design unit in Haifa, Israel, where two engineers, Rafi Retter and Dany Star, altered the chip to an 8-bit bus.

The modification proved to be one of Intel’s best decisions. The 29,000-transistor 8088 CPU required fewer, less expensive support chips than the 8086 and had “full compatibility with 8-bit hardware, while also providing faster processing and a smooth transition to 16-bit processors,” as Intel’s Robert Noyce and Ted Hoff wrote in a 1981 article for IEEE Micro magazine.

The first PC to use the 8088 was IBM’s Model 5150, a monochrome machine that cost US $3,000. Now almost all the world’s PCs are built around CPUs that can claim the 8088 as an ancestor. Not bad for a castrated chip.

Photo: Konstantin Lanzet/Wikipedia

Only 8 of the 8088’s pins were used for sending data back and forth with other chips, even though internally the processor could handle data that was 16 bits wide.

The Godfather of South Korea’s Chip Industry

How Kim Choong-Ki helped the nation become a semiconductor superpower

15 min read
A man in a dark suit, bald with some grey hair, leans against a shiny blue wall, in which he is reflected.

Kim Choong-Ki, now an emeritus professor at Korea Advanced Institute of Science and Technology, was the first professor in South Korea to systematically teach semiconductor engineering.

Korea Academy of Science and Technology

They were called “Kim’s Mafia.” Kim Choong-Ki himself wouldn’t have put it that way. But it was true what semiconductor engineers in South Korea whispered about his former students: They were everywhere.

Starting in the mid-1980s, as chip manufacturing in the country accelerated, engineers who had studied under Kim at Korea Advanced Institute of Science and Technology (KAIST) assumed top posts in the industry as well as coveted positions teaching or researching semiconductors at universities and government institutes. By the beginning of the 21st century, South Korea had become a dominant power in the global semiconductor market, meeting more than 60 percent of international demand for memory chips alone. Around the world, many of Kim’s protégés were lauded for their brilliant success in transforming the economy of a nation that had just started assembling radio sets in 1959 and was fabricating outdated memory chips in the early ’80s.
