In the spring of 1948, AT&T Bell Labs research director Mervin Kelly called Jack Morton, a rising young engineer, into his office at the firm's Murray Hill, N.J., headquarters and offered him a plum assignment. "Morton, I'm going to be away for the next four weeks," Kelly announced. "When I get back, I would like to see a report from you on the transistor. I want you to tell me how to develop it commercially." Thus began one of the oddest chapters in modern electronics. The story of how AT&T pioneered, and then squandered, its technological lead in perfecting and capitalizing on the transistor is the dramatic thread of this month's feature "How Bell Labs Missed the Microchip," by contributor Michael Riordan.

The coauthor of Crystal Fire: The Birth of the Information Age and a lecturer at Stanford University and the University of California, Santa Cruz, Riordan takes us back to the birth of the transistor and the people who foresaw its promise and pitfalls in a new electronic age. The story encompasses a stellar cast of players, including John Bardeen, Walter Brattain, and William Shockley at Bell Labs and Jack Kilby at Texas Instruments, all future Nobel laureates. But Riordan focuses on Morton, who in the fall of 1948 was put in charge of the original transistor development team.

Morton was known inside laboratory circles as a can-do leader with strong opinions, according to Riordan. His group quickly worked out initial problems with Bardeen and Brattain's breakthrough point-contact transistor, and within a year it had two improved versions ready for production, one to amplify signals and the other for switching applications. Morton next championed a process for growing purified semiconductor crystals and worked to integrate the researchers' efforts with those of the manufacturing teams at AT&T's subsidiary Western Electric.

By 1951, Bell Labs began fabricating the junction transistor, a more rugged and practical design than the delicate point-contact device. Conceived by Shockley and fashioned by chemist Morgan Sparks, this three-layer germanium sandwich had a much simpler structure than its forebear and far outperformed it. Morton's group led the way in getting the device into production within a year. After delaying a switchover to silicon as the material of choice for developing transistor technology, Morton finally urged his staff to make the change in 1955, and they soon produced the diffused-base transistor, which could amplify and switch signals above 100 megahertz.

However, rivals at Fairchild Semiconductor Corp., in Mountain View, Calif., led by Robert Noyce, surged ahead to adapt these silicon technologies to produce the first commercial microchips in 1961. Soon, the new trend was to build integrated circuits and to make them complex: large-scale integration (LSI), which yielded single silicon chips containing more than 1000 components, was born. Not for Morton, however. He derided LSI adherents as "large-scale idiots," pointing to the likelihood of failure in such devices. Instead, he promoted the idea of hybrid technology incorporating smaller-scale microchips, which could be manufactured with higher yields, into "thin-film" circuits based on metals such as tantalum, in which resistors and capacitors could be etched more precisely than was possible in silicon. Morton referred to this approach as "right scale of integration."

The decision cost AT&T the high ground in the explosive development of complex microchips in the 1960s—especially in a technology it had invented earlier, the metal-oxide semiconductor field-effect transistor, or MOSFET. As the electronics industry went one way, into ever more sophisticated processors, Morton stubbornly insisted that the company stick with simple, discrete designs for the signal-processing operations that were its bread and butter. In the end, AT&T could only watch as rivals such as Fairchild and Texas Instruments far outstripped it in commercializing the transistor.

According to Riordan, the failure to grasp the broader implications of LSI microchip design may have weighed on the self-certain Morton, who by the late '60s was hailed by the industry as a pioneer in his own right. He apparently descended into heavy drinking. In 1971, after being refused a late-night drink at a local tavern, he was attacked on his way home by two bar patrons, who killed him and set his body on fire to cover up the crime.

It was a sad end for a man who had been honored only six years before with the IEEE's prestigious David Sarnoff Medal, cited for "outstanding leadership and contributions to the development and understanding of solid-state electron devices." Had he lived, he would be amazed at how the field he helped plow has prospered.

