More Treasured Texts

From the 1940s to the 1980s, these books led engineers all the way to semiconductors and the Information Age

The HP 9100A computing calculator allowed users to enter programs with a magnetic program card. From Computer Structures: Readings and Examples by C. Gordon Bell and Allen Newell.

It’s a cloudy night on the Isle of Wight. The year is 1940, and a soldier sitting in an underground command center is peering intently at the glow of a cathode-ray tube. Suddenly, echo pulses appear on the screen, as they have almost every night for the past two months. Leaping for the radiophone, the soldier alerts his commanding officer. In minutes, Spitfire and Hurricane fighters are dispatched to intercept the incoming bombers over the English Channel.

That drama played itself out countless times during World War II. But regardless of who was attacking whom, the star was always radar. And radar, along with the proximity fuze and digital electronic computing, was part of a series of important inventions that transformed electrical engineering, EE education, and particularly EE textbooks in the middle of the 20th century. The transformation began before the war had even ended, when EE educators, particularly in the United States, began confronting the fact that it had been physicists, more than engineers, who had achieved many of the key breakthroughs in electronics.

The academics concluded that the EE curriculum had become too focused on practice and technique. What’s more, it had paid too little attention to the mathematical and physical fundamentals underlying electrical technologies, in particular, Maxwell’s equations. This realization prompted renewed emphasis on theory.

One of the first EE textbooks to adopt a strongly theoretical approach did such a good job that it remains in print to this day. This intermediate-level textbook, Fields and Waves in Modern Radio (1944) by General Electric Co. engineers Simon Ramo and John R. Whinnery, was written not for university students but for engineers in GE’s advanced training program. It covers electromagnetic theory so thoroughly that the reader becomes adept at applying Maxwell’s equations to a wide range of problems.

The book became the best-selling textbook ever for John Wiley & Sons. Recent editions have added Theodore Van Duzer as a third author, and the title was changed to Fields and Waves in Communication Electronics. Whinnery became a professor at the University of California, Berkeley. Ramo became a missile magnate, founding with Dean E. Wooldridge in 1953 a company that was the driving force behind the early U.S. intercontinental ballistic missiles program. In 1958 Ramo-Wooldridge merged with Thompson Products and eventually became TRW Inc. (now part of Northrop Grumman Corp.).

The intent of Ernst A. Guillemin’s 1949 textbook is clear from its title: The Mathematics of Circuit Analysis: Extensions to the Mathematical Training of Electrical Engineers [see figure]. Although topics include matrices, quadratic forms, vector analysis, complex variables, and Fourier series, Guillemin took an engineer’s approach, emphasizing plausibility and general understanding of the techniques rather than mathematical rigor. He was an important contributor to linear network analysis and synthesis, and his Communication Networks (1931, 1935) is also a classic.

Japanese professors have so cherished Naohei Yamada’s Denki-jikigaku (Electromagnetic theory) that the book, first published in 1950, came out in its latest edition just months ago. The book is famous for its careful explanations of the basic equations of electromagnetic theory and for its many exercises dealing with actual devices and equipment. Makoto Katsurai, professor of electrical engineering at Tokyo University, who updated the latest edition, agreed to take on the job even though he was the author of a textbook on the very same subject.

In France, Yves Rocard, a professor at the École Normale Supérieure, took up where Yamada in Japan and Ramo and Whinnery in the United States left off. His Électricité, first published in 1951, provided a broad foundation for all types of electrical and electronic engineering (giving much attention to l’électron libre, the free electron) and sought to enhance students’ understanding of the science behind electrical engineering.

Rocard’s book is often compared to one by Karl Küpfmüller, Einführung in die theoretische Elektrotechnik (Introduction to theoretical electrotechnology). Both books analyzed the design and performance of actual devices, such as dynamos, galvanometers, radio receivers, and television cameras. [For a discussion of the Küpfmüller book, see "Treasured Texts," IEEE Spectrum, April 2003, pp. 44-49.] But Électricité went further, including discussions of magnetrons, waveguides, and other developments. For two or three decades, the book was the standard for French EE students, and it was translated into German. Despite having written such a successful book, Rocard is known today mainly as the father of the political leader Michel Rocard, who was prime minister of France from 1988 to 1991.

The art and science of circuit design

The invention of the transistor in 1947 and of the integrated circuit in 1959 kicked the postwar electronics boom into high gear. Both events set the stage for a vigorous expansion in applications. By the 1960s, scientists and engineers of all sorts were using electronics for instrumentation, control, and information processing. As the demand for new applications grew, designing electronic circuits became an important skill for large numbers of people.

One of the most valuable circuit design books had humble beginnings. In 1969 two students at the University of Stuttgart, Ulrich Tietze and Christoph Schenk, took a systematic approach to preparing for the E-Technik Prüfung (Germany’s exam for a master’s degree). They collected material on all aspects of circuit design and wrote detailed summaries. Fellow students clamored for copies of this comprehensive overview, but photocopying machines at that time were few. So a professor, who had connections with German publisher Springer-Verlag, urged its publication.

The result was astounding: enormous sales, 12 editions (so far) in German, many editions in six other languages, and one of the most successful technical books ever for Springer-Verlag. Halbleiter-Schaltungstechnik (the English version is titled Electronic Circuits: Design and Applications), expanded and updated over the years, teaches students not only to analyze circuits mathematically, but also to design circuits for particular applications. Offering many great how-to examples, it covers analog and digital circuits and emphasizes the use of commercially available ICs.

The Book I Remember

NORMAN R. AUGUSTINE (F)

I am an avid reader, so it’s tough to come up with a single favorite text. Forced to do so, however, I suspect it might be William L. Shirer’s The Rise and Fall of the Third Reich (1960). Somehow this book awakened an idealistic young man to the fact that, sadly, there are people in the world who wish to do others great ill—and that the answer to dealing with them is not appeasement.

* The author is chairman of the executive committee of the U.S. National Academy of Engineering (Washington, D.C.), former president and CEO of Lockheed Martin Corp. (Bethesda, Md.), and an adjunct professor at Princeton University (New Jersey).

PATRICK P. GELSINGER (M)

The book that literally changed my life is Structured Computer Organization (1976), by Andrew S. Tanenbaum. It was used in the first class I ever took in computer architecture. I was very excited about the class and bought the book at the end of the previous school year. By the start of class in the fall semester, I had finished taking notes on the book and had worked every problem. To my surprise, the professor was using the text for the first time and had worked his way only through Chapter 2. My notes and solved problems became the basis for the entire class. From that point forward, I knew I wanted to be in computer design and architecture.

* The author is chief technology officer of Intel Corp. (Santa Clara, Calif.).

RAY OZZIE (M)

In the development of Lotus Notes and now Groove, I have spent most of my career at the intersection of people, organizations, and technology. Computer Lib: Dream Machines (1987), by Ted Nelson, reinforced my belief that computers could be used for more than just data processing. It also influenced a philosophy that I have followed throughout my career: that it’s O.K. to think differently.

* The author is founder, chairman, and chief executive officer of Groove Networks Inc. (Beverly, Mass.).

Another electronics textbook that became an international best-seller—and how likely is that?—was The Art of Electronics (1980) by Paul Horowitz and Winfield Hill. Because both authors were themselves prolific designers of new electronic instruments, it is not surprising that the book is renowned for presenting the techniques that circuit designers actually use. Besides basic principles and general guidance, the text points out, there are “back-of-the-envelope techniques,” “nuts-and-bolts recommendations,” and “a large bag of tricks.” Since the authors meant to convey the art, rather than the theory, of electronics, the title was carefully chosen. Translated into a dozen languages, the book became a standard textbook in countries around the world. A second edition appeared in 1989.

In Latin America, one of the most important textbooks on circuit design is the two-volume Análisis de Modelos Circuitales (Analysis of circuit models) (1981, 1982) by Héctor O. Pueyo and Carlos Marco, professors at the Universidad Tecnológica Nacional in Buenos Aires. The book, which has been used in numerous Latin American countries, presents circuit theory and shows, through many examples and exercises, the application of the theory in practical devices. The two volumes have gone through several editions, and a third volume, dealing with computer simulation, is scheduled for publication this year.

Designing very large ICs

The 1960s and 1970s were a heady time for electronics, as the IC came into its own and lived up to the prophecy of Intel Corp. cofounder Gordon E. Moore, now famous as Moore’s Law. Just as Moore had predicted in a 1965 article in Electronics magazine, the number of transistors that engineers could put on an IC doubled periodically, roughly every year in those days.
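Moore’s law is simple compounding arithmetic, as the short C sketch below illustrates. The starting count of 64 components in 1965 and the 10-year span are illustrative assumptions, not figures taken from Moore’s article.

```c
#include <stdio.h>

int main(void)
{
    /* Back-of-the-envelope sketch of Moore's 1965 observation:
       component counts per chip doubling roughly once a year.
       The starting count (64 in 1965) is an assumed, illustrative
       figure, not data from Moore's article. */
    long components = 64;
    for (int year = 1965; year <= 1975; year++) {
        printf("%d: about %ld components per chip\n", year, components);
        components *= 2;    /* one doubling per year */
    }
    return 0;
}
```

A decade of annual doubling multiplies the count roughly a thousandfold, which is why the law so quickly came to dominate IC design planning.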

By the end of the 1970s, IC design engineers were dealing with many thousands of transistors, and it was clear that a systematic methodology for designing circuits was sorely needed. A breakthrough, both in EE education and in industry practice, was Carver Mead and Lynn Conway’s Introduction to VLSI Systems (1980), which led the way in showing students how to design complex ICs. (VLSI stands for very large scale integration.)

In 1975, Mead, at the California Institute of Technology (Pasadena), and Conway, at the Xerox Palo Alto Research Center (California), began a collaboration to find simplified methods of VLSI design. At the time, Conway was working on a special-purpose architecture for image processing, and she was bothered by “the gap between the sorts of systems we could visualize and what we could actually get into hardware in a timely way.” Mead had for some years worked on computer-aided design, that is, on automating, to some degree, the design of integrated circuits.

The two developed a methodology of VLSI design and tested it in courses at Caltech and at the Massachusetts Institute of Technology (Cambridge), where Conway was a visiting professor in 1978. Just a year after its publication, Introduction to VLSI Systems was in use in more than a hundred universities.

The new technology of computers

Probably the most momentous technical advance of World War II was the electronic digital computer, notably the Colossus in England for code breaking and the Eniac in the United States for calculating ballistics tables. Yet one of the first textbooks on computers came out of war-devastated Japan.

The Eniac, shrouded in secrecy during the war and immediately afterward, was publicly inaugurated on 15 February 1946. Newsweek magazine then carried an article about the machine, and a particularly interested reader was Kenzo Jo, a professor at Osaka Imperial University (later Osaka University). Jo, who had been doing research on calculating machines since 1929, immediately began developing an electronic computer.

He first built an Eniac-type decimal arithmetic unit and then worked on a stored-program binary computer. One of his assistants was Saburo Makinouchi, who became an associate professor at Osaka University in 1951, and in 1953 Jo and Makinouchi published Keisan Kikai (Computing machines). The book, which remained in use for many years, surveyed computing technologies—electromechanical calculators, punch-card systems, differential analyzers, and electronic digital computers—and presented, in considerable detail, binary computer circuits.

One of the most influential books on computers was Computer Structures: Readings and Examples [see figure] (1971) by C. Gordon Bell and Allen Newell, professors at Carnegie Mellon University (Pittsburgh). The authors surveyed computer structures—both in component configurations and in instruction sets—through actual computers rather than abstractions, and their book introduced a highly useful taxonomy—processor-memory-switch (PMS) notation for overall system structure and instruction-set-processor (ISP) notation for processor architectures.

Teaching can be a stimulus to invention, and it certainly was with this book. While working on it, Bell and Newell came up with two seminal ideas in computer design. One was the general-registers idea, that is, registers that could operate in various ways, as stack pointers, index registers, accumulators, program counters, and so on. Another was the Unibus protocol, which was introduced on the PDP-11, Digital Equipment Corp.’s hugely successful minicomputer. An updated version of the book, prepared by Daniel P. Siewiorek, Carnegie Mellon professor of electrical and computer engineering, appeared in 1982.

A dominant trend of late-20th century computing was the linking of large numbers of computers into networks. Probably the most influential textbook dealing with this trend has been Computer Networks (1981) by Andrew S. Tanenbaum, professor at Vrije Universiteit, in Amsterdam, the Netherlands. Written in English, the book has been translated into 17 languages and has sold more than half a million copies. Because of the continual changes in network hardware and software, subsequent editions in 1988 and 1996 were largely new books.

As Tanenbaum points out in the book’s third edition, the three editions correspond to different phases in the development of networks. In 1981, networks were an academic curiosity; in 1988, they were used by universities and large businesses; and in 1996, networks, and especially the Internet, were a part of daily life for hundreds of millions of people. Wireless networks, the latest phase, are part of the subject matter of a fourth edition of Computer Networks, to be published later this year.

Making computers useful

Programs for the first computers were necessarily written in machine instructions, but in the early 1950s, John Backus, Grace Murray Hopper, and others promoted “automatic programming,” the process of writing programs in a higher-level code that the computer would convert into machine instructions. Fortran became the standard for scientific computing, Cobol for business computing. The most successful teacher of the new programming languages was the self-employed author and consultant Daniel D. McCracken.

In 1961, McCracken’s A Guide to FORTRAN Programming appeared [see figure]. It and later editions, including A Guide to FORTRAN IV Programming, were used in technical schools and universities around the world. A statement in the introduction reminds us that in 1961 computers were few and far between: a computer is not necessary, “but if one is available it should be utilized at every opportunity.” The books are clearly written and amply provided with flow charts, sample programs, and exercises (with answers in the back, something I remember appreciating as a user of the book ages ago).

In the early 1970s, Kenneth Thompson and Dennis M. Ritchie at Bell Telephone Laboratories were working on what became the Unix operating system. They wanted an easy and clear language for implementing Unix, but one that would retain the power of assembly language. Thompson created the first version, called B; Ritchie, the second version, called C. (There was never an A; one explanation of the name B is that it comes from BCPL, the name of a language designed by Martin Richards that was Thompson’s point of departure. BCPL was itself an offshoot of CPL, the Combined Programming Language.)

Brian W. Kernighan, also at Bell Labs, collaborated with Ritchie in producing The C Programming Language (1978). The growing popularity of C and Unix and the success of the book reinforced each other. Indeed, the book served as the C programming language reference until a formal standard was adopted a decade later. A second edition appeared in 1988, and the book has been translated into 22 other (human) languages.
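The book’s teaching style, learning by writing and running small complete programs, is epitomized by its famous opening example, shown here in modern C idiom.

```c
#include <stdio.h>

/* The celebrated first program from The C Programming Language,
   rendered in modern idiom (the 1978 first edition wrote simply
   main() and omitted the return). */
int main(void)
{
    printf("hello, world\n");
    return 0;
}
```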

Typesetting books

It is fitting that books about computers introduced computer-based techniques to the publishing industry. Prior to the 1970s, a book was prepared as a typewritten manuscript, then set in metal type. After proofreading and correction, the type was used to print one copy of the book’s pages, which were photographed for the offset process. All this took a year or two in most cases and cost roughly US $20 a page.

Things changed in the 1970s, when Bell Labs developed Unix-based software to control typesetters, making it possible for authors to typeset their own books and produce camera-ready copy. This development cut production cycles to a couple of months and slashed costs. Kernighan and Ritchie’s The C Programming Language and Tanenbaum’s Computer Networks were early examples of texts produced in this way.

Even as Thompson and Ritchie were developing Unix, Niklaus Wirth, then a professor of computer science at the Eidgenössische Technische Hochschule (ETH, the Swiss Federal Institute of Technology) in Zurich, was creating Pascal and other computer languages. He was also an innovative teacher. He published Systematic Programming (German edition 1972, English edition 1973) and, three years later, the more advanced Algorithms + Data Structures = Programs. The latter contains chapters on sorting, recursive algorithms, dynamic data structures, and language structures and compilers. Written from a practical viewpoint, the book teaches stepwise development of well-structured programs and emphasizes performance analysis. It has appeared in a dozen languages and five editions.
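To give a flavor of the recursive, stepwise style the book teaches, here is a minimal quicksort of the kind treated in its sorting chapter. Wirth’s examples are in Pascal; this C rendering is an illustrative sketch, not code from the book.

```c
#include <stdio.h>

/* Minimal recursive quicksort, an illustrative C rendering of the
   kind of algorithm developed stepwise in Wirth's sorting chapter
   (his own examples are in Pascal). */
static void quicksort(int a[], int lo, int hi)
{
    if (lo >= hi)
        return;                      /* zero or one element: done */
    int pivot = a[(lo + hi) / 2];
    int i = lo, j = hi;
    while (i <= j) {                 /* partition around the pivot */
        while (a[i] < pivot) i++;
        while (a[j] > pivot) j--;
        if (i <= j) {
            int t = a[i]; a[i] = a[j]; a[j] = t;
            i++; j--;
        }
    }
    quicksort(a, lo, j);             /* recurse on each partition */
    quicksort(a, i, hi);
}

int main(void)
{
    int data[] = { 5, 2, 9, 1, 7 };
    quicksort(data, 0, 4);
    for (int k = 0; k < 5; k++)
        printf("%d ", data[k]);      /* prints: 1 2 5 7 9 */
    printf("\n");
    return 0;
}
```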

Final thoughts on great texts

The character of a profession depends to a large degree on the education its members receive, and that education also depends on the textbooks that convey the technical knowledge, problem-solving techniques, and vision of the profession. Textbooks not only hold a great deal of useful information, but also present an intellectual framework for subsequent studies and experience.

What makes a textbook great is the lucid explanation it provides, emphasizing timeless principles but relating them to useful techniques. Great textbooks challenge the student to turn understanding into a tool to solve practical problems and convey a comprehensive vision of a technical area, inspiring the student to master large parts of it and even to extend the field.

How the Books Were Chosen

Spectrum’s review of classic textbooks is not absolute or all-encompassing. It is rather a subjective sampling of great texts that have had many readers over a long period of time and have profoundly influenced those readers. Whether a text is a classic might be measured, at least roughly, by how many copies were printed and how long the book remained in print. Another metric is nothing less than the warmth of readers’ testimonials. Dozens of people have told me what they regard as classic textbooks, and their testimonials helped me to make my selections.

Also, I wanted the selection of textbooks to illustrate the development of electrical technologies from the 1880s to the 1980s. So the books date from 1884 to 1981 and cover a wide range of technologies. (In the April issue of IEEE Spectrum [pp. 44-49], I described texts written during the formative years of the electrical engineering profession—from 1884 to the mid-20th century.) Some more recent books, which, in coming years, may well become classics, are listed in “Up-and-Coming Classics?”

I wanted, too, to indicate the transnational character of the technology, so although there is an emphasis on books from the United States, almost half come from other countries. Finally, there was space for only a couple of dozen textbooks, so scores of unquestionable classics have not been named. I know that Spectrum readers have their own cherished texts, and I invite them to add their opinions to this discussion.
