Slideshow - More Core Memories

Memories Are Made of This


There are coffee-table books about birds, surrealism, mountain climbing, true crime magazines, record album covers, and comic books, and now there's one drawn from the collection of the Computer History Museum in Mountain View, Calif. Our slide show presents nine images from Core Memory: A Visual Survey of Vintage Computers, recently published by Chronicle Books, San Francisco.

https://www.corememoryproject.com/main.php


Photo: Mark Richards

 In a 2001 article in the IEEE Annals of the History of Computing, Robert V. Head called UNIVAC I “an authentic vacuum-tubed monster.” He wrote, “The Univac operator’s console had more switches and lights than an airliner cockpit and was the antithesis of ‘user friendly.’ No programmer could operate it without extensive training and a good deal of manual dexterity.”
Shown here is the 1951 computer’s main memory, a mercury delay line, which stored data as sound pulses traveling through a tube of mercury; each pulse was detected at the far end and sent back through again. Delay-line memory has long since been superseded, first by magnetic-core and then by semiconductor memory, but at the time it was the more advanced technology.
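The trick, and the limitation, of a delay line is that storage is sequential: a bit is available only when it emerges from the mercury, so the machine must wait for data to come around. Here is a minimal Python sketch of the idea (our own illustration, not UNIVAC code):

```python
from collections import deque

class DelayLineMemory:
    """Toy model of a delay-line store: bits circulate in a fixed loop,
    the way UNIVAC I's acoustic pulses were detected at the end of the
    mercury tube and immediately re-transmitted into it."""

    def __init__(self, length_bits):
        self.line = deque([0] * length_bits)

    def tick(self):
        """Advance one pulse time: the emerging bit is read and recirculated."""
        bit = self.line.popleft()
        self.line.append(bit)
        return bit

    def write(self, bit):
        """Replace the bit currently emerging from the line."""
        self.line.popleft()
        self.line.append(bit)

# Writing a word means feeding bits in one pulse at a time; reading it
# back means waiting for those bits to circulate around to the detector.
memory = DelayLineMemory(length_bits=8)
for b in [1, 0, 1, 1, 0, 0, 1, 0]:
    memory.write(b)
word = [memory.tick() for _ in range(8)]
print(word)  # [1, 0, 1, 1, 0, 0, 1, 0]
```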


Photo: Mark Richards

For many years after the Semi-Automatic Ground Environment, or SAGE, system was delivered to the U.S. Air Force in 1954, it was the most highly developed computer network in the world. SAGE, the heart of the first U.S. continental air defense system, was built to collect and analyze radar data in real time and transmit its results to fighter planes.
SAGE linked 27 installations. Each occupied an area half the size of a football field and contained two complete systems for redundancy. The machines were built by IBM, but the 7000 programmers who wrote their 250 000 lines of code were employed by the Rand Corp.


Photo: Mark Richards

Some computer projects flop for lack of funding or because newer technologies supersede them. The ILLIAC IV may be the only supercomputer to founder because of antiwar protests. It was built in the early 1970s by Burroughs at the University of Illinois, with funding from the U.S. Defense Advanced Research Projects Agency (DARPA), to investigate whether many processing elements could execute the same instruction on different data in parallel.
But its association with the “military-industrial complex” caught the attention of activists protesting the Vietnam War. As a result, ILLIAC IV development was moved to a secure NASA research facility in Mountain View, Calif. In the end, only 64 of its planned 256 processing elements were built, and the project was shut down in 1982.
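ILLIAC IV's scheme, a single instruction stream driving an array of processing elements on different data, is what we now call SIMD. Here is a minimal Python sketch of the idea, with NumPy's elementwise arithmetic standing in for the lockstep processing elements (the saxpy functions are our own illustration, not ILLIAC IV software):

```python
import numpy as np

# Scalar style: one instruction stream touches one datum at a time.
def saxpy_scalar(a, x, y):
    out = []
    for xi, yi in zip(x, y):
        out.append(a * xi + yi)
    return out

# SIMD style: a single operation is applied across all elements at once,
# the way each ILLIAC IV instruction drove every processing element
# on its own slice of the data.
def saxpy_simd(a, x, y):
    return a * np.asarray(x) + np.asarray(y)

x, y = [1.0, 2.0, 3.0, 4.0], [10.0, 20.0, 30.0, 40.0]
assert saxpy_simd(2.0, x, y).tolist() == saxpy_scalar(2.0, x, y)
```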


Photo: Mark Richards

When it came to the first moon landings, there was plenty of technology backing the bravery of those pioneering astronauts. The brains of the Apollo spacecraft were a primitive embedded computer built from 5000 chips based on the then-new technology of integrated circuits. It had a total of 4 kilobytes of RAM and 24 kilobytes of ROM. By 1972, Pioneer 10 would launch with an Intel 4004 inside, a single chip containing 2300 transistors.
MIT and Raytheon built the Apollo Guidance Computer for $250 000 in 1965. Each mission used two, one for the command module and one for the lunar module.


Photo: Mark Richards

Computing came into its own with the 1964 introduction of the IBM System/360 family of what we would come to call “mainframes.” The U.S. air traffic control system, NASA’s mission control centers, and the world’s largest corporations, universities, and government agencies all used them, and tens of thousands of programmers first learned APL, Fortran, and Cobol on a time-shared 360.
Its whirring tape drives, such as those shown here, became the visual shorthand for computing in countless movies and magazine pages, and they were probably the inspiration for the glowing red “eye” of HAL 9000 in the movie 2001: A Space Odyssey.


Photo: Mark Richards

The architect of the IBM 360 was computer pioneer Gene Amdahl, who got his start designing computers in 1955 with his physics Ph.D. project, the Wisconsin Integrally Synchronized Computer, or WISC. Operating on a student’s budget, he made no investment in fit and finish, yet the machine is beautiful in its own way. Very few individuals can claim to have built a mainframe single-handedly.
After IBM, Amdahl would go on to form Silicon Valley’s first mainframe computer company, which he named after himself. He eventually started two other companies and would win the ACM/IEEE Eckert-Mauchly Award in 1987 and the IEEE Computer Entrepreneur Award in 1989.


Photo: Mark Richards

The computer industry started not in Silicon Valley, but in Philadelphia, where the ENIAC, EDVAC, BINAC, and UNIVAC were built by pioneers J. Presper Eckert and John William Mauchly. In their shadow toiled the Philco Corp. (originally the Philadelphia Storage Battery Co.), a radio manufacturer that became an early maker of transistors. Its first computer, the SOLO, was bought by the U.S. National Security Agency.
Its best-known computer was the Philco 212, built in 1962, shortly after the company was purchased by the Ford Motor Co. The 212 was one of the first computers to prefetch instructions, and it used multiple registers to store operands. Its 2-microsecond memory was considered speedy in its day.


Photo: Mark Richards

By the late 1960s, computers were getting small enough that it was possible to imagine one in the home. Neiman Marcus commissioned Honeywell to create a kitchen computer for its famous catalog.
The “Kitchen Computer”—officially, the H316 Pedestal Model—had an integrated cutting board and chair and shipped preloaded with cooking recipes. Despite being featured on the cover of the store’s 1969 catalog, and despite the fully functional minicomputer under its sculpted red-and-white hood, not a single one was sold at its price of $10 600. (By comparison, Datsun’s sexy 240Z sports car, also new that year, cost $3500.)


Photo: Mark Richards

Of all the technologies to come out of Xerox’s legendary Palo Alto Research Center (PARC)—the graphical user interface, the mouse, laser printers, Ethernet, and PostScript, to name a few—the most revolutionary might have been its 1973 SuperPaint system: an otherwise ordinary minicomputer equipped with hundreds of thousands of dollars’ worth of memory and software genius Dick Shoup’s SuperPaint program. Its key technology was the world’s first frame buffer, which allowed Shoup to import video and edit it.
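What made the frame buffer revolutionary is easy to show: the whole screen image lives in ordinary, addressable memory, so software can read and overwrite any pixel at will. Here is a toy Python sketch of the idea (the dimensions and brush routine are invented for illustration, not taken from SuperPaint):

```python
import numpy as np

# A frame buffer is display memory: every on-screen pixel is a value
# (here 8 bits deep, as SuperPaint's pixels were) that programs can
# read back and overwrite, which is what makes software "painting" possible.
WIDTH, HEIGHT = 640, 480  # illustrative dimensions
framebuffer = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

def paint(x, y, value, size=3):
    """Write a square 'brush' of pixels directly into display memory."""
    framebuffer[y:y + size, x:x + size] = value

paint(100, 50, value=255)    # a white brush stroke
print(framebuffer[50][100])  # 255: the edit is immediately "on screen"
```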
Though no SuperPaint computers were ever sold, there is a clear line of development from them to George Lucas’s Industrial Light and Magic (ILM) and to Pixar. Along the way, SuperPaint’s descendants were used on Return of the Jedi, Star Trek II: The Wrath of Khan, and Luxo Jr., which was nominated for the Academy Award for Best Animated Short Film in 1986. One such descendant was RenderMan, the program behind the special effects in The Abyss and Terminator 2: Judgment Day.
 
