A Quadrillion Mainframes on Your Lap

Your laptop is way more powerful than you might realize


Introduced in the early 1960s, the IBM 7090 was the first line of transistorized computers. It was based on the 709 line, which used vacuum tubes.

Gamma-Keystone/Getty Images

Whenever I hear someone rhapsodize about how much more computer power we have now compared with what was available in the 1960s during the Apollo era, I cringe. Those comparisons usually grossly underestimate the difference.

By 1961, a few universities around the world had bought IBM 7090 mainframes. The 7090 was the first line of all-transistor computers, and it cost US $20 million in today's money, or about 6,000 times as much as a top-of-the-line laptop today. Its early buyers typically deployed the computers as a shared resource for an entire campus. Very few users were fortunate enough to get as much as an hour of computer time per week.

The 7090 had a clock cycle of 2.18 microseconds, so the operating frequency was just under 500 kilohertz. But in those days, instructions were not pipelined, so most took more than one cycle to execute. Some integer arithmetic took up to 14 cycles, and a floating-point operation could hog up to 15. So the 7090 is generally estimated to have executed about 100,000 instructions per second. Most modern computer cores can operate at a sustained rate of 3 billion instructions per second, with much faster peak speeds. That is 30,000 times as fast, so a modern chip with four or eight cores is easily 100,000 times as fast.
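The arithmetic behind these estimates is easy to verify. Here is a quick back-of-envelope sketch in Python, using only the figures quoted above:

```python
# Speed comparison: IBM 7090 vs. a modern laptop core, using the
# article's figures.

CYCLE_TIME_S = 2.18e-6           # 7090 clock cycle: 2.18 microseconds
clock_hz = 1 / CYCLE_TIME_S      # ~459 kHz, "just under 500 kilohertz"

IBM_7090_IPS = 100_000           # ~100,000 instructions per second
MODERN_CORE_IPS = 3_000_000_000  # ~3 billion sustained per modern core

per_core_ratio = MODERN_CORE_IPS / IBM_7090_IPS  # 30,000x per core
chip_ratio = per_core_ratio * 4                  # a 4-core chip: >100,000x

print(f"7090 clock: {clock_hz:,.0f} Hz")
print(f"Per-core speedup: {per_core_ratio:,.0f}x")
print(f"4-core chip speedup: {chip_ratio:,.0f}x")
```

The per-core figure alone already exceeds the gap between a bicycle and a jet airliner by several orders of magnitude.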

Unlike the lucky person in 1961 who got an hour of computer time, you can run your laptop all the time, racking up more than 1,900 years of 7090 computer time every week. (Far be it from me to ask how many of those hours are spent on Minecraft.)

Continuing with this comparison, consider the number of instructions needed to train the popular natural-language AI model GPT-3. Executing them on cloud servers took the equivalent of 355 years of laptop time, which translates to more than 35 million years on the 7090. You’d need a lot of coffee as you waited for that job to finish.
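Both of these conversions follow directly from the 100,000-fold overall speed ratio. A minimal check, using only the figures given in the article:

```python
# Converting laptop time into 7090 time at the article's ~100,000x
# overall speed ratio.

SPEEDUP = 100_000

# One week of laptop time = 100,000 weeks of 7090 time.
years_per_week = 1 * SPEEDUP * 7 / 365.25   # ~1,916 years

# GPT-3 training: ~355 laptop-years of compute (the article's figure).
gpt3_7090_years = 355 * SPEEDUP             # ~35.5 million 7090-years

print(f"One laptop-week on the 7090: {years_per_week:,.0f} years")
print(f"GPT-3 training on the 7090: {gpt3_7090_years:,} years")
```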


But, really, this comparison is unfair to today’s computers. Your laptop probably has 16 gigabytes of main memory. The 7090 maxed out at 144 kilobytes. To run the same program would require an awful lot of shuffling of data into and out of the 7090—and it would have to be done using magnetic tapes. The best tape drives in those days had maximum data-transfer rates of 60 KB per second. Although 12 tape units could be attached to a single 7090 computer, that rate needed to be shared among them. But such sharing would require that a group of human operators swap tapes on the drives; to read (or write) 16 GB of data this way would take three days. So data transfer, too, was slower by a factor of about 100,000 compared with today’s rate.
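The tape-transfer arithmetic checks out as well. In the sketch below, the 7090-era tape rate is from the article; the modern comparison rate of roughly 6 GB/s is my own assumption for a current NVMe SSD, chosen to illustrate the ~100,000x gap:

```python
# Reading 16 GB of data through 7090-era tape drives.

TAPE_RATE_BPS = 60_000     # 60 KB/s, the era's best drives
DATA_BYTES = 16e9          # 16 GB, a typical modern laptop's memory

seconds = DATA_BYTES / TAPE_RATE_BPS
days = seconds / 86_400    # ~3.1 days of continuous tape-swapping

# Assumed modern NVMe SSD rate (not from the article): ~6 GB/s.
MODERN_RATE_BPS = 6e9
transfer_ratio = MODERN_RATE_BPS / TAPE_RATE_BPS  # ~100,000x

print(f"16 GB over tape: {days:.1f} days")
print(f"Transfer-rate gap: {transfer_ratio:,.0f}x")
```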

So now the 7090 looks to have run at about a quadrillionth (10⁻¹⁵) the speed of your 2021 laptop. A week of computing time on a modern laptop would take longer than the age of the universe on the 7090.
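That last claim sounds extravagant, but it survives a sanity check. Taking the universe to be about 13.8 billion years old:

```python
# Sanity check: one laptop-week at a quadrillion-to-one ratio,
# expressed in 7090 time.

RATIO = 1e15                      # laptop ~ 10^15 times the 7090
week_s = 7 * 86_400               # seconds in one week
SECONDS_PER_YEAR = 365.25 * 86_400

years_on_7090 = week_s * RATIO / SECONDS_PER_YEAR  # ~1.9e13 years

AGE_OF_UNIVERSE_YEARS = 1.38e10   # ~13.8 billion years

print(f"One laptop-week on the 7090: {years_on_7090:.2e} years")
print(f"Age of the universe:         {AGE_OF_UNIVERSE_YEARS:.2e} years")
```

The result is roughly 19 trillion years, more than a thousand times the age of the universe.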

But wait, there’s more! Each core in your laptop has built-in SIMD (single instruction, multiple data) extensions that turbocharge floating-point arithmetic, used for vector operations. Not even a whiff of those on the 7090. And then there’s the GPU, originally used for graphics speedup, but now used for the bulk of AI learning such as in training GPT-3. And the latest iPhone chip, the A15 Bionic, has not one, but five GPUs, as well as a bonus neural engine that runs 15 trillion arithmetic operations per second on top of all the other comparisons we have made.

The difference in just 60 years is mind-boggling. But I wonder, are we using all that computation effectively to make as much difference as our forebears did after the leap from pencil and paper to the 7090?

This article appears in the January 2022 print issue as “So Much Moore.”

