John W. Backus, the computer pioneer who developed the Fortran programming language in the 1950s and helped launch the software revolution of the modern era, died Saturday, 17 March, in Ashland, Ore., at the age of 82, according to IBM Corp., where he spent his career. Backus received the IEEE W. Wallace McDowell Award for technical accomplishment in 1967. As reported by the Associated Press, he famously once said: "Much of my work has come from being lazy. I didn't like writing programs, and so, when I was working on the IBM 701 writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."
Photo: IBM Corp.
Backus was born in Philadelphia and grew up in nearby Wilmington, Del., where he was apparently an indifferent student, according to his biographical entry on Wikipedia. After a stint in the U.S. Army (during which he was treated for a brain tumor), Backus ended up in New York City, where he gravitated toward mathematics. After earning a master's degree in the discipline in 1949, he joined International Business Machines Corp. the following year to work on the firm's Selective Sequence Electronic Calculator (SSEC), one of the last large electromechanical computers ever built and one of the first to run a stored program. His first major project was writing a program to calculate positions of the Moon.
Weary of the difficulties of hand coding, Backus won permission to assemble a team of programmers to automate the tedious process. The result, after a few years of effort, was the IBM Mathematical Formula Translating System, known by the acronym FORTRAN. It let programmers express the elements of a program in "high-level" statements, close to ordinary mathematical notation, which a compiler then translated into machine code automatically. Although Fortran (as it came to be known) may not have been the first high-level programming language, it was the first to gain wide adoption in the computer science community, especially for numerical and scientific applications.
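To make the idea concrete, here is a minimal sketch in modern free-form Fortran (not the original 1957 punched-card dialect) of the kind of calculation described above: a trajectory formula written as ordinary algebra and left to the compiler to turn into machine instructions. The variable names and values are invented for illustration.

    ! A minimal sketch in modern free-form Fortran (not the original 1957
    ! dialect). One algebraic statement replaces the pages of hand-written
    ! machine code described in the article; all values are illustrative.
    program trajectory_sketch
      implicit none
      real :: v0, angle, g, time_of_flight, dist

      v0    = 300.0        ! initial speed, m/s
      angle = 0.7854       ! launch angle, radians (about 45 degrees)
      g     = 9.81         ! gravitational acceleration, m/s**2

      ! The "formula translation" idea: write the mathematics directly
      ! and let the compiler produce the machine code.
      time_of_flight = 2.0 * v0 * sin(angle) / g
      dist           = v0 * cos(angle) * time_of_flight

      print *, 'time of flight (s):', time_of_flight
      print *, 'range (m):         ', dist
    end program trajectory_sketch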
Fortran is a general-purpose, procedural, imperative programming language. Over the decades, Backus and his colleagues updated Fortran on numerous occasions, each revision extending the reach of the language as new developments in software technology warranted. Improvements included support for character data, array programming, modular and object-based programming, and, later, object-oriented and generic programming. The latest edition of the language, Fortran 2003, is a major revision that introduces many new features (for details, see the ISO Fortran Working Group).
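As a brief illustration of two of the later additions listed above, the sketch below shows whole-array operations and a simple module in Fortran 90-style syntax; the module and variable names are hypothetical and not drawn from any particular revision of the standard.

    ! An illustrative sketch of later Fortran features: whole-array
    ! operations and modules (Fortran 90 era). The module and variable
    ! names here are invented for illustration.
    module stats_mod
      implicit none
    contains
      pure function mean(x) result(m)
        real, intent(in) :: x(:)
        real :: m
        m = sum(x) / size(x)     ! array intrinsics, no explicit loop
      end function mean
    end module stats_mod

    program array_demo
      use stats_mod
      implicit none
      real :: temps(5) = [12.0, 15.5, 11.2, 14.8, 13.1]

      temps = temps + 273.15     ! array programming: one statement updates every element
      print *, 'mean temperature (K):', mean(temps)
    end program array_demo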
The legacy of Fortran is far-reaching. Backus's formalized methodology for interacting with complex computers has never disappeared; indeed, it is as robust a tool today as ever. Fortran remains the primary language for some of the most intensive supercomputing tasks imaginable, such as weather and climate modeling, computational chemistry, quantum chromodynamics, and simulations of solar system dynamics. Even today, half a century later, floating-point benchmarks used to gauge the performance of new computer processors are still written in Fortran.
Backus should also be long remembered for, among many other significant contributions to computer science, the Backus-Naur form (BNF), a metasyntax for expressing context-free grammars and a precise way to describe formal languages. BNF is widely used as a notation for the grammars of computer programming languages, instruction sets, and communication protocols, as well as for representing parts of natural language grammars. Most textbooks on programming language theory and semantics employ BNF.
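To give a flavor of the notation, here is a small textbook-style grammar written in BNF for simple arithmetic expressions; it is an illustrative example, not a grammar taken from Backus's own papers.

    <expr>   ::= <expr> "+" <term> | <term>
    <term>   ::= <term> "*" <factor> | <factor>
    <factor> ::= "(" <expr> ")" | <number>
    <number> ::= <digit> | <number> <digit>
    <digit>  ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"

Each rule defines the nonterminal on the left as one or more alternatives, separated by vertical bars, built from quoted terminal symbols and other nonterminals.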
Backus spent his entire career at IBM, working on various projects in the field of software architecture, and the company named him an IBM Fellow in 1963. In addition to the McDowell Award, he received the National Medal of Science in 1975, the Association for Computing Machinery's A.M. Turing Award in 1977, and the National Academy of Engineering's Charles Stark Draper Prize in 1993, among many other honors. He retired from IBM in 1991 but remained engaged with the fast-moving world of computer science in his later years.
News of his passing drew a remarkable response from the software development community at the discussion site Slashdot. Younger developers gently mocked the practicality of Fortran in contemporary settings compared with today's more sophisticated workhorse languages, such as C and C++. Still, plenty of self-described "graybeards" just as gently reminded the current generation of go-getters that there was a time when Backus's creation was every bit as cutting-edge as the marvels of the present.
One commenter in the online discussion expressed his thoughts on the creator of Fortran with these words: "John Backus was an outstandingly careful and insightful thinker, with a deep understanding of the difference between progress in a line of work and completion of that work. I don't care any more than I think he would have about an appearance of disrespect or lack of appreciation. But I encourage those who reacted superficially to the obituary to look more deeply into Backus's work, and use it as a model of effective thinking."