The computer pioneer who developed the Fortran programming language in the 1950s, launching the software revolution of the modern era, passed away on 17 March at the age of 82, according to numerous media sources. The recipient of the IEEE W. Wallace McDowell Award for technical accomplishment in 1967, John W. Backus died Saturday in Ashland, Ore., according to IBM Corp., where he spent his career. As reported by the Associated Press, Backus famously once said: "Much of my work has come from being lazy. I didn't like writing programs, and so, when I was working on the IBM 701 writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."
Backus was born in Philadelphia and grew up in nearby Wilmington, Del., where, according to his Wikipedia biography, he was an indifferent student. After a stint in the U.S. Army (during which he was treated for a brain tumor), Backus ended up in New York City, where he gravitated toward mathematics. After earning a master's degree in the discipline in 1949, he joined International Business Machines the following year to work on the firm's Selective Sequence Electronic Calculator. The SSEC was one of the last large electromechanical computers ever built and one of the first to run a stored program. His first major project was writing the code to calculate the positions of the moon.
Weary of the difficulties of hand coding, Backus won permission to assemble a team of programmers to automate the tedious process. The result, after a few years of effort, was the IBM Mathematical Formula Translating System, nicknamed FORTRAN. It let programmers express a program's logic in "high-level" statements resembling mathematical notation, which the system itself translated into machine code. Although Fortran (as it came to be known) may not have been the first high-level programming language, it was the first to gain wide adoption in the computing community, especially for numerical and scientific applications.
Fortran is a general-purpose, procedural, imperative programming language. Backus's colleagues and successors updated it numerous times over the decades, each iteration extending the language's reach as software technology warranted. Improvements included support for character data, array programming, module-based and object-based programming, and object-oriented and generic programming. The latest edition of the language, Fortran 2003, is a major revision that introduces many new features (for details, visit the ISO Fortran Working Group).
The legacy of Fortran is far reaching. Backus's formalized methodology for interacting with complex computers has never disappeared. Indeed, it is as robust a tool today as ever. It is the primary language for some of the most intensive supercomputing tasks imaginable, such as weather and climate modeling, computational chemistry, quantum chromodynamics, and simulations of solar system dynamics. Astonishingly, even today, half a century later, floating-point benchmarks to gauge the performance of new computer processors are still written in Fortran.
Backus should also be long remembered for, among many other significant contributions to computer science, the Backus-Naur form (BNF), a metasyntax for expressing context-free grammars and thus a precise way to describe formal languages. BNF is widely used to specify the grammars of programming languages, instruction sets, and communication protocols, as well as to represent parts of natural-language grammars. Most textbooks on programming-language theory and semantics employ it.
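To illustrate the notation (with a standard textbook-style example, not one drawn from Backus's own papers), a BNF grammar for signed integers might read:

```
<signed-integer> ::= <sign> <integer> | <integer>
<integer>        ::= <digit> | <digit> <integer>
<sign>           ::= "+" | "-"
<digit>          ::= "0" | "1" | "2" | "3" | "4"
                   | "5" | "6" | "7" | "8" | "9"
```

Each rule defines the nonterminal on the left in terms of terminals (quoted symbols) and other nonterminals on the right, with the vertical bar separating alternatives; the recursive second rule is what allows integers of any length.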
Backus spent his entire career at IBM, working on various projects in software architecture. In 1963, the company named him an IBM Fellow. In addition to the prestigious McDowell Award, he received the National Medal of Science (administered by the National Science Foundation) in 1975, the Association for Computing Machinery's A.M. Turing Award in 1977, and the National Academy of Engineering's Charles Stark Draper Prize in 1993, among many other honors. He retired from IBM in 1991 but remained engaged with the dynamic world of computer science in his later years.
News of his passing drew a remarkable response from the software development community at the user-oriented site Slashdot. Younger developers gently mocked the practicality of Fortran in contemporary settings compared with today's far more sophisticated workhorses for writing programs, such as C/C++. Still, there were plenty of self-described "graybeards" who just as gently reminded the current generation of go-getters that there was once a time when Backus's creation was just as much cutting-edge science as the marvels of the present.
One commenter to the online discussion expressed his thoughts on the creator of Fortran with these words: "John Backus was an outstandingly careful and insightful thinker, with a deep understanding of the difference between progress in a line of work and completion of that work. I don't care any more than I think he would have about an appearance of disrespect or lack of appreciation. But I encourage those who reacted superficially to the obituary to look more deeply into Backus's work, and use it as a model of effective thinking."