IEEE Spectrum isn’t the only thing celebrating its 50th anniversary this year. On this day in 1964, the first software written in Basic was successfully run on a GE-225 mainframe at Dartmouth College. As critical a moment as that was to the history of computing, I want to skip ahead twenty years and talk about what Basic meant to a generation of neophyte coders in the 1980s, of which I was one.
In the 1980s, most kids didn’t have access to the Internet, integrated development environments, rich graphics, or even a choice of languages. What we had were 8-bit home computers, a blinking cursor, and Basic.
And it was wonderful. God, it really was.
Well, at least to those of us who persisted. Let’s not kid ourselves in a haze of nostalgia—there are very good reasons why things like Scratch and Processing were created, the same reasons why many, if not most, of those 8-bit machines wound up being used solely to play games. Tapping out Basic programs often meant a lot of effort with nothing to show for it other than that Great Sphinx of computer messages: “SYNTAX ERROR.”
But for those of us who did stick with it, Basic opened the door to something that had never happened before—a generation of children and teens programming general-purpose computers. We may not have been online (much) but we were the first digital generation, absorbing the Tao of pixels, bytes, and FOR… NEXT loops alongside other important lessons, such as the names of various national capitals and how to minimize zits.
I remember my first program, by which I mean one that I cobbled together myself, not simply typing in a complete listing from the manual. I was twelve, the year was 1985, and the computer a Texas Instruments TI-99/4A (a machine which was actually the first 16-bit home computer). My program was a very simple text adventure game, created by chaining together as many IF… THEN GOTO statements as I had patience for.
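That listing is decades gone, but the shape of such a program is easy to reconstruct. Here is a minimal sketch in the style of TI Basic, with the rooms, line numbers, and fatal outcomes invented for illustration:

```basic
100 PRINT "YOU ARE IN A DARK CAVE."
110 PRINT "GO LEFT OR RIGHT?"
120 INPUT A$
130 IF A$="LEFT" THEN 200
140 IF A$="RIGHT" THEN 300
150 GOTO 100
200 PRINT "A BOTTOMLESS PIT. THE END."
210 END
300 PRINT "DAYLIGHT! YOU ESCAPE."
310 END
```

Chain enough of those IF… THEN branches together and you have a branching story; any unrecognized answer simply falls through to the GOTO and asks again.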
Even by my own forgiving standards, the game was terrible, with instant death lurking beyond almost every wrong choice. What changed the world for me was a bit at the end, where I rewarded a winning player with a rising scale of musical notes, generated by a FOR… NEXT loop wrapped around a SOUND statement. Despite after-school classes, I had failed to master even the simplest of musical instruments due to my lack of interest in practicing or, before long, attending the classes. But now I could create “music” to order with three lines of code. I didn’t even have to program in individual notes—I could tell the computer to work out the notes for me. I was sold.
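On the TI-99/4A the statement in question was CALL SOUND, which takes a duration in milliseconds, a frequency in hertz, and a volume. The scale would have looked something like this (the exact frequencies and step size here are a reconstruction, for illustration):

```basic
10 FOR F=262 TO 523 STEP 29
20 CALL SOUND(200,F,2)
30 NEXT F
```

Three lines, and the machine works out every note from middle C up to the next C all by itself.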
A notable feature of this era was the extent to which programming was in the air. Everyone, from politicians and celebrities on the TV right down to your own parents, was declaring that microcomputers were the future. It was practically a moral imperative that families should furnish their children with a home computer for educational purposes. Computer labs (and computer clubs) appeared in schools for the first time.
To meet the demand fostered by millions of anxious parents, an enormous menagerie of companies sprang up around the world in the early 1980s, resulting in a sort of Cambrian explosion of home computers. The Apple II made the first big splash, of course, at least in America, but it was rapidly joined by such fondly remembered competitors as the TRS-80, the Commodore 64, the Sinclair ZX Spectrum, and the Dragon 32, not to mention some of the more obscure also-rans, such as the Sord M-5, the Oric-1, and the Coleco Adam.
They all came with Basic, usually on a built-in ROM chip.
And with this lingua franca to hand, television programs stepped viewers through simple programs. Shelves were stocked with magazines bulging with programming tips and tricks along with listings of Basic programs for readers to type in. There was even a line of young-adult adventure books that included type-in Basic programs as part of the narrative.
However, there was a flaw in this 1980s Basic paradise. Even though most home computers were built around one of two CPUs (either a Zilog Z80 or a MOS Technology 6502) and all ran Basic, the maddening fact was that each machine had its own particular dialect of the language. Amendments and alterations were made to the syntax to address machine-specific details related to memory, graphics, external storage, and the mental state of the developers as release dates loomed.
The result was that two computers often had different commands for, say, plotting a pixel on screen, or different ways of handling floating-point numbers. Magazine publishers trying to cater to the largest possible market would often include “conversion boxes” at the end of each printed Basic program, listing the alterations required to make it run on several of the more popular computers. If the manufacturer of your computer happened to be a loser in the brutally Darwinian environment of the mid-1980s computer market, you were left to figure it out on your own, something that was mostly deeply frustrating but occasionally deeply enlightening.
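To give a flavor of the problem, consider the single task of lighting one pixel. Each line below is a different machine’s dialect (coordinates and screen modes chosen arbitrarily for illustration):

```basic
10 HGR : HPLOT 100,100 : REM APPLESOFT BASIC (APPLE II)
10 PLOT 100,100 : REM SINCLAIR BASIC (ZX SPECTRUM)
10 MODE 1 : PLOT 69,100,100 : REM BBC BASIC (ACORN)
```

Commodore 64 owners had it hardest of all: their Basic 2.0 shipped with no graphics statements whatsoever, so plotting a point meant POKEing the video chip’s memory directly.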
The quality of these manufacturers’ computers, and their implementations of Basic, varied enormously. I was lucky enough to wind up doing the bulk of my teenage programming on a machine that, while not well known outside the United Kingdom and Ireland, was noted as being among the best on both counts: Acorn’s BBC Micro.
The BBC Micro was released in 1981 as part of a national computer-literacy campaign in Britain. Between the tight standards insisted upon by the British Broadcasting Corporation and the immense skills of the developers (many of whom would go on to create the ARM mobile processor architecture now running in 95 percent of the world’s smartphones), the result was a sturdy workhorse that could still be found usefully employed in odd corners well into the 1990s. The BBC Micro’s Basic interpreter offered a number of advanced features for structuring programs and accessing the computer’s hardware. Acorn’s exceptionally well-written user manual and programming guide became something of a Bible for my teen years, alongside the diaries of Adrian Mole.
But best of all, the BBC Micro’s Basic supported an inline assembler, allowing you to mix and match blocks of machine code with Basic statements. It was a tremendously easy way to get started with programming on the bare metal. I found this out when, returning to my inclination to reduce musical effort, I entered into an annual computer fair a pure Basic program that treated the keyboard as a piano. Press a key, and the program would emit the corresponding note and plot it on a blank onscreen musical staff; the user could then play back their composition while reading their score. It was a nice idea, but it suffered from the fatal flaw of being incredibly slow: a single keypress could take several seconds to register on the screen.
Despite a kind reception, I was somewhat embarrassed by its performance. So I dived into the user manual and returned the next year with all the critical loops converted to assembly. Blam. Real-time responsiveness (as long as you didn’t play too fast).
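For the curious, BBC Basic’s inline assembler lived between square brackets, with the resident variable P% pointing at the address where the machine code should be assembled. A minimal sketch (this fragment just prints the letter A by calling the machine’s OSWRCH routine at &FFEE; my actual keyboard-scanning loops were longer and, mercifully, lost):

```basic
10 DIM code% 100 : REM RESERVE ROOM FOR THE MACHINE CODE
20 FOR pass%=0 TO 2 STEP 2
30 P%=code% : REM P% = ADDRESS TO ASSEMBLE AT
40 [OPT pass%
50 LDA #65      \ 6502 assembly: load the ASCII code for "A"
60 JSR &FFEE    \ call OSWRCH to print the character
70 RTS          \ return to Basic
80 ]
90 NEXT pass%
100 CALL code%
```

The FOR loop runs the assembler twice, the standard BBC Basic idiom for resolving forward references; converting a sluggish Basic loop meant little more than wrapping its body in those brackets and rewriting it instruction by instruction.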
That was the beginning of my growth beyond Basic. A few months later I would get hooked up with a machine running the Forth language, and be faced with the icy cold shock of abandoning line numbers and GOTO commands altogether. Around the same time, many manufacturers were going out of business, as general-purpose home computers were replaced in the late 1980s by game consoles for those interested only in entertainment, or by PCs and Macs for those looking for more serious applications.
None of these replacements were easily programmable. And while cheap and easy ways to code would eventually become generally available again for children and teens, learning how to program would never regain the place in popular culture it held in the 1980s. The Golden Age of Basic was over.
But it will be fondly remembered by a generation of geeks. What are your 8-bit Basic war stories? Let us know in the comments below!