The Texas Instruments 99/4: World’s First 16-Bit Home Computer

How a desire for corporate synergy gave rise to the “embarrassing” TI 99/4


In the late 1970s, personal computers were starting to take off, with plenty of options from an array of companies. The Altair 8800 was an early entrant, popular among techie users and developers. Others, including Radio Shack's TRS-80 (referred to by its critics as the Trash 80), the Apple I and II, and the Commodore PET, sold to a more mainstream set of customers.

Industry watchers expected TI to have a big impact on the market—if it managed to introduce a competitive product. By 1977, TI engineers were hard at work on not just one computer design but three: 1) a low-end video game console intended to sell for under US $200, with games stored on ROM cartridges; 2) a personal computer aimed at hobbyists and users of Radio Shack and Apple computers; and 3) a high-end business computer with nearly 10 megabytes of hard disk storage.

As tends to happen in large companies, turf wars broke out. The boundary between the video game and home computer groups, both of which were competing for resources in TI's facility in Lubbock, Texas, began to grow fuzzy. Meanwhile, the business computer's intended market clearly overlapped with that of the minicomputer and portable terminal products, which came out of TI's Data Systems Division. Ultimately, the video game and personal computer efforts merged, and the business computer was moved to the Data Systems Division, where it was viewed with hostility and subsequently killed.

The surviving home computer group, whose product was now known as the 99/4, started receiving lots of corporate “help," including pressure to use TI's troubled 16-bit TMS9900 microprocessor. Peter van Cuylenburg, who was in TI's European marketing division, came up with an innovative solution to this corporate pressure. He commissioned a third-party contractor to design and build a personal computer, code-named “Mojo," using only chips made by TI. The contractor used TI's version of the Intel 8080A in addition to lots of TI TTL, memory, and other components.

While Mojo sort of solved the desire for corporate synergy between the company's semiconductor and consumer products groups, the TMS9900 was more genuinely a TI microprocessor. And so, after a series of discussions and compromises, the Mojo idea was discarded, and the 99/4 ended up using the TMS9900 along with the TMS9918 graphics chip. The 99/4 also inherited features from the original video game design, including an ultracheap keyboard, RF clips to attach to a TV set, and ROM-based applications.

When the 99/4 finally emerged in 1979, having somehow survived three sets of TI management, it received mostly disparaging reviews. Benjamin M. Rosen of Morgan Stanley, then the leading semiconductor analyst and fast becoming the leading personal computer analyst as well, wrote a humorous article about the disappointment of what had been expected to be a major contender in the personal computer business. A revised version, the TI-99/4A, was released in 1981 with a better keyboard, but it wasn't enough: In 1983, the New York Times called the 99/4 “an embarrassing failure."

Embarrassing Failure: Texas Instruments had high hopes for its TI 99/4, which debuted in 1979, and the follow-on 99/4A (shown here). But the machines performed poorly, and the company took a $330 million write-off before exiting the home computer market. Photo: Robert Clay/Alamy

Despite being the world's first 16-bit home computer, the 99/4, like the microprocessor on which it was based, was a dog. Intended to satisfy two very different markets, the 99/4 and 99/4A served neither one well. Unsurprisingly, initial sales were weak. By 1982 Commodore had initiated a price war with its VIC-20 computer, forcing TI to first offer $100 rebates and eventually slash the price from an initial $1,150 to an unsustainable $49. The price war was an incredible boon to consumers, because it made personal computers affordable to a much wider market. TI saw sales of the 99/4A explode, eventually reaching 2.8 million units.

But the 99/4 and 99/4A were expensive to build, and the company lost money on each home computer it sold. One reason they cost so much to build was the high cost of the TMS9900's unconventional packaging. Specifically, the large number of connecting pins for a 16-bit microprocessor forced TI to develop its own packaging technology. By the time others entered the 16-bit microprocessor market, standard packaging configurations had been developed to overcome this problem.

Related: The Inside Story of Texas Instruments' Biggest Blunder: The TMS9900 Microprocessor

The 9900 also had performance problems because of its unique “memory-to-memory" architecture, which kept its working registers in off-chip memory rather than on the chip, slowing every register access. The lower-cost TMS9995 that followed overcame some of these problems with an on-chip register-file cache, but there was no easy way to substitute the TMS9995 for the TMS9900 in the TI-99/4 architecture.
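The cost of that design choice can be sketched in a few lines. The model below is a simplification for illustration only, not TI's actual microarchitecture or timings: it treats the sixteen "registers" R0–R15 as a block of external RAM addressed through a workspace pointer (the TMS9900's WP register), so every register touch counts as a memory cycle, while a context switch is just one pointer update. The address `0x8300` and the cycle counting are assumptions for the sketch.

```python
class WorkspaceCPU:
    """Toy model of a memory-to-memory CPU: registers live in RAM at the
    workspace pointer (WP), as on the TMS9900, rather than on the chip."""

    def __init__(self, ram_size=65536):
        self.ram = [0] * ram_size
        self.wp = 0x8300        # workspace pointer: base address of R0..R15 in RAM
        self.mem_cycles = 0     # count of external-memory accesses

    def read_reg(self, n):
        self.mem_cycles += 1    # a register read goes out over the memory bus
        return self.ram[self.wp + 2 * n]

    def write_reg(self, n, value):
        self.mem_cycles += 1    # a register write goes out over the memory bus
        self.ram[self.wp + 2 * n] = value

    def add(self, dst, src):
        # A register-to-register ADD still costs three memory accesses here:
        # read dst, read src, write dst.
        self.write_reg(dst, self.read_reg(dst) + self.read_reg(src))

    def context_switch(self, new_wp):
        # The flip side of the design: switching contexts is a single pointer
        # update, with no need to save and restore sixteen registers.
        self.wp = new_wp


cpu = WorkspaceCPU()
cpu.write_reg(1, 40)
cpu.write_reg(2, 2)
cpu.add(1, 2)
print(cpu.read_reg(1))    # -> 42
print(cpu.mem_cycles)     # -> 6: every register touch went out to external RAM
```

On a conventional register-file CPU, those same register accesses would stay on chip; only the cheap context switch compensates for the extra bus traffic, which is why the on-chip register cache of the later TMS9995 helped.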

Meanwhile, most third-party software developers remained reluctant to invest in programs for the 99/4A. As demand for the 99/4A soared, TI initiated a crash program to ramp up production. But it couldn't compete with Commodore, which had a low-cost design and continued to lower the prices for its computers even further. Just as TI's production output took off, demand for the 99/4A began to fade. In 1983 the company took a massive write-off of $330 million, and it abandoned the home computer business the following year. Ironically, by then the TI machine had a devoted following among users who had purchased the 99/4A because they couldn't resist its low price. Remnants of that cult still exist today.

Interestingly, the TMS9918 graphics controller used in the 99/4 series found broad usage in the MSX standard for home computers, which also used a Zilog Z80 microprocessor and the CP/M operating system from Digital Research. The MSX architecture was the brainchild of Kazuhiko Nishi, who founded a publishing company called ASCII Corp., which later became ASCII Microsoft. The TMS9918 graphics chip, designed by TI's Karl Guttag, was central to the MSX standard and eventually was used in products from game manufacturers like Coleco and Sega, as well as Sony, Yamaha, and Toshiba. Had the IBM PC never come along, the MSX standard might have been much more broadly adopted.

The Conversation (2)
Joan Verdaguer Codina 02 Aug, 2021

A clarification: in Catalonia, Joan is a male name, translated as John in English.

Joan Verdaguer Codina 02 Aug, 2021

The point of view of these articles (this one and "The Inside Story of Texas Instruments' Biggest Blunder: The TMS9900 Microprocessor") is that of a company director: a corporate vision of what lies outside the end users. Yet it is the users who make a product a success or a failure. Some technological products succeed thanks to women, who for various reasons are often invisible in technological processes but are decisive in a product's success or failure; for example, mobile phones became an essential product when mothers saw that they allowed them to keep track of their children and husbands. Later I will explain their influence on the success of the IBM PC.

This comment is written from the perspective of an electrical engineering student of the 1970s and a personal computer user from the 1980s onward. In the 1980s the company NEC Europa created an award, and I participated with an essay entitled "Darwin's law in electronics?" Most of the ideas outlined below come from that essay. I don't show it because my English back then was less good than it is now; English is my fourth language.

The main idea of that essay was that many companies lost their market because they tried to change the environment when it was not the time to make non-evolutionary leaps. We must not forget that Peter Drucker warned companies that they had to adapt to a changing environment, nor that Darwin wrote that the animals that survived were the ones that adapted best to their environment. In that essay I coined the phrase "evolution and revolution." These two words are essential to understanding the evolution of any market that was new, constantly changing, and that modified the environment to which all company managers were accustomed. With the birth of the Intel 4004 microprocessor for civilian use, two markets were created: that of calculators and that of computers.
The microprocessor changed the environment, breaking the oligopoly of computer manufacturers (IBM, Digital, NCR, HP, and a few more) and creating new brands with new products. The breakdown was threefold: before, companies owned both hardware and software; from that time on, the manufacturer of the CPU was not necessarily the manufacturer of the computer, nor exclusively the manufacturer of the software.

The world of calculators was also a new market. Initially there were HP, TI, Commodore, and Sinclair, but in the end only HP and TI remained. It must be remembered that Hewlett, vice president of the company, decided against the opinion of those in marketing who said the programmable calculator had no future. Hewlett was an engineer, and as such had on his desk a slide rule, books of trigonometric tables, and books for converting imperial units into the international metric system, and he saw that all of it was condensed into the HP-35, the first generation of scientific pocket calculators. This was a revolution, because it digitized everything that had been done on paper.

In the second generation of calculators, the HP-25C competed with the TI-58. Here is the difference between "evolution and revolution." The TI-58 incorporated, as an option, a mathematical coprocessor. (Likewise, the 8087 was the coprocessor for the 8088/8086 CPU, an arrangement that lasted through the 80386, because the 80486 had it built in.) The TI-58 incorporated what already existed, simply packaged for another environment; it was merely an evolution. The HP-25C incorporated Continuous Memory, which preserved a stored program when you turned off the calculator. For students of my time this was very important, and it was a feature the TI-58 did not have despite offering many more program steps. Continuous Memory was a revolution: commonplace now, but in the 1970s a novelty. Hewlett-Packard then took an evolutionary step with the HP-41, an alphanumeric calculator.
Texas Instruments moved from the TI-59 to the TI-99/4 desktop computer; for a student in the 1970s this decision made no sense. Those of us who studied engineering in Catalonia at the time knew that we could not go to classes or exams with a personal computer. I remember that in Catalonia TI lost its computer market in a short time, and its place was taken by the Japanese companies. HP calculators have always been famous in engineering schools in Catalonia, while elsewhere this has not been so. I can't prove it, but I have the feeling that the way postfix notation (RPN) took root in Catalonia has to do with the Catalan language.

Computing in schools began in the 1980s, and Texas Instruments failed to present a product attractive enough to compete with IBM, Apple, or the BBC Micro. What went wrong inside TI? I think several causes led to it. For me, one of the reasons is that they lived isolated from their environment; the other, an excess of pride. In fact, it is difficult to understand that Texas Instruments did not manufacture the first microprocessor when it had manufactured the first integrated circuit (hereinafter IC) and also the first IC (the SN504 flip-flop) to go into space, aboard the Explorer 18 satellite, whose 60th anniversary is being celebrated around these dates. Isolation and pride made Texas Instruments forget the basic principle of selling: "the difference between children and adults is in the price of the toys." TI failed to present the TI-99/4 as a tempting product for teenagers and their parents.

In CPUs there was a similar evolution: initially there were many manufacturers, until the field was reduced to a few. In fact, the CPU market was highly conditioned by the evolution of the computer market.
The personal computer market evolved as follows: mid-1970s to mid-1980s, 8-bit computers; mid-1980s to late 1990s, 16-bit; the first decade of this century was to be 32-bit, which lasted until the middle of the last decade; and the last decade was to be all 64-bit, which is still being worked on. The move to 64-bit was made possible by a breakthrough in TI's laboratories. The forecast in the early 1990s was that by 2010 an integration density of 0.18 micron would be reached, which would mean integrating 90 million transistors, and that by 2020 a density of 0.1 micron would be reached, opening the door to nanotechnology at the beginning of the current decade. As I mentioned in the previous paragraph, TI's laboratories broke this evolutionary prediction: in 1996 TI managed to fabricate 0.18-micron transistors, and thanks to that leap, three years later the Lawrence Livermore Laboratory was able to fabricate 0.1-micron transistors, opening the doors to nanotechnology at the end of the last century. Later I explain the social implications of these advances.

Returning to microprocessors: Intel made a natural evolution from 4-bit to 8-bit. Although the 8086 could work as a 16-bit processor, it never really did so until the arrival of the 80286 CPU, popularized by the IBM AT. The madness of jumping to 32 bits without having worked through 16 bits affected all manufacturers without exception. In my essay, a section called "The rise and fall of Zilog" described how Zilog lost the market by releasing a 32-bit CPU; Intel also fell into the trap when it tried to manufacture the i432, although it learned a great deal from that mistake; Motorola went from the 6800 to the 68000 with a completely new architecture that cost it a titanic financial effort and led it to ask IBM for help. Other companies abandoned CPU manufacturing altogether.
The 68000 CPU was a revolution because it was designed to work with Unix, and the only computer manufacturer that understood this was Thomson, but at the time Unix was prohibitively expensive. Not having a continuous environment was the cause of the disappearance of many CPU manufacturers and, consequently, of computer manufacturers. Sinclair was the company that worked hardest to popularize computing, but in moving from the Z80 to the 68000 it lost the market; the same happened to Cromemco and Osborne, among other companies.

Apple, maker of one of the first personal computers, had a good financial cushion that allowed it to survive. It achieved the objective of changing the user's working environment by presenting its product in a completely different way, for a very specific market, while preserving the old oligopoly model of controlling both hardware and software. In fact Apple had already made two revolutions, with its name and with its power supply, and an evolution in presenting an integrated product. In my essay I explain that the name is important: Americans are eager for apple pie, and part of Apple's success is due to its name. Other companies tried to copy it with names like Cherry and Melon, but they did not succeed.

The success of the IBM PC was due to the secretaries. As my wife says, secretaries have often prevented a third world war. The IBM PC was actually designed for senior business management, and the sales forecast for the first year was 250,000 units; IBM sold 15 million. Secretaries are used to dealing with men with huge egos, mostly psychopaths, loaded with testosterone; for them it was easy: "The boss of a colleague who works at company B has bought her an IBM computer. She prints copies and corrections instantly, while I have to use this typewriter, make copies with carbon paper, and run photocopies." Put those words in the boss's ear with the right tone, and there you have the success of the IBM PC.
Microsoft was able to apply Moore's Law to its business, offering hardware manufacturers the premise that computers should be replaced every three years. The key play was that each new OS required more RAM and a bigger hard disk or else it gave problems: Windows 3.1 could in theory run with 1 MB but in reality needed 4 MB; Windows 95 needed 16 MB; Windows 98 required 64 MB; Windows XP could nominally work with 1 GB but really required 4 GB; and so on up to Windows 10, which in theory can work with 4 GB but requires 8 GB. Now, with Windows 11 being a Linux-based OS, it cannot play the same game, so it attacks the CPU instead: Windows 11 will run only from the eighth generation of Intel CPUs onward. This "artificial evolution," together with the alliance with the largest CPU manufacturer (Intel), is the key to its success.

AMD has been the great driver of Intel, because as a second manufacturer it always offered improved CPUs. With the 80486, Intel was compared to Superman, and it did not want to give AMD a chance with the Pentium. AMD's engineers then observed that kryptonite is Superman's weak point and designed a competing CPU called the K5 (the letter K is not a whim), and so on until AMD made the qualitative leap with the Athlon 64. The Superman-and-kryptonite reasoning is not an invention of mine; it really happened.

Peter Drucker explained that society was changing and that computers were helping in those changes, and the prediction came true: microprocessors caused the birth and death of many companies. Digital disappeared, and with it its new Alpha CPU; Mentor Graphics had a good market in printed-circuit design and lost part of it to cheaper programs running on the PC, such as Tango. Others born when microprocessors were 8- and 16-bit, such as Apollo and Sun, disappeared when the 32- and 64-bit market entered mass consumption. Broadly speaking, this has been the evolution of calculators and computers from the point of view of an end user.
Finally, I wrote earlier that I would talk about the social implications. When TI reached 0.18 micron, it compressed the natural technological evolution by 14 years, and when the Lawrence Livermore Laboratory reached 0.1 micron three years later, it compressed it by 20 years. The social implications are huge: what was expected to start around 2040 started 30 or 40 years earlier. Nanotechnology was due to start in 2020 and 64-bit CPUs from 2010; it should be remembered that AMD introduced the Athlon 64, with 113 million transistors, for mass consumption early this century. These advances in computing led to a chain reaction of advances in other fields, along with disasters such as the economic crisis, because the mentality of many leaders and of most of the population was not prepared for these changes. Technological evolution must be accompanied by mental maturity. To give an example, Dr. Struth, at the November 2015 IT Summit in Berlin, explained the problems his company had with Industry 4.0, and six years later they have yet to be resolved. Read the report (Acatech STUDY: Industry 4.0 Maturity Index, Update 2020) and note the bottleneck that the third phase represents. Dr. Joan Verdaguer-Codina, IEEE member #02545127
