Write and Wrong

IEEE Spectrum has brilliantly predicted major tech breakthroughs and occasionally mistaken dross for gold. Here's a look at some hits and misses from our first 40 years

1967 -- Semiconductors

The Cryogenic Chip

At a time when a typical computer had no more than a few kilobytes of random-access memory, a technology based on superconductor thin films promised a thousandfold boost in capacity. There was just one catch: these memories had to operate at temperatures of -269.6 °C (3.5 K), raising the distinct possibility that any computer outfitted with them would also have to be the world's coldest freezer. That fact doomed "cryoelectrics," as the authors of a June 1967 article called the technology.

"Cryogenic random-access memory is a very real contender for large-capacity memory applications and deserves the attention of computer development engineers."

Despite their unabashed view that "cryogenic random-access memory is a very real contender for large-capacity memory applications and deserves the attention of computer development engineers," these cold memories are now on ice.

1967 -- Transportation

Electric Auto Boogaloo

Around the 1890s, the few people who were in the market for an automobile had their choice of not only gasoline-powered vehicles but also fully electric ones. But by the early 20th century, such advances as self-starting engines pushed the gasoline car--faster, lighter, cheaper, and with greater range--far ahead and left the competition inhaling its smoky exhaust.

Since then, the electric car has had more comebacks than Donald Trump, and the pages of IEEE Spectrum are rife with them. As early as April 1967, we wrote about the air pollution problems of gasoline cars and welcomed the prospect of their demise. "Now it is time to redress the imbalance caused by this Frankenstein's invention," wrote staff writer Nilo Lindgren, "and bring back the electric car." We optimistically concluded: "The question now is not 'Will they be developed?' but 'How soon?' "

We were right--sort of. Lately, after the briefest of resurgences in the late 1990s, the electric vehicle has faded again. Superseding it, for now, anyway, are hybrid electrics, powered by both batteries and gasoline. We described several of them in a March 2004 article. Meanwhile, many analysts expect yet another comeback for pure electric cars, but they predict that these vehicles will be based on fuel cells rather than batteries.

1969 -- Nuclear Energy

When Good Breeding Isn't Enough

At the dawn of the nuclear age, engineers and physicists thought conventional nuclear reactors would be a brief-lived stepping stone to a more sophisticated kind of fission reactor, called a breeder reactor.

Like their conventional counterparts, breeders contain a core of fissionable material--generally uranium-235 or plutonium--in which energy is generated. But the core is surrounded by a blanket of uranium-238, which captures neutrons from the fission reaction at the core to produce plutonium-239. The idea is for the reactor to produce more new fuel, in the form of plutonium, than it burns--perhaps 20 or 25 times as much.

No wonder, then, that three Westinghouse Corp. engineers asserted in our March 1969 issue: "Breeder reactors are expected to play an increasingly important part in the overall energy generation picture in the United States, particularly after the mid-1980s."

It didn't happen. Breeder reactors, most of them experimental, were built in France, India, Japan, Russia, and the United States. Many of these reactors proved costly and failure-prone. Today, only one large breeder reactor, in Beloyarsk, Russia, generates electricity regularly. That 600-megawatt unit may be replaced by a new 800-MW design by 2009. And in India, engineers broke ground this past August for a 500-MW breeder reactor in Kalpakkam, 50 kilometers from Chennai.

1969 -- The Internet

There At The Creation

Here's an idea: use the telecommunications system to connect widely scattered computer networks so that individual computers can exchange data. It seems so obvious today, now that almost every network is connected to the Internet. But in 1969, Spectrum readers would have been among a very select group of people to know that the first such "internetwork" was about to go online.

In August, we reported: "Computers of different makes and using different machine languages will be linked together into one time-sharing system. The University of California, Los Angeles, will become the first station in the nationwide computer network. The system will, in effect, pool the computer power, programs, and specialized know-how of about 15 computer research centers stretching from U.C.L.A. to M.I.T. The first stage of the network will go into operation this fall as a subnet joining U.C.L.A.; Stanford Research Institute; University of California, Santa Barbara; and the University of Utah.... Each computer in the network will be equipped with its own interface message processor, which will double as a sort of translator among the Babel of computer languages and as a message handler and router."

That first internetwork, which became the Arpanet, and eventually the Internet, was born a month later, when engineers from Bolt Beranek and Newman Inc., of Cambridge, Mass., installed the first interface message processor at the University of California, Los Angeles. The first host-to-host message followed that October, sent from UCLA to the Stanford Research Institute.

1971 -- Semiconductors

O'er The Ramparts We Watched

Semiconductor memories are now a cornerstone of the global electronics industry. But 33 years ago, these chips, which then had thousands of transistors that each stored a bit, were a novelty. Semiconductor RAM was a fledgling technology that had to carve out market share against much more established ones, like ferrite cores. Would it succeed? Intel cofounder Andrew S. Grove and two of his colleagues told our readers it would, in a June 1971 article.

They wrote: "Random-access read-write memories typify the revolution in computer memories--storage elements that promise to replace ferrite cores and plated wires in the seventies." So they did, and not just for computers but for automobiles, DVD players, PDAs, cellphones, and just about everywhere else microprocessors call home.

1978 -- Biotechnology

The Perfect Tomato

It all began in the early 1970s when a group of engineers at General Electric Co., in Fairfield, Conn., having heard of some NASA research on plant growth in outer space, thought they could use specially controlled environments to grow better fruits and vegetables. The idea sprouted, and in 1973 Geniponics was born.

A high-tech version of hydroponics--growing plants in liquid nutrient solutions rather than soil--Geniponics was "GE's answer to the search for reasonably priced and perfect vegetables," we wrote in January 1978. In air-conditioned, pest-free chambers with sensors that precisely regulated light intensity, temperature, humidity, carbon dioxide levels, and flow of nutrients, GE began to grow tomatoes, lettuce, cucumbers, eggplant, onions, even fruits and medicinal plants.

By 1977, we reported, the company was harvesting annual crops of more than 200 kilograms of tomatoes per square meter, four times the yield of standard hydroponics and 30 times that of the average farmer--with a bonus of 30 percent more vitamin C. "The profits and social benefits of these engineered plants and vegetables seem enormous," we wrote.

But it turned out that growing the enhanced vegetables and fruits consumed so much electricity the system was uneconomical. In 1980, GE sold the operation to Control Data Corp. of Minneapolis, Minn., which shut down the Geniponics facility a few years later.

1979 -- Space Weapons

May The Force Not Be With Us

A constellation of satellites, zapping enemy nuclear missiles with powerful laser beams: that was the goal--or at least the image in the public's mind--of U.S. President Ronald Reagan's Strategic Defense Initiative, commonly known as Star Wars. As early as June 1979, Spectrum reported on the controversial topic.

Critics argued that the lasers could be easily reflected by protective coatings and the sensors fooled by cheap decoys. Researchers began to work on other kinds of weapons, such as guns that would shoot streams of electrons or protons, a scheme we described as "highly questionable." In a September 1981 article, we wrote: "Despite continuing research, the feasibility of building particle-beam weapons and the wisdom of using them remain highly uncertain."

With the Star Wars project gaining momentum in the mid-1980s, Spectrum ran a 31-page cover story in September 1985. We presented both sides of the issue, zeroing in on the technical difficulties that ultimately rendered the project, as it was initially--and grandly--conceived, unfeasible. In a section titled "Mind-boggling Complexity," a back-of-the-envelope calculation showed that in a nuclear confrontation with the Soviet Union, even with three defense layers each shooting down 90 percent of 14 000 warheads, 14 would get through, killing perhaps tens of millions of people.
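The article's back-of-the-envelope figure is easy to verify. A minimal sketch of the layered-defense arithmetic, using only the numbers quoted in the 1985 story (the independence of the three layers is the article's simplifying assumption, not a claim about any real system):

```python
# Spectrum's September 1985 back-of-the-envelope: three defense layers,
# each intercepting 90 percent of 14,000 incoming warheads. Layers are
# assumed independent, per the article's own simplification.
warheads = 14_000
intercept_rate = 0.90          # per layer
layers = 3

# Fraction of warheads that leak through all three layers.
leak_fraction = (1 - intercept_rate) ** layers
survivors = warheads * leak_fraction
print(round(survivors))  # 14
```

Each layer multiplies the leakage by 0.1, so three layers thin the attack by a factor of 1,000: from 14,000 warheads to 14.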

Some considered that efficacy acceptable; others found it utterly inadequate. In a follow-up, in January 1987, we wrote that a special committee, created to analyze the complexity of the software challenge, found that "writing such a single enormous program correctly would be impossible."

Spanning The Decades -- Transportation

Maglev--Just A Few Years Down The Track

"Too near to go by plane and too far for a train?" asked an April 1973 article. "The answer could be magnetic levitation and propulsion." The article, "Flying Low with Maglev," went on to explain the economic and technological advantages of using magnetic forces for suspending and propelling trains at high speeds--a new transportation system that, we said, was "on the horizon."

Of course, as you keep approaching the horizon, it keeps receding from you. That proved to be the case with maglev trains and Spectrum. An August 1984 article speculated about a maglev running at 15 000 kilometers per hour in an evacuated tunnel that would traverse the entire United States in half an hour. In August 2002 we wrote about one in China running at 430 km/h. And in January 2004, a two-page photo featured a Japanese research maglev that hit 581 km/h, breaking the world speed record for a train.

Maglevs have an ability to engender fascination, especially in magazine editors, far beyond their commercial prospects. The truth is that for a variety of reasons--high costs, complexity, political decisions--magnetically levitated trains for urban and intercity transportation have never taken off--in more ways than one. A few operate in Japan, China, and Europe, but not as many as proponents hoped, especially because of competition with standard high-speed rail. In any case, as other maglevs begin to levitate, you can be sure of finding them in our pages.

1979 -- Consumer Electronics

The Little Disc That Could

In the early 1970s, it seemed clear that the vinyl music album would be superseded--by magnetic disks and tapes. Clear to everyone, that is, except a group of engineers working at Philips NV, in the Netherlands, which had seized on the idea of recording information on optical discs. Though they were trying to develop a medium for storing movies, they ended up creating the compact disc, or CD, which would transform the way we listen to music.

In a February 1979 article, the Philips engineers described how they stored bits in the form of 1-micrometer pits in a tellurium-coated disc. Later that year, Philips and Sony Corp., Tokyo, announced their plans to market the technology, and in December we wrote: "In the early 1980s, manufacturers will begin marketing systems that produce high-fidelity sound from digitally encoded audio disks." Digital encoding technology, though still a novelty, already seemed to us poised to supplant analog systems.

In 1983, we cited the high prices for CD players and discs--US $600 and $15, respectively, which in today's prices would be $1130 and $28--but nevertheless concluded that the "future of CD audio is promising." We even foresaw the possible "integration of CDs and computers." Music CD sales would soar in the coming decades; revenues in 2003 reached $28 billion. And in the 1990s, CDs did indeed replace floppy disks for data storage.

Spanning The Decades -- Power & Energy

Will There Still Be Blackouts In The Year 2000?

"There is a good chance that by the year 2000 the term blackout (societal definition) will be considered to be a term out of the Dark Ages." So opined an article we published in our July 1978 issue.

New York City on 14 August 2003

With memories of the massive August 2003 northeast United States blackout still fresh, along with lesser ones in the western United States, it is hard to believe we were so hopeful. But 26 years ago, such developments as deregulation and the Enron scandal couldn't be foreseen.

The article envisioned a world of customer-based electrical generation and storage. During peak demands, it said, many customers would be able to rely on their own power, easing demands on the grid. There would be times "when power is not available from portions of the transmission/distribution grid." But in a functional, or societal, sense, "enough of these backup sources will work so that major societal interruptions and disturbances will not occur."

Sixteen years later, in an August 1994 story, we called the 1978 article "prescient." We added that its "vision is fast turning into a reality, both in the United States and in many other parts of the world." Deregulation came quickly enough, much as a companion article, "Charting a New Course in California," depicted. But the grid's immunity to blackouts hasn't materialized.

1981 -- Consumer Electronics

Two Million Pixels, But Still Nothing To Watch

"High-definition television, with a much sharper picture than is currently available, is expected to be in retail stores before the decade is out."

Too bad the decade we were talking about was the 1980s, not the 1990s. In a July 1981 article, we badly underestimated the industry's ability to create an imbroglio of technologies and standards that would take years to disentangle.

In a 1989 reexamination, we pointed to 1993 as "the earliest possible date for the start of HDTV broadcasting in the United States" and 1995 as "a more probable date." That was reasonably close; the first satellite transmission of an all-digital, commercial HDTV broadcast was sent from Waco to Irving, Texas, on 14 December 1996. The transition to HDTV is finally under way. It may be in most homes before the decade is out--but don't take that as a prediction.

1983 -- Space

From Race To Base

"The key to cost-effective space operations by 2008 is a permanently manned space station in near-orbit." So said U.S. senator and former astronaut John H. Glenn Jr. in part of an 80-page Spectrum special report, back in September 1983.

Sure enough, our permanent footing in space did indeed materialize in the form of the International Space Station, in orbit in more or less its current form since 2000. In that 1983 report, Spectrum even published an illustration of the station's main modules that bears a striking resemblance to the form they would eventually take--although we never dreamed that Russian modules would one day be part of the station.

Glenn's keenest insight was a reference to "limited budgets and resources." As it turned out, astronauts on board the US $113 billion station, now in a somewhat precarious state, currently spend more time on repairs than on science. And with NASA's shuttle program frozen, the situation has gotten even worse.

But the International Space Station can still be pivotal to space science, as we argued in October 2003. For that to happen, the station would have to become part of a larger program of exploratory space missions, rather than just a base for a few small-scale experiments of dubious value. Making it happen would take more help--and money--from participant countries, and more consistent political support. We'll get back to you 10 years from now, at Spectrum's 50th anniversary, on that one.

1984 -- Software

The Soul Of An Old Machine

In the heady, early days of computers, even sober scientists believed that machines would become "intelligent" and eventually start to think like us. That was the promise of artificial intelligence, or AI, in the 1950s. And in the 1960s. And in the 1970s.

Spectrum, over those decades, believed the gospel of AI evangelists. Among the many articles we ran on the imminence of machine intelligence was one 20 years ago. It prophesied in June 1984 that expert systems--programs that mimic human experts' ability to make decisions--would replace air-traffic controllers by the year 2000, and doctors and scientists within as few as 50 years.

The debate on whether machines were really intelligent or just seemed intelligent was a favorite topic in conference halls and journal pages. Spectrum was no exception. "Intelligent systems will begin to make their way into the world, but few people will consider them to be really intelligent after all," wrote Robert Kahn in 1983. Kahn was one of the founders of the Internet and a former research director at the Defense Advanced Research Projects Agency, in Arlington, Va.

But the best assessment of AI turned out to be that of a June 1979 letter by reader Joseph Bates from Cornell University, in Ithaca, N.Y., who wrote: "I believe we are on the road to building artificial intelligence, but considering that we still have trouble developing correct 10-line programs, it is likely to be a long journey." Bates was right, and the journey continues for AI researchers. As for machines usurping humans, so far computers have made inroads on only a couple of fronts: telephone switchboards and grandmaster chessboards.

1985 -- Robotics

"Robots In The Home: Promises, Promises"

That was the title of a May 1985 story on home robots, accurately depicting them as "personal computers on wheels." The article debunked the idea that the home robots of the day would any time soon be able to bring in the paper, do the dishes, or serve canapés.

"Like early personal computers, present-day personal robots are limited in capacity, require extensive knowledge on the programmer's part to make them do anything much more sophisticated than play songs, and are expensive." The article concluded that practical, useful robots would be a decade or more in development. Today's most advanced home robots are Sony's AIBO, an electro-canine that can do some tricks, and Burlington, Mass.-based iRobot Corp.'s Roomba, which attempts valiantly to vacuum but won't touch the dishes in the sink.

1987 -- Transportation

Smart Cars

In October 1987, Spectrum tried to envision the car of 1997. The author, an engineer with Bendix Electronics, foresaw computer-controlled active suspensions that adapt to road and driving conditions, head-up displays that project vehicle speed and other key driver information on the windshield, radar-based collision-avoidance systems, and laser sensors that warn drivers of approaching obstacles.

Today, at last, many of those systems are available in cars, albeit high-end ones. Cadillac's Escalade, for example, has a suspension system that constantly adapts to road conditions to improve comfort and control, and its STS model projects a four-color display in the windshield that shows speed, shifter selection, warnings, and radio settings. If the past is a guide, these features will eventually trickle down to ordinary sedans.

1989 -- The Internet

But What Is It Good For?

Sometimes all you have to do is unlock the barn door--the horse will amble out, and the cart will follow. When it came to the horse that would turn into the Internet, Bob Lucky wasn't worried about where it would go--he just wanted to be sure he was along for the ride.

In September 1989, two years before any commercial activity on the Internet and four years before the graphical Web, the plucky Lucky, then a Bell Labs research director and still Spectrum's in-house sage, wrote: "A bill pending before the United States Congress, sponsored by Senator Albert Gore Jr. (D-Tenn.), would authorize the construction of a nationwide gigabit network to connect educational and research institutions. The issue that keeps being raised is: what would a user do with a gigabit data link?"

Lucky's answer was simple. "We are not very good at predicting uses until the actual service becomes available. I am not worried; we will think of something when it happens."

1994 -- Telecommunications

Broadband-a-go-go

In a 1994 article that oddly confused gigabits and megabits, Carl Malamud nevertheless came up with a stellar prediction: "How does $30 a month for a T1 line grab you?" Malamud was founder of the visionary, though now-forgotten, Internet Multicasting Service, a packet-data equivalent of radio and television broadcasting.

A decade later, we've pretty much hit Malamud's target. Broadband rates throughout North America, Asia, and Europe are US $20 to $45 per month. And since a T1's 1.5 megabits per second lies in the middle of the range of current DSL and cable services (0.5-3.0 Mb/s), we can call that one a winner.

1995 -- The Internet

The World Wide What?

In January 1995, the World Wide Web was the 127th largest network in the world, ranked by the amount of data traffic passing through it. Two months later, it was the 13th largest. By April, it displaced the file-transfer protocol to become the biggest application on the Internet.

The four-year-old WWW took off like a rocket because of some easy-to-use graphical browser software put together by the folks who went off and founded Netscape Communications Corp. Netscape would go on to become head cheerleader and poster child for one of the greatest economic bubbles the world has ever seen.

We didn't run any articles on Netscape, even though Spectrum staffers were using a beta version of its browser in mid-1994.

By contrast, we couldn't stop talking about ISDN--integrated services digital network--with two feature articles on it in 1995. With speeds in the 56- to 144-kilobit-per-second range and costs three times higher than ordinary dial-up data services, ISDN had an understandably short moment in the spotlight.

To our credit, we did publish the results of a roundtable discussion in September 1995, in which participants talked about the Internet as we have come to know and love it.

1995 -- International

China Unchecked

China's emergence as a major power hasn't always been as obvious as it is today. In 1993, for example, China's rate of inflation was 17 percent, its trade imbalance was US $12 billion, and its budget was $15 billion in the red.

A December 1995 article, "Chip-Making in China," did a good job of looking past that. Here are three predictions that have aged well:

"China's semiconductor industry is riding high on a consumer electronics boom. Almost limitless growth seems assured."

"The number of phone lines will more than triple from the present 30 million to 100 million by the end of the decade. Roads, power plants, industrial complexes, and housing are being built as fast as possible."

"It will take China's cities 10 years and the rest of the country 20-30 to reach a standard of living on a par with that of the United States and Japan. But the importance of China to the electronics and semiconductor industries, both as a supplier and as a consumer, will be evident much sooner."

1996 -- Telecommunications

Fee, Fi, Fo, Wi-Fi

In January 1996, modems were buzzing along at 33.6 kilobits per second, Qualcomm Inc. was putting the finishing touches on code-division multiple access (CDMA), and getting a personal communications system (PCS) was the hot new way to go for wireless data transmission. Spectrum was up on all that, and in addition alerted readers to the decade's most obscurely important development in wireless communications.

We wrote: "Wireless local-area networks...have languished since emerging in 1990, but several developments should contribute to a turnaround. First, the IEEE P802.11 Wireless LAN project is expected to go to the IEEE standards board for approval early this year....Next, prices will drop....[T]he 'build cost' for wireless LAN cards should drop to $75 in 1997 from about $300 in 1994."

1996 -- Telecommunications

Couch Potatoes Of The World, Unite!

The television industry has been touting interactive TV at least since the failure of the wildly expensive Columbus, Ohio, Qube system of the 1970s. By the 1990s, the industry (again) claimed to have it figured out. We didn't agree. In a pair of articles, we said that television would remain essentially a one-way channel, while interactivity would flourish on a neighboring home screen, the Web.

In making that assertion, we refused to board hype trains lavishly funded by Time Warner, Pacific Telesis, GTE, NTT, Rogers Cable, and others. We quoted experts like David Fallows of Continental Cablevision, who said: "We can't see how interactive television works in the next five years. The set-top box has too many open issues, and it costs too much. On the other hand, some 40 percent of our customers have gone out and bought $1500 set-top boxes--they just think they're home computers. By putting a little investment into cable modems, we can turn that home computer into a content-output device."

Photo credits (top to bottom): Alamy (manipulation: Michael R. Vella); Corbis; Transrapid USA; Alamy; Frank Franklin II/AP; NASA; Getty Images; David Schleinkofer; Qilai Shen/Landov
