Jobless Innovation?

As the United States tries to ride high tech out of recession, does it risk innovating its workforce out of jobs?

In a crisis, Americans always ask technology to come to the rescue. Military threat? Send in the drones. Health-care scare? Declare war on cancer and race for an AIDS vaccine. Energy shortage? Harness the sun and wind.

So inevitably, when the United States experiences the highest levels of unemployment since the 1930s, and there are few signs of improvement, the call for more technology—for more R&D–driven innovation—is heard from all directions. No less an orator than President Barack Obama has asked the country to seize what he calls our "Sputnik moment"—and invest more heavily in new technologies, just as the United States did in the years following the Soviet Union’s dramatic 1957 launch of the first artificial satellite. Twelve years later, Americans landed on the moon.

"The first step in winning the future is encouraging American innovation," Obama told the nation in his State of the Union address in January.

He added: "None of us can predict with certainty what the next big industry will be, or where the new jobs will come from. Thirty years ago, we couldn't know that something called the Internet would lead to an economic revolution. What we can do—what America does better than anyone—is spark the creativity and imagination of our people."

Obama’s encouragement is valuable. Creating more permanent jobs from the already massive government spending on R&D—exceeding US $100 billion a year under even the most conservative estimates—is a worthy aim. Better pharmaceuticals, keener information-technology tools, more-compelling energy alternatives, and lower-cost, more-durable infrastructure technologies—all these could deliver important employment gains. Yet doubts abound that the stubborn Great American Job Bust—and the gloomy employment outlook even for highly educated new university graduates—can be cured by Obama’s insistence that Americans "out-innovate, out-educate, and out-build the rest of the world."

The simplest explanation for the employment crisis is "jobless innovation." Exciting new technologies are coming onto the market, but they don’t appear to require many people. The new generation of Internet stars, for instance, employs far fewer people than the older tech titans did. And Silicon Valley, which remains by far the most fertile innovation cluster in the world, has experienced sharp declines in employment over the past 10 years. The Bureau of Labor Statistics found in 2010 that high-tech employment in Silicon Valley (including biotechnology) fell by a staggering 19 percent over the course of the 2000s, and that Silicon Valley wages fell, too, by nearly 14 percent.

The decline in jobs and wages occurred, all the more shockingly, alongside the revival of Silicon Valley as a global innovation center. Not only did the area spawn major new companies, such as Twitter and Facebook, but it also saw a startling series of victories by Apple in the new arenas of the iPod, then the iPhone, and most recently the iPad. And yet despite achieving stunning dominance in smartphones and tablet computers, Apple today employs 10 workers in China for every worker in the United States.

The trend toward jobless innovation in Silicon Valley provoked Andy Grove, cofounder and former CEO of Intel, to grouse in a piece for Bloomberg BusinessWeek last year that "the U.S. has become wildly inefficient at creating American tech jobs." Grove, an icon of innovation, also complained about the United States’ "misplaced faith in the power of [technology] start-ups to create U.S. jobs."

The jobless-innovation phenomenon results partly from globalization. Grove points to two related problems. First, breakthroughs made by Americans in the United States are increasingly scaled up in Asia, robbing the country of the expected employment benefits. Second, more of the innovations themselves are arising offshore, Grove reports, so that "scaling and innovation take place overseas."

Scholars have labeled this phenomenon "the offshoring of innovation." They point to a marked rise in R&D activities in China and India especially, but also notably in Brazil, Indonesia, Russia, South Africa, and other rapidly growing economies. Offshoring creates a significant drag on U.S. employment, scholars say. A new book, The Global Auction: The Broken Promises of Education, Jobs, and Incomes, by Phillip Brown, Hugh Lauder, and David Ashton, published in December, argues that there now exists an ample pool of "cheap brainpower" outside the United States, largely in low-wage countries, with predictable effects on the U.S. labor market.

With so much R&D now occurring in India, China, and other parts of Asia (and also in Brazil, Russia, and South Africa), new graduates of science, math, and engineering programs in the United States face "career uncertainty," reports Ron Hira, a professor of public policy at Rochester Institute of Technology, in New York. Hira has found an "astounding" decline in undergraduate computer-science enrollments in response to those poorer job prospects, suggesting to him that a supply-side approach to employment—producing more graduates in computer science—may produce only more unemployed computer scientists.

Others have found similarly troubling statistics for engineering education. One recent study of engineering schools, cited by Andrew Hacker, a shrewd analyst of contemporary America, in a new essay entitled "Where Will We Find the Jobs?" reported that "from 37 percent to 66 percent of [engineering students in various universities] did not finish with a degree in the field." Hacker raises a heretical question: Will more educated people actually produce more innovations, or even improve their own employment prospects?

Applying more technology to cure the jobs crisis carries another paradox. Many emerging technologies destroy jobs, in a process that the Austrian economist Joseph Schumpeter famously labeled "creative destruction."

Examples of job-destroying innovations are legion. Voice-recognition software is improving so rapidly that some telephone customer services are now provided by computer networks. Meter reading in some cities now occurs automatically: a Wi-Fi antenna attached to a smart meter broadcasts electricity usage onto the Internet. And then there is an emerging manufacturing technology: three-dimensional printing, modeled on computer printing, which is already being used to build prototypes. The technique carries the potential to revolutionize manufacturing by reducing the need for both factory workers and costly materials, because it uses such materials—for instance, a powdered form of titanium—far more efficiently than conventional methods do.

That innovations destroy jobs as well as create them challenges a major assumption behind the United States’ secular faith in calling scientists and engineers to the rescue of society. Faith in the bounty of technological innovation is central to American optimism, a bipartisan doctrine. "We are the nation that put cars in driveways and computers in offices; the nation of Edison and the Wright brothers; of Google and Facebook," Obama declared in his State of the Union address. "In America, innovation doesn't just change our lives. It’s how we make a living."

But when innovations undermine the livelihoods of Americans, fears of new technologies, and especially automation, are understandable. Such fears have periodically traumatized Americans. In the 1930s, during the Great Depression, politicians and even engineers openly worried that the fruits of their labors might render their fellow human beings irrelevant. The Technocracy movement, led by self-styled prophets of automation and propelled by what historian Edwin T. Layton Jr. termed "the revolt of the engineers," thrived on worries that mass unemployment would become permanent and that American poverty could be combated only by handing power to a technoscientific elite. In the 1950s, the advent of mainframe computers in business spawned a new anxiety—that "machinery of the mind" would replace people who did perfunctory mental tasks. TV commercials extolling the power of mainframe computers, notably a series by IBM, stoked fresh fears in the early 1960s.

To be sure, automation scares passed, and technology-driven job growth in the 1980s and 1990s eroded the credibility of pessimists. But ever more powerful computer networks are once more igniting paranoia. Observing the humanlike reasoning abilities of IBM’s Watson, Fortune magazine asked in February, "Will IBM’s Watson computer put your job in jeopardy?"

The unsettling scenario that jobless innovation will coincide with a wave of job-destroying innovation stands as a stark rejoinder to techno-optimists who openly espouse a belief in technology’s power to deliver full (or greater) employment. The alternative, promoting innovation selectively rather than no matter what, is tricky. How can Americans capture more of the employment associated with job-expanding innovations? The answers aren’t yet clear. But what is clear ought to cause a serious reexamination of the traditional equation of technological innovation with healthy job markets. As Hacker concludes, "we are moving to a new employment era where old assumptions won’t apply."

One of those old assumptions could well be the belief that more innovation means more prosperity. In the new era, more innovation may mean less prosperity in the United States, even in an America that is home to the world’s most innovative companies. Startlingly, this scenario isn’t even the darkest one for the best-educated Americans, because at least in this future the United States remains the world’s technological leader.

About the Author

G. Pascal Zachary is a professor of practice at the Consortium for Science, Policy & Outcomes at Arizona State University. He is the author of Showstopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft (1994), on the making of Microsoft’s Windows NT operating system, and Endless Frontier: Vannevar Bush, Engineer of the American Century (1997), which received IEEE’s first literary award. Zachary reported on Silicon Valley for The Wall Street Journal in the 1990s; for The New York Times, he launched the Ping column on innovation in 2007. The Scientific Estate is made possible through the support of Arizona State University and IEEE Spectrum.
