U.S. Defense Dollars For Computer Science Plunge
Will industry make up the difference?
David Patterson had a great idea. Two years ago, the eminent computer scientist, a professor at the University of California, Berkeley, was looking at recent advances in statistical machine learning, an area within artificial intelligence involving how computers can learn automatically. It occurred to him that the technology had enormous potential to make distributed computer systems, military as well as commercial, more stable and robust. So he contacted the logical source to fund such an idea: the Defense Advanced Research Projects Agency, or DARPA, the U.S. Department of Defense organization known for backing long-range, blue-sky research.
To his surprise, he was refused. "They didn't even respond for many months, and then it was just a perfunctory rejection," recalls Patterson, an IEEE Fellow and president of the Association for Computing Machinery, in New York City. He next tried the National Science Foundation, in Arlington, Va., but was again turned down. Talking to his Berkeley colleagues, who'd also had grant proposals declined, Patterson says, "We came to the conclusion that the style of high-risk, high-impact research we've been doing, involving 3 to 6 faculty and maybe 20 to 30 graduate students, was going to be a problem."
So Patterson went looking elsewhere for money, and in December, he announced the creation of the Reliable, Adaptive, and Distributed Systems Laboratory--or RAD Lab--jointly funded by Google, Microsoft, and Sun Microsystems, each of which will give the lab US $500 000 annually for five years [see photo, " "]. The IT industry has sponsored many university-based projects, but this may be the first to be formed out of frustration with the current funding climate in Washington. "In this era of increasing competitive pressures, people tend to get conservative, and descriptions like 'ambitious proposal' tend to be a negative," Patterson notes. "We had to find another model."
Much of the frustration among computer scientists has been aimed at DARPA. Not only has the level of support plummeted, critics say, but the money is going toward near-term projects with a strictly military focus. By the agency's own accounting, it awarded universities $207 million in 2002 for computer science research, but only $123 million in 2004. (These figures don't include grants under which a university served as a subcontractor or did classified work; factoring in those sources, university funding in computer science dropped from $215 million to $161 million.) Meanwhile, DARPA's overall budget has been steadily rising. In fiscal year 2003, the agency received $2.7 billion; last year it got just under $3 billion, and the 2006 request is $3.1 billion.
The other main government backer of university computer science research in the United States is the National Science Foundation. Funding there actually doubled between 1999 and 2005, to nearly $500 million, according to Peter Freeman, assistant director for NSF's Computer and Information Science and Engineering Directorate. But computer science is growing even faster, he says. In years past, the directorate supported 30 to 35 percent of the proposals it received; by 2004 the funding rate had been halved, to 16 percent, and in 2005 it was 21 percent.
NSF grants tend to be small--typically $150 000 or less, which is enough to support one professor over the summer, plus a couple of graduate students. That's far too little to sustain large, multiyear undertakings like Patterson's or like a project developed by computer scientists Doug Burger and Steve Keckler at the University of Texas at Austin. In 2000 the two associate professors began sketching out a new scalable architecture for high-performance microprocessors. Such devices could eventually allow a single chip to perform trillions of calculations per second and be useful in signal processors, servers, desktop computers, and embedded systems.
Burger and Keckler got some seed money from the NSF to start working out their idea, dubbed TRIPS (for Tera-op, Reliable, Intelligently adaptive Processing System). But to flesh out the concept, they needed several million dollars to build the chip sets, develop software, and test and evaluate the system. NSF "simply did not have that kind of money," Freeman says. DARPA did. In 2001, the agency began funding the project, initially to develop the proof of concept and later to build a prototype. This past November the agency awarded a $4.3 million contract, which is now supporting 30 researchers.
"It's been a struggle to keep it funded," Keckler says. The DARPA grants have to be renewed every two years, which for university researchers "is a very short horizon." While DARPA continues to invest heavily in computer science, he notes, it's clearly favoring industry labs over academia. "When universities come in as subcontractors rather than principal investigators, the research they end up doing is more confined and less creative than it might be," he says.
As long-term, federally funded projects become increasingly rare, say Patterson and others, some important problems aren't being addressed. For example, the chip industry's move to multicore processors, like Sun's Niagara chip [see "Sun's Big Splash," IEEE Spectrum, January 2005], has caught software developers flat-footed. "We really don't know how to write software in this new model," Patterson notes. "It's absolutely critical for the future of IT in the United States and around the world that we figure it out."
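To see why multicore programming worries Patterson, consider a toy sketch (my illustration, not the RAD Lab's work): on a multicore chip, several threads updating one shared counter can silently lose updates, because a read-modify-write is not atomic unless the programmer serializes it.

```python
# Toy illustration of a shared-memory hazard on multicore systems:
# several threads increment one counter. Without a lock, the
# read-modify-write sequence can interleave and lose updates.
import threading

COUNT = 100_000

def unsafe_worker(state):
    for _ in range(COUNT):
        state["n"] = state["n"] + 1   # read, add, write: not atomic

def safe_worker(state, lock):
    for _ in range(COUNT):
        with lock:                    # serialize the read-modify-write
            state["n"] = state["n"] + 1

def run(worker, *extra):
    state = {"n": 0}
    threads = [threading.Thread(target=worker, args=(state, *extra))
               for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return state["n"]

lock = threading.Lock()
print(run(safe_worker, lock))  # always 400000
# run(unsafe_worker) may print less than 400000: updates get lost.
```

Bugs like the unlocked version are timing-dependent and hard to reproduce, which is one reason writing correct software "in this new model" is an open research problem rather than routine engineering.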
In hearings held by the House Science Committee last May, William Wulf, president of the National Academy of Engineering, Washington, D.C., sounded an alarm. "At a time of growing global competition, DARPA's disinvestment in university-based, long-term research is, in my view, a risky game for the country," he said. Several other witnesses, including F. Thomas Leighton, a professor of applied mathematics at the Massachusetts Institute of Technology, in Cambridge, whose company, Akamai Technologies Inc. (also in Cambridge), grew out of DARPA-sponsored research, accused the agency of abandoning a half-century tradition of basic research that spawned, most famously, the Internet. Among the areas hurt by DARPA cutbacks is cybersecurity, they said.
For his part, DARPA director Anthony Tether has flatly denied any shift toward the near-term. Tether, who declined to be interviewed for this article, told the committee that as more research begins to cross disciplines, computer science is being funded as part of such multidisciplinary efforts rather than as a stand-alone field. He referred to dozens of current DARPA programs that show the agency "is, indeed, funding radical ideas that involve long-range research."
Despite Tether's contentions, those familiar with budgetary decision making within the Pentagon say that there in fact has been a shift at DARPA, driven in large part by the United States' ongoing conflicts in Iraq and Afghanistan.
One U.S. scientist, who directs a large Defense Department research program and who asked not to be identified, says that Tether is "getting pressure from the two-star generals to come up with stuff that they can use right now, and the academics are not delivering." A subtler issue, he says, is the administration's perception that academics, generally speaking, are too liberal. "They want the money, but they don't want anyone to tell them what to do. They don't want military recruiting on campus. They don't support the politics of this administration," says the scientist.
Then there's the notion that the IT industry, rather than the federal government, should sponsor its own research. "The general feeling [in the Pentagon] is that there are lots of wealthy IT companies out there that should be funding 6.1 and 6.2 efforts"--the budgetary designations for basic and applied research--"and that neither DARPA nor even the NSF should be as involved in this as it once was," says Robert Charette, a risk management consultant based in Spotsylvania, Va.
But the NSF's Freeman says that expecting industry to step in may be wishful thinking. "Companies are rewarded in the stock market on the profits that they make this quarter," he says. "They do not get rewarded by spending money that may not lead to anything useful to them or that may take 10 years to show results." Though a few universities have always attracted some corporate funding, he adds, "when you get beyond MIT, Berkeley, Stanford, Carnegie Mellon, and a few others, there's not much industry money available."
So Patterson's lab may continue to be an anomaly. The RAD Lab will operate similarly to other privately funded projects at Berkeley: results will be reported first to sponsors at twice-a-year, three-day retreats. But the work is nonproprietary, Patterson says, adding that "like all academics, we publish like crazy in the open literature." After three years, the lab plans to review its progress to ensure things are going well. Apart from that, he expects the sponsors to take a backseat. "That's the way they wanted it. Each company believed if they were telling us what we should do, then why do it at a university?"
Patterson brings up the success of DARPA's Grand Challenge competition [see "Hard Drive," News, IEEE Spectrum, December 2005]. The competition, which sent autonomous robotic vehicles on an endurance race across the Mojave Desert, was a "milestone in machine learning," he says. Now, "we want to use that same technology to help us manage and operate computer systems."
The good news, he says, is that "I'm even more excited about this field than I was two years ago." An avid surfer, he likens it to catching a big wave. "You'll look at this wave on the horizon, and it's starting to peak some, and so you have to decide if you start paddling. But if you pick a good wave, the wave gets bigger, and it takes you a long way," Patterson says. "This wave is definitely getting bigger."