Does the NSA Really Need “Direct Access”?

What to call U.S. intelligence agencies’ activities comes down to semantics


Protesting the Program: Activists gathered in Washington, D.C., on June 14th to rally against U.S. government surveillance programs.

We’re now well into the second stage of the controversy surrounding the allegations that the NSA is conducting large-scale surveillance of U.S. citizens. Whistleblower/leaker/traitor (the exact term varying according to individual opinion) Edward Snowden is being scrutinized, as are the articles written by Glenn Greenwald for The Guardian newspaper.

That Snowden’s perceived reliability, or lack thereof, has become a major part of the story is an entirely predictable consequence of his decision to reveal his identity. Back in 2004, Dina Rasor, then working under the auspices of the National Whistleblower Center in Washington, D.C., told IEEE Spectrum that going public in this way was like “setting your hair on fire for one glorious minute.” Whistleblowers were well advised to remain anonymous so that the revelation “becomes the issue, and not you.” (As has been pointed out in several places, if we’d known that Deep Throat was a senior FBI official angry at being passed over for promotion, his accusations about Watergate might not have been taken so seriously.)

That the focus of the discussion has also shifted to Greenwald’s reporting is not surprising in the light of that 2004 article. IEEE Fellow Stephen H. Unger, a former chairman of the IEEE Ethics Committee, cautioned against the dangers of hastiness, or of making the slightest factual error, when bringing any revelations to light: “Don’t exaggerate at all… You could be 99 percent right, but if you make one little mistake, they’ll focus on that to discredit you.”

The biggest substantive criticisms of Greenwald’s reporting so far have centered on his contention that companies like Google and Apple provided “direct access,” so that the NSA could come in and snoop around however it liked, grabbing information in real time if need be.

But does the NSA really need access to Google’s internal servers to run a system like PRISM? The U.S. intelligence community certainly has the technical ability to conduct significant eavesdropping programs on other nations’ communications systems. As for its domestic capabilities, back in 2006, another whistleblower, Mark Klein, alleged that the NSA had placed a room full of equipment in a San Francisco AT&T facility for the express purpose of tapping Internet fiber-optic backbone traffic. In response, the Electronic Frontier Foundation filed a class-action lawsuit, which was ultimately dismissed because the U.S. Congress gave immunity to telecom companies cooperating with eavesdropping programs.

So, assuming that the NSA can already eavesdrop on our Internet traffic, does it really need access to Apple’s and Google’s server farms? After all, there’s nothing irreproducible about their systems—the rise of cloud computing technologies in recent years means that these companies’ servers are virtual constructs in any case, running on fungible hardware. With enough storage space and computing power, it is technically plausible to build shadow servers that emulate the relevant functions of a number of companies’ online services and are synchronized with data from Internet backbone taps at telcos. Such a system might not hold a perfect copy of what’s on the real servers, but it would still allow extensive historical searches in many cases. At that point, “direct access” versus “intercepting traffic in transit” becomes a distinction without a difference.
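To see why the distinction collapses, consider a toy sketch of the shadow-server idea: a store that never touches the provider’s real systems, fed only by records captured off a backbone tap, still supports the same historical queries that “direct access” would. Everything below—the class, field names, and records—is hypothetical, invented purely for illustration.

```python
from collections import defaultdict

class ShadowStore:
    """Toy model of a 'shadow server': rebuilds a searchable message
    archive solely from intercepted traffic, with no access to the
    provider's actual servers."""

    def __init__(self):
        # Messages grouped by account, in the order they were captured.
        self._by_account = defaultdict(list)

    def ingest(self, record):
        # Each record is one item captured in transit off a backbone tap.
        self._by_account[record["account"]].append(record)

    def search(self, account, keyword):
        # Historical search over everything seen so far for one account --
        # the kind of query 'direct access' would otherwise provide.
        return [m for m in self._by_account[account] if keyword in m["body"]]

store = ShadowStore()
store.ingest({"account": "alice", "body": "meeting at noon"})
store.ingest({"account": "alice", "body": "lunch tomorrow"})
hits = store.search("alice", "noon")  # finds the first message
```

The copy is only as complete as the traffic that was captured—anything sent before the tap went in, or over a link the tap misses, is absent—which is exactly the “not a perfect copy, but extensive historical search” trade-off described above.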

True, many cloud-based services, such as Gmail, encrypt traffic between the user and their servers, but many inter-service communications are not encrypted. And if Chinese hackers have been able to penetrate, to at least some degree, U.S. companies, what chance would these firms really have against a determined U.S. intelligence agency operating on its own soil?

Whether or not such a scenario actually reflects reality may then be more a question of legal frameworks and restrictions than of technical limitations.

Photo: Win McNamee/Getty Images
