Privacy in the Time of COVID-19

The virus now challenging the world demands that we rethink how to manage our digital privacy


[Illustration of a red cross-shaped security camera. Illustration: Dan Page]

Even though I understand how it works, I consistently find Google’s ability to know how long it will take me to drive somewhere nothing less than magical. The legion of smartphones in vehicles on the roads between my origin and my destination streams GPS-derived location and speed data; from there, it takes only a bit of math to produce an estimate accurate to within a few minutes.
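The "bit of math" is essentially dividing each stretch of road by the speeds phones are currently reporting on it. Here is a minimal sketch of that idea; the function name, route data, and speed samples are all illustrative assumptions, not Google's actual algorithm.

```python
# Hypothetical sketch: estimate travel time from crowdsourced speed reports.
# Each segment pairs a road length with recent speed samples from phones
# currently driving it.

def estimate_eta_minutes(segments):
    """segments: list of (length_km, speeds_kmh) tuples."""
    total_hours = 0.0
    for length_km, speeds_kmh in segments:
        # Average the speeds reported on this segment right now.
        avg_speed_kmh = sum(speeds_kmh) / len(speeds_kmh)
        # Time for this segment = distance / current average speed.
        total_hours += length_km / avg_speed_kmh
    return total_hours * 60

route = [
    (2.0, [45, 50, 48]),    # city street, free-flowing
    (10.0, [95, 100, 98]),  # highway
    (1.5, [12, 8, 15]),     # congested approach
]
print(round(estimate_eta_minutes(route)), "minutes")
```

A real system weighs samples by recency, blends in historical patterns, and handles segments with no current reports, but the core of the estimate is this distance-over-observed-speed sum.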

Lately, researchers have noted that this same data can be used to pinpoint serious traffic accidents minutes before any calls reach emergency responders, buying extra time within the “golden hour” that is vital to saving lives. That result points to a hidden upside of what Shoshana Zuboff termed surveillance capitalism: all of the data our devices harvest about our activities could be put to work to serve the public good.
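The accident signal those researchers exploit is a sudden, collective drop in speed: many phones on the same stretch of road going from traffic speed to a near-standstill within a minute or so. A hedged sketch of that detection rule follows; the thresholds and function names are my own illustrative assumptions, not taken from any published system.

```python
# Hypothetical sketch: flag a possible accident when phones on one road
# segment collectively drop from traffic speed to near-standstill.
# All thresholds below are illustrative assumptions.

def possible_incident(before_kmh, after_kmh, min_phones=5,
                      drop_ratio=0.25, standstill_kmh=10):
    """before_kmh/after_kmh: speed samples from the same road segment,
    taken roughly a minute apart."""
    if len(before_kmh) < min_phones:
        return False  # too few observers to trust the signal
    avg_before = sum(before_kmh) / len(before_kmh)
    avg_after = sum(after_kmh) / len(after_kmh)
    # Fire only on a sharp collapse to near-zero, not ordinary slowing.
    return avg_after < standstill_kmh and avg_after < drop_ratio * avg_before

# Highway traffic collapsing to a crawl looks like an incident:
possible_incident([90, 95, 100, 88, 92], [3, 0, 5, 2, 4])
# Traffic merely flowing normally does not:
possible_incident([90, 95, 100, 88, 92], [85, 90, 95, 80, 88])
```

Requiring a minimum number of phones is what distinguishes a crash from a single driver pulling over, which is also why this technique only works at smartphone-fleet scale.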

We need that now like never before, as the entire planet confronts a pandemic. Fortunately, we’ve been exceptionally clever at making smartphones; more than 4 billion of them are in use across every nation on earth, and their distributed sensing and intelligence offer an unprecedented opportunity to provide a greater degree of safety than we would otherwise have.

Taiwan got into this game early, combining the lessons of SARS with the latest in tracking and smartphone apps to deliver an integrated public health response. As of this writing, that approach has kept the country’s infection rate among the lowest in the world. The twin heads of surveillance capitalism, Google and Facebook, will spend the next year working with public health authorities to provide insights that can guide both the behavior of individuals and public policy. That’s going to give some angina to the advocates of strong privacy policies (I count myself among them), but in an emergency, public good inevitably trumps private rights.

This relaxation of privacy boundaries mustn’t mean the imposition of a surveillance state—that would only result in decreased compliance. Instead, our devices will be doing their utmost to remind us how to stay healthy, much like our smart watches already do but more pointedly and with access to far greater data. Both data and access are what we must be most careful with, looking for the sweet spot between public health and private interest, with an eye to how we can wind back to a world with greater privacy after the crisis abates.

A decade ago I quit using Facebook, because even then I had grave suspicions that my social graph could be weaponized and used against me. Yet this same capacity to know so much about people—their connections, their contacts, even their moods—means we also have a powerful tool to track the spread of outbreaks of both disease and deadly misinformation. While firms like Cambridge Analytica have used this power to sway the body politic, we have yet to use such implements for the public good. Now, I’d argue, we have little choice.

We technologists are going to need to get our hands dirty, and only with transparency, honesty, and probity can we do so safely. Yes, use the data, use the tools, use the analytics and the machine learning, bring it all to bear. But let us all know how, when, and why it’s being used. Because it appears that making a turn toward a surveillance society will protect us now—and, in particular, help our most vulnerable to stay safe—we need to be honest about our needs, transparent about our uses, and completely clear about our motives.

A version of this post appears in the May 2020 print issue.
