As COVID-19 sweeps across the planet, a number of researchers have advocated digital contact tracing to reduce the spread of the disease. The controversial technique can be effective, but it can have disastrous consequences if not implemented with proper privacy checks and encryption.
Ramesh Raskar, an associate professor at MIT Media Lab, and his team have developed an app called Private Kit: Safe Paths that they say can do the job while protecting privacy. The software could be integrated into a new, official WHO app touted as the “Waze for COVID-19.” IEEE Spectrum spoke with Raskar to better understand the risks and benefits of digital contact tracing.
IEEE Spectrum: What is conventional contact tracing?
Ramesh Raskar: It’s back-tracing the steps of the patient, trying to find every individual who might have come in contact [with them] over the last two weeks or so. It’s very manual and involves interviews and phone calls.
Spectrum: Is it effective?
Raskar: As long as the patient didn’t fly or take a bus or attend a large event, you can do a reasonably good job. The best case scenario is you find people within close contact of the infected person, but you can almost never find the close contacts’ contacts.
Spectrum: Tell me about digital contact tracing using mobile phones.
Raskar: It’s a way to figure out if two people were in the same location at the same time, based on co-location tracking. The simplest scenario, and the one we’re deploying, is that everyone downloads an app with a GPS-based location logger. When a person is confirmed as having COVID-19, they donate their GPS data to the app’s server. This gives a location trail of everywhere they’ve been for the last two weeks, but without revealing the person’s identity. Everyone else who uses the app can look at those trails to compare with their own to see if there was significant overlap, but they never have to share their trails.
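The co-location check Raskar describes can be sketched in a few lines of Python. This is an illustrative simplification, not the Safe Paths implementation: the distance and time thresholds are made-up parameters, and the real app's matching runs on-device against redacted trails.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in meters."""
    earth_radius_m = 6371000
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * earth_radius_m * asin(sqrt(a))

def trails_overlap(user_trail, patient_trail, max_dist_m=10, max_dt_s=1800):
    """Return True if any pair of points from the two trails is close
    in both space and time.

    Each trail is a list of (timestamp_s, lat, lon) tuples.
    The 10 m / 30 min thresholds are illustrative assumptions,
    not the app's actual parameters.
    """
    for t_u, lat_u, lon_u in user_trail:
        for t_p, lat_p, lon_p in patient_trail:
            if abs(t_u - t_p) <= max_dt_s and \
                    haversine_m(lat_u, lon_u, lat_p, lon_p) <= max_dist_m:
                return True
    return False
```

Because the comparison needs only the patient's published trail and the user's local one, it can run entirely on the user's phone, which is what keeps the non-infected user's data off the server.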
Spectrum: The utility of a tool like this would depend in part on how widespread disease testing is, right?
Raskar: Yes, that information is critical. You have to know who is infected. And that information has to be authentic—confirmed by a test and witnessed by a health care worker.
Spectrum: But with COVID-19, there are lots of infected people who don’t get tested because they are asymptomatic or their symptoms aren’t bad enough to require care. So what is the point of software like yours when there are so many asymptomatic people?
Raskar: Epidemic control is a game of probabilities, not a game of absolutes. You don’t have to catch everyone. If you trace even a small fraction of people, that will start reducing the R0, which is the average number of people infected by each patient.
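The arithmetic behind that point can be made concrete with a toy model (my numbers, not Raskar's): if tracing prevents some fraction of onward transmissions, the effective reproduction number scales down proportionally.

```python
def effective_r(r0, traced_fraction):
    """Toy model: if tracing and isolating contacts removes a fraction
    of onward transmissions, R_eff = R0 * (1 - traced_fraction).
    Illustrative only; real epidemic dynamics are more complex."""
    return r0 * (1 - traced_fraction)

# Even partial tracing helps: with an assumed R0 of 2.5,
# tracing 30% of contacts gives R_eff of about 1.75,
# and tracing 60% pushes R_eff near the critical value of 1.
```

The point is that the benefit is continuous: every traced contact lowers the average, so the tool does not need complete coverage to slow an epidemic.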
Spectrum: Is it too late in this COVID-19 pandemic to start doing contact tracing?
Raskar: No. Even in places that are locked down, there’s a percentage of people who have to go to work because they are essential—police officers or health care workers or grocery store employees. They need solutions like contact tracing because they can’t shut down those operations when one person gets infected. And we just heard from the U.S. government that lockdowns are going to start lifting, so we will need to contact trace when people go back to work.
There’s a nice graphic from Resolve to Save Lives, which is led by a former director of the CDC, about which solutions are most effective at which stages of disease spread. Contact tracing is effective early in an epidemic, when authorities are trying to contain the virus, and also while it’s being suppressed.
Spectrum: How do you protect the privacy of app users?
Raskar: Infected individuals can blur or redact locations that might be sensitive or give away their identity. And for users who are not infected, all the calculations regarding their location trail happen on the smartphone. It never goes to the server. So the only person who knows that they might have crossed paths with an infected person is the user himself or herself. This is very important. For more complex operations, the user can upload an encrypted version of their GPS or Bluetooth trails onto a server.
Spectrum: Who controls the server and what stops those people or malicious outsiders from hacking into the data and invading the privacy of app users?
Raskar: This is an MIT open-source, no-revenue, no-ads tool. We wanted to build a trusted, impartial, honest broker that can solve problems. So we invented encryption methods that allow us to achieve both utility and privacy. It’s called split learning. It was intended for other types of applications, but about a month ago we started working on it for COVID-19.
Spectrum: So even the people who built the system cannot access the data uploaded by users.
Raskar: That’s right. It comes in a strange format that doesn’t allow anyone to retrace or reconstruct the data. In efforts like this where we aggregate powerful data, we have to avoid the temptation to create a big brother who can see everything.
Spectrum: China, South Korea, and Singapore have used digital contact tracing to combat COVID-19, and the ramifications for privacy and people’s lives have been appalling. How does this happen?
Raskar: Some governments have access to location trails of both the user and the patient, creating a surveillance state. That means the state knows exactly which user to go after and will hunt them down, and that becomes a problem. And some countries publicly released unredacted, raw GPS trails of the infected person, leading to public shaming of the infected person.
Spectrum: Tell me some stories about the kinds of privacy intrusions that are happening.
Raskar: In South Korea, vigilante groups started forming on Facebook and social media around the data from contact tracing. They became armchair detectives. They would piece together information about a person in their neighborhood and gossip and shame people or discover parts of their life that are extremely private.
In China you get a red, yellow, or green code from an app on your phone based on whether you might have been in contact with an infected person or not. People with red on their phone started getting shamed. And then at some point, only the people who had green could use government services or go to the grocery store. Citizens lost a sense of agency.

And vigilantes living in tall apartment complexes would see a resident sneezing or coughing and not let that person in the building—they would stop them from going to their home. They knew that if that person was eventually diagnosed with the virus, everyone in that apartment complex would get a red. This was in a country with a homogeneous population. Can you imagine the discrimination and racism and bullying that could explode in a country with a heterogeneous population? We might end up pitting neighbor against neighbor and causing civic unrest.
Spectrum: What’s happening to local businesses that get caught up in this?
Raskar: Say an infected person goes to a small noodle shop and people see that trail. No one wants to go to the noodle shop now and it goes out of business, even though the infected person was only there for an hour.
Businesses are also subjected to blackmail. Since some contact tracing apps allow people to self-report their symptoms, bad actors will go to a shop and threaten to report symptoms from that location unless they are given a ransom. There are a lot of those stories from China and South Korea. And the malicious actors don’t even have to physically go to the shop; they can do it remotely sitting at a computer, since GPS spoofing and Bluetooth spoofing are pretty straightforward.
These are problems not only now, but in the future too. The data could persist, and any breach of that data can lead to a lot of repercussions, not only for individual privacy but also national security. The social graph of a city or community or country is basically a national secret. Nation states are always attacking each other behind the scenes in moments of weakness like this.
Spectrum: So knowing all of this, how do you feel about the fact that you are developing a digital contact tracing app?
Raskar: We have a table in a recent white paper on Safe Paths [PDF] that shows how contact tracing impacts the patient, the user, business, and non-users. And if you look at the table, it’s a little depressing. There is no perfect solution. So you have to design something that’s appropriate for the values in the society. We came up with a few principles for Safe Paths.

The first is: Whose privacy matters? We decided the non-infected user’s privacy gets the most extreme protection. Next is the infected person’s privacy, but there will still be some leakage of their information. The privacy of businesses is a lower priority. The second principle is that you should put as many calculations as possible on the smartphone, and not on the server, so that the government or big companies can’t see it. If the calculations cannot be done on the phone, only encrypted information should be shared with the server. The third principle is that everything is open source so people can check on the ethics of the software. Fourth, the data should only be used for the purpose it was collected. And fifth, there is a tradeoff between accuracy of information and the impact on an individual. In other words, we think that it’s okay to blur an infected person’s location—say by a kilometer or so—to protect their privacy.
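One simple way to implement the kind of blurring Raskar mentions is to snap each GPS point to the center of a coarse grid cell, so a published trail reveals location only to that resolution. This is a minimal sketch under my own assumptions, not the actual Safe Paths redaction code.

```python
import math

def blur_location(lat, lon, grid_km=1.0):
    """Snap a GPS point to the center of a roughly grid_km-wide cell.

    Illustrative sketch, not the Safe Paths implementation.
    Uses the approximation that one degree of latitude is ~111.32 km;
    longitude cells are widened by 1/cos(latitude) to stay ~grid_km wide.
    """
    km_per_deg_lat = 111.32  # approximate
    lat_step = grid_km / km_per_deg_lat
    lon_step = grid_km / (km_per_deg_lat * math.cos(math.radians(lat)))
    blurred_lat = (math.floor(lat / lat_step) + 0.5) * lat_step
    blurred_lon = (math.floor(lon / lon_step) + 0.5) * lon_step
    return blurred_lat, blurred_lon
```

Snapping to a fixed grid, rather than adding random noise, means every point inside a cell maps to the same output, so repeated observations of the same place don't average out the blur.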
This story was updated on 1 April 2020.
Emily Waltz is a contributing editor at Spectrum covering the intersection of technology and the human body. Her favorite topics include electrical stimulation of the nervous system, wearable sensors, and tiny medical robots that dive deep into the human body. She has been writing for Spectrum since 2012, and for the Nature journals since 2005. Emily has a master's degree from Columbia University Graduate School of Journalism and an undergraduate degree from Vanderbilt University. She aims to say something true and useful in every story she writes. Contact her via @EmWaltz on Twitter or through her website.