A panel of tech executives discussed privacy, encryption, and digital advertising this week at CES 2020. Apple’s senior director of global privacy, Jane Horvath, came out strongly in favor of privacy protection, while commissioner Rebecca Slaughter of the U.S. Federal Trade Commission came out even stronger. Facebook’s vice president of public policy, Erin Egan, maintained that Facebook does just fine in protecting consumer privacy, and Wing Venture Capital partner Rajeev Chand served as moderator, posing timely questions to the group.
(Procter & Gamble’s global privacy officer Susan Shook occasionally got a word in, but the spotlight focused on companies that provide technology far more than those that use it.)
Here’s a sampling of the discussion:
Rajeev Chand, Wing Venture Capital:
“Given the high number of high-profile security and privacy incidents last year, do you believe the consumer tech industry is doing enough to address privacy?”
Rebecca Slaughter, FTC:
“Given that every day we read about breaches, it would be impossible to conclude that enough is being done.”
Jane Horvath, Apple:
“I don’t think we can ever say we are doing enough. We always have to be pushing the envelope and figuring out how to put the consumer in control of their data.
“We at Apple focus on privacy by design. We have a team of privacy lawyers and a team of privacy engineers, and for every new product, at beginning design phases, we have a privacy engineer and lawyer assigned to work with the team. We have the support of our executives—[Apple CEO] Tim Cook is committed to privacy.”
Erin Egan, Facebook:
“Everything that she said about Apple holds for Facebook. But the question is what do people expect; how do people understand how their data is being collected and used.”
“To push back a little bit, I’m concerned about a universe where the entirety of the burden to protect one’s data lies with the consumer. Though a consumer can walk through a privacy checkup, the amount of information you have to process to figure out what is being done with your data across different services is untenable.
“The burden shouldn’t just be on the consumer, but on collectors to have responsibility of minimizing what is collected, what is retained, what is shared, rather than collecting this endless trove of data.”
“We use data minimization. [Other important tools include] differential privacy, [that is], noise added to the data set, so we can’t tell if it’s your data or not. And [we use a lot of] on-device processing, we build models on servers and send them down to the phones instead of sending data up to the cloud.”
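The differential-privacy idea described above—adding random noise so an individual record can’t be singled out—can be illustrated with the standard Laplace mechanism. This is a minimal sketch of the general technique, not Apple’s implementation; the function names and the counting query are illustrative only.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample of Laplace(0, scale) noise via inverse-CDF sampling."""
    u = 0.0
    while u == 0.0:          # avoid log(0) at the distribution's boundary
        u = random.random()
    u -= 0.5                 # now u is uniform on (-0.5, 0.5)
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def private_count(values, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A count changes by at most 1 when one record is added or removed
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical usage: a noisy count of users over 30, without exposing
# whether any particular user is in the tally.
ages = [22, 35, 41, 29, 50, 33, 27, 38]
noisy = private_count(ages, lambda a: a >= 30, epsilon=0.5)
```

Smaller `epsilon` means more noise and stronger privacy; averaged over many queries, the noisy counts center on the true count, which is why the aggregate remains useful while any single individual's contribution stays hidden.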
“Would you argue that tech companies are using too much personal data?”
“I am focused on what Apple is doing.”
“We have a different business model than Apple, but we are focused on privacy in our advertising business model. It doesn’t always make sense to [process data] on the device—if you come to Facebook, you want to share.
“I take issue with the idea that the advertising we serve involves surveilling people. We don’t do surveillance capitalism; that, by definition, is surreptitious. We work hard to be transparent.”
“Should Facebook and other services use less personal data?”
Susan Shook, P&G:
“We collect the data to serve people. For the P&G consumer, trust is paramount. Unlike with the tech companies, our consumers face low switching costs: she can easily shift to another product, say, from Pampers to Huggies, or get out of our equation forever.”
“Enabling consumers to vote with their feet is a clear principle of competition. But in the data area, consumers don’t always know [what is happening]—a lot of what happens in data sharing goes on in an opaque, behind-the-scenes infrastructure. [And] deidentification is only meaningful if data can’t be reidentified; I don’t think we have reason to be confident that, with most data collection, reidentification isn’t possible. There are also opportunities for innovation on the bad side, for the use and abuse of data.”
“Do you think privacy is protected today?”
“On Facebook, today, yes.”
“As a general matter, I don’t think privacy is protected. I don’t think anybody in this room can tell accurately who has what data about them and how it is being used. I think we would all answer such a question wrong unless we said that everybody has everything.
“[Thinking about] advertising, we don’t know what are the benefits of differential advertising and what are the [privacy] costs to that, and are those two things in appropriate balance. I am concerned we are asking those questions a little too late.”
“Is there a balance in the digital ad ecosystem?”
“I think we can have a more appropriate balance than we have today. Under today’s business models, the weight of that balance falls on consumers, to their detriment, and often without their knowledge.”
“A consumer should decide what data to give to get the benefits he or she wants to get.”
Chand (turning to Egan):
“Can you see a viewpoint that there is an imbalance to the harm of consumers?”
“At Facebook we provide real value to people in terms of the advertising we deliver and we do it in a privacy protected way.”
“[Considering] end-to-end encryption. What is the right approach to abusive content?”
“End-to-end encryption is important to the services we rely on. We know we want to put health data, payment data—very sensitive data—on our devices. We also know our phones are small and they get lost and stolen. So if we are going to rely on having health and finance data on our devices, [we] have to make sure that if you misplace your device, you are not losing sensitive data.
“Terrorism and child sex trafficking are abhorrent, none of us want that kind of material on our platforms, but not having encryption isn’t going to solve those issues.”
“Should content uploaded to Dropbox and other cloud services be screened for [problematic] content?”
“We have started using some technologies to screen for child sexual abuse materials.”
“At WhatsApp, we believe in encryption in messaging services. For abusive content… we are working on reporting mechanisms, looking for signals even in encrypted content.”
“I don’t have any pushback on this one. While I am sensitive to the desire for a backdoor for law enforcement, you can’t create a backdoor for good guys that doesn’t create one for bad guys.”
This post was updated on 13 January 2020.
Tekla S. Perry is a senior editor at IEEE Spectrum. Based in Palo Alto, Calif., she's been covering the people, companies, and technology that make Silicon Valley a special place for more than 40 years. An IEEE member, she holds a bachelor's degree in journalism from Michigan State University.