Facebook’s business model is about trust, not about collecting data and monetizing it, and if it could just communicate that concept more clearly, everything would be better.
That’s the message Facebook’s deputy chief privacy officer, Rob Sherman, tried to get across to a crowded auditorium at Stanford University earlier this month. Sherman was speaking at a session of a Stanford Continuing Studies course titled “The Ethics of Technological Innovation.” The panel of Stanford faculty and staff members, along with representatives from the National Security Agency (NSA) and the Electronic Frontier Foundation (EFF), reviewed a wide range of privacy issues in an increasingly connected world. But the discussion kept coming back to Facebook, and whether anyone really believes the social media company is out to protect users and their privacy.
All panelists agreed that privacy is an important right. Sherman was quick to point out, however, that privacy is not a simple on-off switch. “People are on Facebook because they want to share information about themselves,” he said. “A lot of privacy is being able to choose what you are sharing and with whom, [and] knowing who has information about you and how they are using it.” Facebook’s goal is to make privacy “right for what each person wants,” he added.
“Given what we’ve seen as a series of scandals and issues,” said Hilary Cohen, program manager for Stanford’s Ethics and Technology Initiative, “why should individuals have the expectation that companies will handle data responsibly?” She was referring to the recent conflict between Apple and Facebook, in which Facebook violated Apple’s rules to collect data on app users.
“You should have that expectation in part because it’s the right thing to do,” Sherman replied. “It’s also an important business incentive; people won’t trust Facebook if they don’t think information is protected.”
Rob Reich, Stanford professor of political science, questioned whether the business interests of Facebook and other tech companies are truly consistent with robust privacy protections.
“An outsider’s view about tech companies,” he said, “is that they operate on an approach of data maximalism, that is, to collect as much data as possible from users in order to discover how, from this mass of data, there might be monetization. Limiting data collection is at odds with the standard business model for making money.”
“Maybe collecting data and monetizing it later made sense when you were bootstrapping a company,” Sherman responded. “But for a large company in it for the long haul, you have to have a business model based on trust; you don’t get that by taking the approach you are talking about.
“People choose to use our services to connect with people they care about,” he continued. “We show ads to monetize our service. We use data to show more relevant ads.”
Jennifer Lynch, EFF surveillance litigation director, jumped in. “You said the business model is straightforward,” she said. “ ‘We sell ads, we give you more relevant ads.’ But it’s not even clear to me—and I do this for a living—how to turn off the tracking.”
Sherman reiterated his main talking point: “We have work to do to build people’s trust and communicate clearly,” he said.
User agreements and consent
The conversation then turned to user agreements and consent. Stanford political science professor Jeremy Weinstein polled the audience. “In the last six months have you read, beginning to end, any terms you have agreed to in downloading an app?” he asked, and about nine hands went up in the audience of more than 300 attendees. “Who has read it for all your apps?” he asked. Not a single person raised their hand.
“Individual decision-making, [facing] one ‘terms of service’ after another, might be too fragile a pillar for consent policies,” said Reich. Today, he said, “downloading the app or using the service is obviously opt in, but by having the app on your phone, many experience a default opt in to a number of data collection practices within the app.
“But when I go to the doctor or hospital, I’ve opted in to nothing else [by going there],” he said. “I haven’t consented to any treatments. It’s not assumed by a doctor or nurse that well, you’re here, you’ve agreed to maximal treatments.”
When Facebook turned on its facial recognition software, Reich recalled, it was an opt out, and “people woke up with it turned on. Facebook was criticized for that.”
“The majority of people just want to share with friends,” Sherman explained, “so that’s the default.”
To regulate, or not to regulate?
Would regulation improve privacy protection? Facebook’s Sherman indicated that the company would welcome regulation. “I don’t think regulation is required,” he said, but regulation “at the federal level, [setting] consistent obligations for people collecting data would be helpful.” He pointed to Europe’s General Data Protection Regulation (GDPR) as “a good starting point for that discussion” and indicated that Facebook CEO Mark Zuckerberg supports federal privacy legislation.
The EFF’s Lynch would go a little further. “I want the ability to sue [Facebook],” she said. “A private right of action, the right of a citizen to bring an action against a company, is an important piece of any legislation. What worries me is that we will pass a privacy law—either federal or state—that doesn’t have a right of action unless there is a breach of your data, [and] puts the burden on the attorney general to interpret the law.”
Just walk away?
The idea that anyone who is not happy with the way apps protect privacy can just stop using them was met with some skepticism on the panel. As Weinstein said, “the notion that consumers can walk away from the platform to choose privacy from another provider, when network power means that no other provider will have [a similar] set of connections” isn’t realistic.
Sherman argued that while people use Facebook to connect to others, it isn’t the only way of connecting. “There are other options,” he said, “[including] Snap, Twitter, email, and SMS; people have choices. It’s simplistic to say there are no options.”
Lynch disagreed. “The point about network effects is a real issue. What are my choices if I don’t want to use Facebook, but my aunt across the country is posting things I want to see? What are the alternatives to Facebook? Not Snapchat. Not Instagram. The way we communicate with people is controlled by a small number of platforms.”
Tekla S. Perry is a senior editor at IEEE Spectrum. Based in Palo Alto, Calif., she's been covering the people, companies, and technology that make Silicon Valley a special place for more than 40 years. An IEEE member, she holds a bachelor's degree in journalism from Michigan State University.