The tech industry can’t hide from the information war, particularly when its own creations are being weaponized.
That was the consensus of a panel at the Techonomy17 conference in Half Moon Bay, Calif., last week. The group assembled to discuss the meaning of authority in a networked, artificially intelligent world, and the panelists quickly zoomed in on the manipulation of Facebook, Google, and other sites by Russians during the U.S. presidential election. Along with several other speakers at the conference, they painted a dark picture of the online world for at least the immediate future, and they concluded that preventing such manipulation will not be easy.
“I spent my whole life working in civil liberties, and I didn’t see this coming,” said Marc Rotenberg, president of the Electronic Privacy Information Center. “Democratic institutions, the rule of law...are up for grabs. Anything can happen.”
“We always knew we were going into an information war next,” said Danah Boyd, a principal researcher at Microsoft and founder of the Data & Society Research Institute, “and that we would never know immediately that that is what it is. Now we can’t tell what reality is.”
The tech companies should have seen this coming, panelists maintained.
Said Roger McNamee, co-founder and managing director of venture capital firm Elevation Partners: “These companies began with a goal of connecting the world. But once they put a business model in place that depended on advertising, they had to put in techniques that depended on creating addiction. Then when the smartphone came along, it created an opportunity to create a level of brain hacking that had never previously been seen.”
Speaker after speaker marveled that, at the previous year’s Techonomy conference, Facebook’s Mark Zuckerberg told the attendees “the idea that fake news, of which it’s a very small amount of the content, influenced the election in any way is a pretty crazy idea.”
“I believe Mark was sincere when he said that,” McNamee said. (Now that the extensive use of Facebook by Russians intent on manipulating the election has come to light, just what Facebook execs knew and when they knew it has come under scrutiny, most recently in Senate judiciary subcommittee hearings.)
Panelists—and others I spoke to at the conference—seemed shocked by how cheap and easy it turned out to be to weaponize social networks.
“The first $100,000 spent on Facebook [by the Russians] reached as many people as voted in the election,” Stratford Sherman, co-founder of Accompli, pointed out.
McNamee explained that all the ads had to do was get people to join groups by playing on their emotions—mostly fear and anger. Once people joined the groups, they could be targeted for free.
“We hardened our financial institutions against hacking,” McNamee said, “but it never occurred to us that the minds of our voters could be hacked; they turned our tools against us.”
While the industry leaders can make the case that they didn’t see this coming, they can no longer pretend it hasn’t happened—or deny that they have to do everything they can to fix it, the speakers indicated.
“This is an ‘Oh crap!’ moment for tech companies,” said Microsoft’s Boyd.
But what, exactly, can be done? Nobody is really sure what will work, but speakers at the Techonomy conference had a few ideas.
J. Galen Buckwalter, best known as the brains behind eHarmony’s patented algorithm for matching singles profiles on its dating site, suggested that AI could potentially revamp social media “into an antidote for authoritarian thinking.”
However, Boyd expressed concern that AI could be subverted as well. She noted that groups of social media hackers that use the dark web to share tools and strategies have started to address AI. Among the thought experiments they’re now toying with is how to affect “core data sources to mess with the natural language processing systems.”
In response, she argued, developers need “to think about technical antibodies.”
For starters, she argued, companies need to get back to testing things. “Social media obliterated the idea that we should have to test,” she said. Instead, the social media companies “would throw things out into the wild, see how people would mess with them, and hope we could work fixes back in. That won’t work going forward.”
Platforms like Facebook, Google, and Twitter also need to embrace transparency, speakers agreed.
“We need to defend the rights of individuals to speak online as they would offline,” said Rotenberg, but “when a company sells advertising it should be more transparent, and it should be regulated.”
The tech community, he continued, “has to get away from the idea of self-regulation. Every technology—the train, the car—required regulation. The current model [of self-regulation] is collapsing.”
Andrew Anagnost, the new CEO of Autodesk, spoke at a later session and agreed with the panelists’ assessment. “We have two Silicon Valley companies [Google and Facebook]—they aren’t tech companies, they are media companies; we are not their customers, we are their product. And they aren’t even regulated to the point that television was regulated when we were all calling it the idiot tube.”
It’s also time to start deprogramming the people who were targeted by Russian propaganda, McNamee said. “The only way to reprogram the Russian thing is for Facebook to send a message to everyone who was manipulated, stating that they were manipulated and showing them every item that touched them that came from the Russians.”
“We have passed the fail-safe point,” McNamee said. “I don’t think we can get back to the Silicon Valley that I loved. At this point we just have to save America.”
Tekla S. Perry is a senior editor at IEEE Spectrum. Based in Palo Alto, Calif., she's been covering the people, companies, and technology that make Silicon Valley a special place for more than 40 years. An IEEE member, she holds a bachelor's degree in journalism from Michigan State University.