On Tuesday the Court of Justice of the European Union ruled that EU member states can force Internet search engines to remove links to material deemed to invade the privacy of European citizens. The court argued for balancing privacy against the public interest, and the ruling is in line with pending European legislation that seeks to establish an explicit right to be forgotten, or at least to make it harder to find unflattering personal information.
In Tuesday's case, which sets a precedent for around two hundred pending cases in Spain, Google will have to remove search results linking to 14-year-old newspaper notices about the plaintiff's home repossession. The newspaper itself is not required to remove the notices. Instead, the Spanish Data Protection Agency (AEPD) appears to have found a pragmatic choke point for helping individuals cultivate a more favorable online presence without resorting to outright censorship of published material.
Certain non-governmental organizations and journalists may look at this as a sneaky workaround, but software engineers have been trying to find ways for users to share information on a temporary basis since at least the birth of email recall requests.
At a 2009 IEEE conference, Urs Hengartner of the University of Waterloo in Canada announced an experimental browser plug-in called FaceCloak, which would allow Facebook users to use an extra layer of encryption to select which friends could see their information, independently of their Facebook settings. A 2012 European Union Agency for Network and Information Security report identified a handful of other software attempts to tackle the same problem in various settings.
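FaceCloak's actual design is more involved, but the core idea (store ciphertext on the platform, and share the decryption key only with chosen friends out-of-band) can be sketched in a few lines of Python. This is an illustrative toy, not FaceCloak's real code: the XOR keystream cipher below stands in for proper encryption, and every name here is hypothetical.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key (toy stand-in for real crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: str) -> bytes:
    """XOR the UTF-8 plaintext with the keystream; this is what gets posted."""
    data = plaintext.encode("utf-8")
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def decrypt(key: bytes, ciphertext: bytes) -> str:
    """Friends holding the key reverse the XOR to read the original field."""
    raw = bytes(a ^ b for a, b in zip(ciphertext, keystream(key, len(ciphertext))))
    return raw.decode("utf-8")

# The platform only ever stores 'stored'; the key travels to friends
# separately. Withholding or rotating the key effectively "forgets" the
# data for anyone who never kept a decrypted copy.
shared_key = secrets.token_bytes(32)
stored = encrypt(shared_key, "Hometown: Waterloo, Ontario")
recovered = decrypt(shared_key, stored)
```

The important design point, and the reason such schemes stay independent of the platform's own privacy settings, is that the site never sees plaintext at all, so no later change in its policies can expose the data.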
The problem with all such technical methods of killing off online data is that once data is shared, even authorized users may have other ideas about what to do with it. As the report notes, a disgruntled user could photograph sensitive data on a computer screen with a smartphone, or simply memorize it. A careless user might lose a laptop or an old hard drive bearing now-sensitive data, exposing it to unauthorized users.
Pam Cowburn, communications director of the Open Rights Group in London, says that courts, not companies, should decide when something is private or public. Other groups issued similar statements: the Electronic Frontier Foundation told the Wall Street Journal that the ruling was "very disappointing" and "very vague."
Yet Manuel Moreno, director of the Spanish firm Bórrame ("erase me"), says that "technology which protects [personal] information will probably benefit" from the ruling. Bórrame provides consulting services to individuals who want the Internet to forget something about them. Moreno says that he guides individuals through the process of filing a claim with the AEPD and can help remove links to undesired online memories. "It's going to be much easier from now on," he says.
He also predicts that any company whose "big data" collection includes personal data will have to rethink its strategy following this ruling. "There's a lot more info that could be exploited and sold" by such companies, he says. Privacy complaints arising from that will provide him and other privacy consultants with plenty of work.
For a rollicking read on the old-fashioned way of burying an undesirable Internet presence, check out Open The Magazine's feature "The Man Who Makes People Disappear."