Why the Web Spreads Information and Misinformation Equally Well

Ted Nelson’s transclusion of source materials might have worked better


Illustration of laptop screens with snakes between them.
Illustration: Dan Page

“A lie gets halfway around the world while the truth is putting on its shoes.” That’s a great line, but who originally said it? Was it Mark Twain, always good for an epigram, or the oft-quoted Winston Churchill? According to The New York Times, it’s an adaptation of something written three centuries ago by famed satirist Jonathan Swift: “Falsehood flies, and the Truth comes limping after it.”

“Truth is the first casualty of war.” (Classical playwright Aeschylus or California statesman Hiram Johnson?) Given truth’s obvious vulnerabilities, we should be doing more to protect it when we send it out to do battle. But having constructed a technological apparatus that disseminates information instantaneously and globally without regard to its veracity, we shouldn’t be surprised that this apparatus has left us drowning in lies.

“First we shape our tools, thereafter our tools shape us.” (Marshall McLuhan or Father John Culkin?) Our copy-and-paste notions of truth and factuality likely have their roots in the early Web, which was intended initially only to link resources stored across a heterogeneous range of computers at CERN, the European Organization for Nuclear Research. The particle physicists’ tool soon helped everyone to share information about everything. But the Web’s hyperlinks inevitably create an impenetrable thicket of pointers, from one resource to another to another to another until it becomes nearly impossible to discern an ultimate source of truth. The Web created a global, hyperlinked document space, paradoxically making the truth more obscure than ever.

Two decades before the Web, hypertext pioneer Ted Nelson offered another model: Rather than just reference sources, include them. More subtle than a simple copy-and-paste operation, Nelson’s approach allows one document to embed the content of another via a link to a portion of the source document. In such a system nothing needs to be copied. The referring document “transcludes” a portion of the material found in the source document.

Transclusion allows for the creation of hypertext documents that are themselves the assembly of other hypertext documents that are themselves the assembly of still more hypertext documents. While any document can contain original content, it simultaneously serves as a window onto other documents, allowing viewers to reach into and through the references, all the way back to their primary sources. Transclusion could have created a Web built on a set of unimpeachable, universally accepted sources of information.
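To make the mechanism concrete, here is a minimal sketch of the idea in Python. It is purely illustrative, not Nelson's actual Xanadu design: the names Literal, Transclusion, store, and resolve are assumptions introduced for this example. A document is a sequence of pieces, each either original text or a pointer to a span of another document, and rendering follows the pointers back to the source rather than copying anything.

```python
# Illustrative sketch of transclusion (not Nelson's actual Xanadu design).
# A document is a list of pieces: literal text, or a pointer to a span of
# another document. Nothing is copied; rendering follows pointers back to
# the originals stored once in a shared "docuverse".

from dataclasses import dataclass
from typing import Union

@dataclass
class Literal:
    text: str                 # content authored directly in this document

@dataclass
class Transclusion:
    source_id: str            # id of the document being quoted
    start: int                # character span within the resolved source
    end: int

Piece = Union[Literal, Transclusion]

# The shared store: every document exists exactly once, keyed by id.
store: dict[str, list[Piece]] = {}

def resolve(doc_id: str) -> str:
    """Render a document, recursively pulling in transcluded spans."""
    out = []
    for piece in store[doc_id]:
        if isinstance(piece, Literal):
            out.append(piece.text)
        else:
            # Follow the pointer all the way back to the primary source.
            source_text = resolve(piece.source_id)
            out.append(source_text[piece.start:piece.end])
    return "".join(out)

# Example: a commentary transcludes Swift's line instead of copying it.
store["swift-1710"] = [
    Literal("Falsehood flies, and the Truth comes limping after it.")
]
store["commentary"] = [
    Literal('As Swift observed, "'),
    Transclusion("swift-1710", 0, 16),   # the span "Falsehood flies,"
    Literal('" and the lie still outruns the correction.'),
]

print(resolve("commentary"))
```

The point of the design is that the commentary never holds a copy of Swift's words; it holds a reference that is dereferenced at read time, so a reader can follow the quotation back to the stored original, and every document that transcludes that original points at the same single source.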

Served up by content-management systems that algorithmically compose documents from multiple sources, the modern Web gradually converged with Nelson’s vision for transclusion—with one key difference: The Web offers no single source of truth, nor any ultimate reference to a set of trusted sources. Instead, everything points to everything else (or to itself), which tends to make the Web appear to be fuller and more authoritative than it really is. That helps explain why conspiracy theories like QAnon are so difficult to root out.

It would only take a few subtle changes to nudge the Web away from the shifting sands of links and plant it firmly in the real world of universally accepted facts. The nature of these authoritative sources will be fought over, naturally, as fierce rivals battle it out to set the terms for defining the truth. Yet where we can build consensus, humanity would possess “a truth universally acknowledged”—to borrow a line that we can all agree belongs to Jane Austen.

This article appears in the December 2020 print issue as “The Web’s Lurking Lies.”
