Why the Web Spreads Information and Misinformation Equally Well

Ted Nelson’s transclusion of source materials might have worked better

2 min read
Illustration of laptop screens with snakes between them.
Illustration: Dan Page

“A lie gets halfway around the world while the truth is putting on its shoes.” That’s a great line, but who originally said it? Was it Mark Twain, always good for an epigram, or the oft-quoted Winston Churchill? According to The New York Times, it’s an adaptation of something written three centuries ago by famed satirist Jonathan Swift: “Falsehood flies, and the Truth comes limping after it.”

“Truth is the first casualty of war.” (Classical playwright Aeschylus or California statesman Hiram Johnson?) Given truth’s obvious vulnerabilities, we should be doing more to protect it when we send it out to do battle. But having constructed a technological apparatus that disseminates information instantaneously and globally without regard to its veracity, we shouldn’t be surprised that this apparatus has left us drowning in lies.

“First we shape our tools, thereafter our tools shape us.” (Marshall McLuhan or Father John Culkin?) Our copy-and-paste notions of truth and factuality likely have their roots in the early Web, which was initially intended only to link resources stored across a heterogeneous range of computers at CERN, the European Organization for Nuclear Research. The particle physicists’ tool soon helped everyone to share information about everything. But the Web’s hyperlinks inevitably create an impenetrable thicket of pointers, from one resource to another to another to another, until it becomes nearly impossible to discern an ultimate source of truth. The Web created a global, hyperlinked document space, paradoxically making the truth more obscure than ever.

Two decades before the Web, hypertext pioneer Ted Nelson offered another model: Rather than just reference sources, include them. More subtle than a simple copy-and-paste operation, Nelson’s approach allows one document to embed the content of another via a link to a portion of the source document. In such a system nothing needs to be copied. The referring document “transcludes” a portion of the material found in the source document.

Transclusion allows for the creation of hypertext documents that are themselves the assembly of other hypertext documents that are themselves the assembly of still more hypertext documents. While any document can contain original content, it simultaneously serves as a window onto other documents, allowing viewers to reach into and through the references, all the way back to their primary sources. Transclusion could have created a Web built on a set of unimpeachable, universally accepted sources of information.
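To make the idea concrete, here is a minimal sketch of how a transclusion resolver might work. The data model and names (`DOCS`, `render`, the `("ref", (doc, start, end))` tuple) are illustrative assumptions, not Nelson's Xanadu design: a document is a list of parts, each either literal text or a reference to a character span of another document, and nothing is ever copied — referenced spans are fetched from their source at render time.

```python
# Hypothetical document store: each document is a list of parts.
# A part is either ("text", literal string) or
# ("ref", (source_doc_id, start_offset, end_offset)) -- a transclusion.
DOCS = {
    "swift1710": {"parts": [
        ("text", "Falsehood flies, and the Truth comes limping after it."),
    ]},
    "essay": {"parts": [
        ("text", "As Swift observed: "),
        ("ref", ("swift1710", 0, 16)),  # transclude "Falsehood flies,"
    ]},
}

def render(doc_id, docs=DOCS):
    """Resolve a document, following transclusion references all the
    way back to their source documents instead of copying text."""
    out = []
    for kind, value in docs[doc_id]["parts"]:
        if kind == "text":
            out.append(value)
        else:  # a transclusion: slice the rendered source document
            src, start, end = value
            out.append(render(src, docs)[start:end])
    return "".join(out)

print(render("essay"))  # → As Swift observed: Falsehood flies,
```

Because the essay stores only a pointer into `swift1710`, a reader (or a program) can always trace the quoted words back to their primary source — the property the column argues the Web's one-way hyperlinks lack.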

Served up by content-management systems that algorithmically compose documents from multiple sources, the modern Web gradually converged with Nelson’s vision for transclusion—with one key difference: The Web offers no single source of truth, nor any ultimate reference to a set of trusted sources. Instead, everything points to everything else (or to itself), which tends to make the Web appear to be fuller and more authoritative than it really is. That helps explain why conspiracy theories like QAnon are so difficult to root out.

It would only take a few subtle changes to nudge the Web away from the shifting sands of links and plant it firmly in the real world of universally accepted facts. The nature of these authoritative sources will be fought over, naturally, as fierce rivals battle it out to set the terms for defining the truth. Yet where we can build consensus, humanity would possess “a truth universally acknowledged”—to borrow a line that we can all agree belongs to Jane Austen.

This article appears in the December 2020 print issue as “The Web’s Lurking Lies.”


An IBM Quantum Computer Will Soon Pass the 1,000-Qubit Mark

The Condor processor is just one quantum-computing advance slated for 2023

4 min read

A researcher at IBM’s Thomas J. Watson Research Center examines some of the quantum hardware being constructed there.

Connie Zhou/IBM

IBM’s Condor, the world’s first universal quantum computer with more than 1,000 qubits, is set to debut in 2023. The year is also expected to see IBM launch Heron, the first of a new flock of modular quantum processors that the company says may help it produce quantum computers with more than 4,000 qubits by 2025.

This article is part of our special report Top Tech 2023.

While quantum computers can, in theory, quickly find answers to problems that classical computers would take eons to solve, today’s quantum hardware is still short on qubits, limiting its usefulness. Entanglement and other quantum states necessary for quantum computation are infamously fragile, being susceptible to heat and other disturbances, which makes scaling up the number of qubits a huge technical challenge.
