A Bold New Plan for Preserving Online Privacy and Security

Decoupling our identities from our data and actions could safeguard our secrets

Whether we like it or not, we all use the cloud to communicate and to store and process our data. We use dozens of cloud services, sometimes indirectly and unwittingly. We do so because the cloud brings real benefits to individuals and organizations alike. We can access our data across multiple devices, communicate with anyone from anywhere, and command a remote data center’s worth of power from a handheld device.

But using the cloud means our security and privacy now depend on cloud providers. Remember: The cloud is just another way of saying “someone else’s computer.” Cloud providers are single points of failure and prime targets for hackers to scoop up everything from proprietary corporate communications to our personal photo albums and financial documents.

The risks we face from the cloud today are not an accident. For Google to show you your work emails, it has to store many copies across many servers. Even if they’re stored in encrypted form, Google must decrypt them to display your inbox on a webpage. When Zoom coordinates a call, its servers receive and then retransmit the video and audio of all the participants, learning who’s talking and what’s said. For Apple to analyze and share your photo album, it must be able to access your photos.

Hacks of cloud services happen so often that it’s hard to keep up. Breaches can be so large as to affect a majority of American adults, as in the Equifax breach of 2017, or a large fraction of the Fortune 500 and the U.S. government, as in the SolarWinds breach of 2019–20.

It’s not just attackers we have to worry about. Some companies use their access—benefiting from weak laws, complex software, and lax oversight—to mine and sell our data. Other companies sell us fancy but ineffective security technologies. Every company needs an attentive chief information security officer and has to pay through the nose for cybersecurity insurance. Individuals have to keep track of data breaches and privacy policy changes from their cloud providers.

Yet this vigilance does little to protect us. Just this year, Microsoft faced a firestorm for major, long-running hacks of its cloud services, and Zoom faced a backlash about its quiet policy changes regarding the use of private user data for AI. No major remedies seem likely.

We’re all hoping that companies will keep us safe, but it’s increasingly clear that they don’t, can’t, and won’t. We should stop expecting them to.

Our message is simple: It is possible to get the best of both worlds. We can and should get the benefits of the cloud while taking security back into our own hands. Here we outline a strategy for doing that.

What is decoupling?

In the last few years, a slew of ideas old and new have converged to reveal a path out of this morass, but they haven’t been widely recognized, combined, or used. These ideas, which we’ll refer to in the aggregate as “decoupling,” allow us to rethink both security and privacy.

Here’s the gist. The less someone knows, the less they can put you and your data at risk. In security, this is known as the principle of least privilege. The decoupling principle applies that idea to cloud services by making sure systems know as little as possible while doing their jobs. It states that we gain security and privacy by separating private data that today is unnecessarily concentrated.

To unpack that a bit, consider the three primary modes for working with our data as we use cloud services: data in motion, data at rest, and data in use. We should decouple them all.

Our data is in motion as we exchange traffic with cloud services such as videoconferencing servers, remote file-storage systems, and other content-delivery networks. Our data at rest, while sometimes on individual devices, is usually stored or backed up in the cloud, governed by cloud provider services and policies. And many services use the cloud to do extensive processing on our data, sometimes without our consent or knowledge. Most services involve more than one of these modes.

To ensure that cloud services do not learn more than they should, and that a breach of one does not pose a fundamental threat to our data, we need two types of decoupling. The first is organizational decoupling: dividing private information among organizations such that none knows the totality of what is going on. The second is functional decoupling: splitting information among layers of software. Identifiers used to authenticate users, for example, should be kept separate from identifiers used to connect their devices to the network.

In designing decoupled systems, cloud providers should be considered potential threats, whether due to malice, negligence, or greed. To verify that decoupling has been done right, we can learn from how we think about encryption: You’ve encrypted properly if you’re comfortable sending your message with your adversary’s communications system. Similarly, you’ve decoupled properly if you’re comfortable using cloud services that have been split across a noncolluding group of adversaries.

Cryptographer David Chaum first applied the decoupling approach in security protocols for anonymity and digital cash in the 1980s, long before the advent of online banking or cryptocurrencies. Chaum asked: How can a bank or a network service provider provide a service to its users without spying on them while doing so?

Chaum’s ideas included sending Internet traffic through multiple servers run by different organizations and divvying up the data so that a breach of any one node reveals minimal information about users or usage. Although these ideas have been influential, they have found only niche uses, such as in the popular Tor browser.

Trust, but Don’t Identify

The decoupling principle can protect the privacy of data in motion, such as financial transactions and Web browsing patterns that currently are wide open to vendors, banks, websites, and Internet Service Providers (ISPs).

1. Barath orders Bruce’s audiobook from Audible.
2. His bank does not know what he is buying, but it guarantees the payment.
3. A third party decrypts the order details but does not know who placed the order.
4. Audible delivers the audiobook and receives the payment.

DECOUPLED E-COMMERCE: An independent verifier sits between the bank and the seller, and the buyer’s identity is blinded from the verifier. As a result, neither the seller nor the verifier can identify the buyer, and the bank cannot identify the product purchased. Yet all parties can trust that the signed payment is valid.

1. Bruce’s browser sends a doubly encrypted request for the IP address of sigcomm.org.
2. A third-party proxy server decrypts one layer and passes on the request, replacing Bruce’s identity with an anonymous ID.
3. An Oblivious DNS server decrypts the request, looks up the IP address, and sends it back in an encrypted reply.
4. The proxy server forwards the encrypted reply to Bruce’s browser.
5. Bruce’s browser decrypts the response to obtain the IP address of sigcomm.org.

DECOUPLED WEB BROWSING: Currently, Internet service providers and virtual-private-network providers can track which websites their users visit because requests to the Domain Name System (DNS), which converts domain names to IP addresses, are unencrypted. A new protocol called Oblivious DNS can protect users’ browsing requests from these observers. Each name-resolution request is encrypted twice and then sent to an intermediary (a “proxy”) that strips out the user’s IP address and decrypts the outer layer before passing the request to a domain name server, which then decrypts the actual request. Neither the ISP nor any other computer along the way can see what name is being queried. The Oblivious DNS resolver has the key needed to decrypt the request but no information about who placed it. The resolver encrypts its reply so that only the user can read it.
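
To make the flow concrete, here is a minimal sketch of the double-encryption idea, using PyNaCl’s SealedBox for anonymous public-key encryption. The standardized protocol, Oblivious DNS over HTTPS, uses HPKE over HTTPS instead; the key handling and message framing below are illustrative only.

```python
# Minimal sketch of the Oblivious DNS idea (pip install pynacl).
# The real protocol (Oblivious DoH) uses HPKE over HTTPS; this framing is illustrative.
from nacl.public import PrivateKey, PublicKey, SealedBox

proxy_key = PrivateKey.generate()      # proxy learns who is asking, not what
resolver_key = PrivateKey.generate()   # resolver learns what is asked, not who

# Client: doubly encrypt the query, tucking an ephemeral reply key inside.
client_key = PrivateKey.generate()
query = b"sigcomm.org? IN A"
inner = SealedBox(resolver_key.public_key).encrypt(bytes(client_key.public_key) + query)
outer = SealedBox(proxy_key.public_key).encrypt(inner)

# Proxy: peel the outer layer and forward; it never sees the domain name.
forwarded = SealedBox(proxy_key).decrypt(outer)

# Resolver: answer the query without learning who asked it.
opened = SealedBox(resolver_key).decrypt(forwarded)
reply_key, name = PublicKey(opened[:32]), opened[32:]
answer = SealedBox(reply_key).encrypt(b"198.51.100.7")   # placeholder IP address

# Client: decrypt the reply that the proxy relays back.
print(SealedBox(client_key).decrypt(answer))
```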

Similar methods have been extended beyond DNS to multiparty-relay protocols that protect the privacy of all Web browsing through free services such as Tor and subscription services such as INVISV Relay and Apple’s iCloud Private Relay.

How decoupling can protect data in motion

Three classes of new technology developed in the last few years now make decoupling practical in many more applications.

Imagine you’re on a Zoom call. Your device and those of your colleagues are sending video to Zoom’s servers. By default, this is encrypted when sent to Zoom, but Zoom can decrypt it. That means Zoom’s servers see the video and hear the audio, and then forward it to others on the call. Zoom also knows who’s talking to whom, and when.

Meetings that were once held in a private conference room are now happening in the cloud, and third parties like Zoom see it all: who, what, when, where. There’s no reason a videoconferencing company has to learn such sensitive information about every organization it provides services to. But that’s the way it works today, and we’ve all become used to it.

There are multiple threats to the security of that Zoom call. A Zoom employee could go rogue and snoop on calls. Zoom could spy on calls of other companies or harvest and sell user data to data brokers. It could use your personal data to train its AI models. And even if Zoom and all its employees are completely trustworthy, the risk of Zoom getting breached is omnipresent. Whatever Zoom can do with your data in motion, a hacker can do to that same data in a breach. Decoupling data in motion could address those threats.

Videoconferencing doesn’t need access to unencrypted video to push bits between your device and others. A properly decoupled video service could secure the who, what, where, and when of your data in motion, beginning with the “what”—the raw content of the call. True end-to-end encryption of video and audio would keep that content private to authorized participants in a call and nobody else. (Zoom does currently offer this option, but using it disables many other features.)
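
To illustrate what end-to-end encryption of call media means in practice, the sketch below encrypts a single frame with a meeting key known only to the participants, so the relaying server forwards ciphertext it cannot read. It assumes the meeting key has already been shared among participants out of band (for example, alongside the invitation) and uses PyNaCl’s SecretBox; a production system would manage group keys with a protocol such as MLS.

```python
# Sketch: participants encrypt media with a shared meeting key, so the
# conferencing server only ever relays ciphertext. Assumes the key was
# distributed to participants out of band (e.g., with the invitation).
import nacl.secret
import nacl.utils

meeting_key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)  # held by participants only
box = nacl.secret.SecretBox(meeting_key)

frame = b"...compressed video frame bytes..."
ciphertext = box.encrypt(frame)      # all the relay server ever handles

# Any participant with the meeting key recovers the frame; the server cannot.
assert nacl.secret.SecretBox(meeting_key).decrypt(ciphertext) == frame
```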

To protect the “who,” functional decoupling within the service could authenticate users with cryptographic schemes that mask their identity, such as blind signatures, which Chaum invented decades ago to anonymize purchases.
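
The sketch below shows the core of Chaum’s RSA blind signature in textbook form (no padding or hashing, so it is educational rather than production crypto): the signer vouches for a token without ever seeing it, which is exactly the property a service needs in order to admit authorized users it cannot identify.

```python
# Educational sketch of Chaum's RSA blind signature (textbook RSA, no padding).
# The signer authenticates a token without ever learning what it signed.
import secrets
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
n = key.public_key().public_numbers().n
e = key.public_key().public_numbers().e
d = key.private_numbers().d

token = int.from_bytes(b"anonymous access token", "big")   # the user's secret

# User: blind the token with a random factor r before sending it to the signer.
r = secrets.randbelow(n - 2) + 2          # overwhelmingly likely to be coprime to n
blinded = (token * pow(r, e, n)) % n

# Signer (e.g., the service): signs the blinded value without learning the token.
blind_sig = pow(blinded, d, n)

# User: unblind to obtain a valid signature on the original token.
sig = (blind_sig * pow(r, -1, n)) % n
assert pow(sig, e, n) == token            # anyone can verify with the public key
```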

Organizational decoupling can protect the “where” and “when,” preventing the service from learning the participants’ network addresses, and thus from inferring their locations and identities. Newer multihop relay systems, more efficient than Tor, route data through third-party infrastructure so that when traffic reaches the video service, its true source is unknown.

Taken together, these decoupling measures would protect users from both Zoom’s deliberate actions and its security failures.

How decoupling can protect data storage

Data at rest, unencrypted on a laptop or phone, poses obvious risks from thieves and malware. Cloud storage is convenient, fast, and reliable, but those benefits come with new risks. A breach that affects any customer could affect all of them, making it all the more lucrative for a hacker to try to break in.

Most storage and database providers started encrypting data on disk years ago, but that’s not enough to ensure security. In most cases, the data is decrypted every time it is read from disk. A hacker or malicious insider silently snooping at the cloud provider could thus intercept your data despite it having been encrypted.

Cloud-storage companies have at various times harvested user data for AI training or to sell targeted ads. Some hoard it and offer paid access back to us or just sell it wholesale to data brokers. Even the best corporate stewards of our data are getting into the advertising game, and the decade-old feudal model of security—where a single company provides users with hardware, software, and a variety of local and cloud services—is breaking down.

Decoupling can help us retain the benefits of cloud storage while keeping our data secure. As with data in motion, the risks begin with access the provider has to raw data (or that hackers gain in a breach). End-to-end encryption, with the end user holding the keys, ensures that the cloud provider can’t independently decrypt data from disk. But the uses of data at rest are different, so the decoupling remedies must also be different.

Functional decoupling once again becomes just as important as organizational decoupling. We need decoupled infrastructure for authentication so that users can prove who they are, for authorization so that users can be given or denied access to data, for repositories that store raw data, and for applications that operate only on data the user permits them to access. Ideally, these functions would be decoupled across multiple providers, using standard protocols and programming interfaces to weave together seamless services for users.

We also must consider use cases. We store data in the cloud not only to retrieve it ourselves but to share it with others. Many cloud systems that hold our data—whether Amazon’s Simple Storage Service (S3), Google Drive, or Microsoft 365, or analytics platforms such as Intuit and Salesforce—provide the illusion of control by giving customers tools for sharing. In reality, the cloud-storage provider still has complete access to and control over your data.

Here we need to decouple data control from data hosting. The storage provider’s job is to host the data: to make it available from anywhere, instantly. The hosting company doesn’t need to control access to the data or even the software stack that runs on its machines. The cloud software that grants access should put control entirely in the end user’s hands.

Modern protocols for decoupled data storage, like Tim Berners-Lee’s Solid, provide this sort of security. Solid is a protocol for distributed personal data stores, called pods. By giving users control over both where their pod is located and who has access to the data within it—at a fine-grained level—Solid ensures that data is under user control even if the hosting provider or app developer goes rogue or has a breach. In this model, users and organizations can manage their own risk as they see fit, sharing only the data necessary for each particular use.
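
The sketch below captures only the spirit of that separation: a hosting provider stores opaque ciphertext, while the encryption key and the access rules stay with the user. Solid’s real access control is expressed in Web Access Control or ACP documents over standard Web protocols; the classes and method names here are hypothetical.

```python
# Illustrative only: the host stores ciphertext it cannot read, while the user
# holds the key and the access policy. Solid expresses this with Web Access
# Control / ACP documents; these class and method names are hypothetical.
from dataclasses import dataclass, field
import nacl.secret, nacl.utils

@dataclass
class PodHost:                                    # any storage provider
    blobs: dict = field(default_factory=dict)
    def put(self, path, ciphertext): self.blobs[path] = ciphertext
    def get(self, path): return self.blobs[path]

@dataclass
class UserControlledPod:                          # keys and policy stay with the user
    host: PodHost
    key: bytes = field(default_factory=lambda: nacl.utils.random(32))
    acl: dict = field(default_factory=dict)       # path -> set of authorized agents

    def write(self, path, data, readers):
        self.acl[path] = set(readers)
        self.host.put(path, nacl.secret.SecretBox(self.key).encrypt(data))

    def read_as(self, agent, path):
        if agent not in self.acl.get(path, set()):
            raise PermissionError(f"{agent} has no access to {path}")
        return nacl.secret.SecretBox(self.key).decrypt(self.host.get(path))

pod = UserControlledPod(PodHost())
pod.write("/notes/draft", b"meeting notes", readers={"https://colleague.example/id"})
print(pod.read_as("https://colleague.example/id", "/notes/draft"))
```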

By Invitation Only: How to keep private meetings private

Online services such as Zoom, Google Meet, and Microsoft Teams know who is meeting with whom, when and where they’re meeting, and what they’re saying because users’ devices are sending all video and audio data to the cloud, which can decrypt video streams internally and see participants’ IP addresses and identities. By using multiparty relays, end-to-end encryption, and oblivious authentication, a decoupled meeting service such as Booth prevents tech giants and hackers from snooping on private discussions.

Each meeting is assigned a randomly generated link. The host sends that link to each meeting participant so that they—and only they—can join the meeting. The cloud service does not know who they are.

The meeting’s data stream is encrypted end to end between participants and routed through third-party proxies. The multiparty relay scheme can authenticate each user without revealing their identity to the meeting service.

How decoupling can make computation more secure

Almost all cloud services have to perform some computation on our data. Even the simplest storage provider has code to copy bytes from an internal storage system and deliver them to the user. End-to-end encryption is sufficient in such a narrow context. But often we want our cloud providers to be able to perform computation on our raw data: search, analysis, AI model training or fine-tuning, and more. Without expensive, esoteric techniques such as secure multiparty computation or homomorphic encryption, which can perform calculations on encrypted data, cloud servers require access to the unencrypted data to do anything useful.

Fortunately, the last few years have seen the advent of general-purpose, hardware-enabled secure computation. This is powered by special functionality on processors known as trusted execution environments (TEEs) or secure enclaves. TEEs decouple who runs the chip (a cloud provider, such as Microsoft Azure) from who secures the chip (a processor vendor, such as Intel) and from who controls the data being used in the computation (the customer or user). A TEE can keep the cloud provider from seeing what is being computed. The results of a computation are sent via a secure tunnel out of the enclave or encrypted and stored. A TEE can also generate a signed attestation that it actually ran the code that the customer wanted to run.
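
The sketch below shows the shape of the check a customer might perform before releasing a data key to an enclave. Real attestation evidence (Intel SGX or TDX quotes, AMD SEV-SNP reports) is far more involved; the field names and the verification stub here are illustrative stand-ins, not a vendor API.

```python
# Greatly simplified shape of a remote-attestation check. Real quote formats
# (Intel SGX/TDX, AMD SEV-SNP) differ; these fields and checks are stand-ins.
import hashlib
from dataclasses import dataclass
from typing import Optional

@dataclass
class AttestationQuote:
    code_measurement: bytes    # hash of the code loaded into the enclave
    report_data: bytes         # enclave-chosen data, e.g., its public key
    vendor_signature: bytes    # signed by the CPU vendor's attestation key

EXPECTED_MEASUREMENT = hashlib.sha256(b"the audited analysis binary").digest()

def vendor_signature_is_valid(quote: AttestationQuote) -> bool:
    # Placeholder: in practice this verifies the vendor's certificate chain.
    return True

def release_data_key(quote: AttestationQuote, wrapped_key: bytes) -> Optional[bytes]:
    """Hand over the data key only if the enclave runs the expected code."""
    if not vendor_signature_is_valid(quote):
        return None
    if quote.code_measurement != EXPECTED_MEASUREMENT:
        return None                    # some other program is asking; refuse
    return wrapped_key                 # would be encrypted to quote.report_data
```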

With TEEs in the cloud, the final piece of the decoupling puzzle drops into place. An organization can keep and share its data securely at rest, move it securely in motion, and decrypt and analyze it in a TEE such that the cloud provider doesn’t have access. Once the computation is done, the results can be reencrypted and shipped off to storage. CPU-based TEEs are now widely available among cloud providers, and soon GPU-based TEEs—useful for AI applications—will be common as well.

How decoupling protects both privacy and security

One of the key benefits of decoupling is that it ensures there will be no single point of failure. If a cloud provider of a decoupled videoconferencing service is breached, all that’s visible is the flow of encrypted bytes to and from other nameless cloud servers. Same with storage: A breach reveals only a bunch of encrypted disks and encrypted flows of data. Same with compute: The hardware enclave shields the data in use from the attacker’s prying eyes.

The remaining risks are largely within each mode. The fact that decoupled storage feeds into decoupled compute doesn’t magnify the risk—but it’s worth thinking through in more detail.

Suppose Microsoft Azure is used to host a Solid pod, but it’s encrypted at rest and only decrypted within one of Azure’s secure enclaves. What can Microsoft or a hacker learn? The fact that Azure hosts both services does not give it much additional information, especially if data in motion is also encrypted to ensure that Microsoft doesn’t even know who is accessing that data. With all three modes decoupled, Azure sees an unknown user accessing an unknown blob of encrypted data to run unknown code within a secure enclave on Intel processors. This is exactly what an enterprise should want and expect from its cloud service providers: that they are no longer a breach risk even as they deliver the same useful cloud services as before.

Decoupling also allows us to look at security more holistically. For example, we can dispense with the distinction between security and privacy. Historically, privacy meant freedom from observation, usually for an individual person. Security, on the other hand, was about keeping an organization’s data safe and preventing an adversary from doing bad things to its resources or infrastructure.

There are still rare instances where security and privacy differ, but organizations and individuals are now using the same cloud services and facing similar threats. Security and privacy have converged, and we can usefully think about them together as we apply decoupling.

Decoupling also creates new opportunities: for companies to offer new services in a decoupled cloud ecosystem, for researchers to develop new technologies that can improve security and privacy, and for policymakers to ensure better security for everyone.

Decoupling isn’t a panacea. There will always be new, clever side-channel attacks. And most decoupling solutions assume a degree of noncollusion between independent companies or organizations. But that noncollusion is already an implicit assumption today: We trust that Google and Advanced Micro Devices will not conspire to break the security of the TEEs they deploy, for example, because the reputational harm from being found out would hurt their businesses. The primary risk, real but also often overstated, is if a government secretly compels companies to introduce backdoors into their systems. In an age of international cloud services, this would be hard to conceal and would cause irreparable harm.

How a Credit-Reporting Agency Should Work: Decoupling could thwart a privacy disaster

Some of the most pernicious risks we face today are from organizations that we have no choice in interacting with, such as credit-reporting agencies. Equifax, for example, had famously lax security, which allowed hackers to steal personal data on some 163 million people in 2017. Yet the company still holds some of the most sensitive personal financial data that exists.

Applying the decoupling principle to those credit records could ensure a far better outcome if the company were breached again. Attackers would be unable to identify any individuals or read any credit facts, because the compromised data would be encrypted and scattered across myriad, far-flung personal data stores.

DATA AT REST: Individuals and organizations could hold their credit data themselves, along with all their other personal data, in cloud repositories that they control and encrypt. New storage protocols such as Solid decouple the hosting provider from data access control and from applications. Individual credit facts (such as bank account numbers, reports from lenders, and so forth) could be cryptographically signed by those parties and supplied to the individual for storage in the repository. When applying for a loan, the user could then grant time-limited access to a specific organization for a specific application.
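
As a rough sketch of what lender-signed facts and time-limited grants could look like, the following uses Ed25519 signatures from PyNaCl; the record fields and the grant format are invented for illustration and are not an existing standard.

```python
# Sketch: a lender signs a single credit fact; the consumer later issues a
# time-limited grant for one loan application. Field names and grant format
# are illustrative, not an existing standard.
import json, time
from nacl.signing import SigningKey

lender_key = SigningKey.generate()        # e.g., the consumer's bank
consumer_key = SigningKey.generate()      # held alongside the consumer's data store

# The lender attests to one fact and hands the signed record to the consumer.
fact = json.dumps({"account": "example-account", "status": "paid_on_time", "year": 2023}).encode()
signed_fact = lender_key.sign(fact)       # stored in the consumer's repository

# When applying for a loan, the consumer grants one verifier access for one hour.
grant = json.dumps({
    "resource": "/credit/facts/2023",
    "audience": "https://verifier.example",
    "expires": int(time.time()) + 3600,
}).encode()
signed_grant = consumer_key.sign(grant)

# The verifier checks both signatures before running its analysis in a TEE.
lender_key.verify_key.verify(signed_fact)
consumer_key.verify_key.verify(signed_grant)
```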

DATA IN MOTION: Communications to and from the reporting agency’s servers should be decoupled by multiparty-relay protocols that build in blinding and encryption to conceal who is doing the communicating as well as the identity of the individual whose data is being analyzed.

DATA IN USE: A credit-analysis algorithm should run only in a secure enclave on the server—known as a trusted execution environment—so that the credit agency cannot see the data as it is processed. The user can check an attestation, perhaps via an external auditor, that only a permitted algorithm was used, but cannot see the proprietary code that was executed.

Rethinking Equifax

Decoupling doesn’t just benefit individual organizations or users: It also has positive ripple effects when properly applied. All of the decoupling we’ve talked about could lead to a better and very different outcome if Equifax were breached again, for example.

Imagine that individuals and organizations held their credit data in cloud-hosted repositories that enable fine-grained encryption and access control. Applying for a loan could then take advantage of all three modes of decoupling. First, the user could employ Solid or a similar technology to grant access to Equifax and a bank only for the specific loan application. Second, the communications to and from secure enclaves in the cloud could be decoupled and secured to conceal who is requesting the credit analysis and the identity of the loan applicant. Third, computations by a credit-analysis algorithm could run in a TEE. The user could rely on an external auditor to confirm that only that specific algorithm was run. The credit-scoring algorithm might be proprietary, and that’s fine: In this approach, Equifax doesn’t need to reveal it to the user, just as the user doesn’t need to give Equifax access to unencrypted data outside of a TEE.

Building this is easier said than done, of course. But it’s practical today, using widely available technologies. The barriers are more economic than technical.

Rethinking AI

As more organizations apply AI, decoupling becomes ever more important. Most cloud AI offerings—whether large language models like ChatGPT, automated transcription services from video and voice companies, or big-data analytics—require the revelation of troves of private data to the cloud provider. Sometimes organizations seek to build a custom AI model, trained on their private data, that they will then use internally. Sometimes organizations use pretrained AI models on their private data. Either way, when an AI model is used, the cloud service learns all sorts of things: the content of the prompts or data input, access patterns of the organization’s users, and sometimes even business use cases and contexts. AI models typically require substantial data, and that means substantial risk.

Once again, the three modes of decoupling can enable secure, cloud-hosted AI. Data, whether an organization’s or an individual’s, can be held in a decoupled data store with fine-grained user control and mechanisms that decouple identity from usage. When the data needs to be processed, access can be explicitly granted for that purpose, allowing the data to move securely from the store to a TEE. The actual AI training or inference on the user’s data can then run in a GPU-based secure enclave, which offers the same guarantees as a CPU-based one, so nothing about the raw data is leaked to the cloud provider.
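
The overall control flow might look roughly like the sketch below: the data owner releases a key only after an attestation check succeeds, and the cloud provider handles nothing but ciphertext outside the enclave. The enclave_* functions are placeholders for a vendor-specific confidential-computing runtime, not a real API.

```python
# Control-flow sketch of decoupled AI fine-tuning. The enclave_* functions are
# placeholders for a vendor-specific TEE runtime; this shows the shape of the
# pipeline, not a real API.
import nacl.secret, nacl.utils

data_key = nacl.utils.random(32)                 # held by the data owner
encrypted_dataset = nacl.secret.SecretBox(data_key).encrypt(b"private training examples...")

def enclave_attest() -> bool:
    """Placeholder: verify the enclave is running the expected training code."""
    return True

def enclave_finetune(ciphertext: bytes, key: bytes) -> bytes:
    """Placeholder for work inside a GPU TEE: decrypt, train, re-encrypt."""
    dataset = nacl.secret.SecretBox(key).decrypt(ciphertext)
    model_weights = b"weights derived from " + dataset[:8]   # stand-in for training
    return nacl.secret.SecretBox(key).encrypt(model_weights)

# The owner releases the key only after attestation; outside the enclave the
# provider sees only ciphertext going in and ciphertext coming out.
if enclave_attest():
    encrypted_model = enclave_finetune(encrypted_dataset, data_key)
```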

How decoupling could lead to better policy

Why hasn’t this design philosophy been adopted widely? It’s hard to say for sure, but we think it’s because the enabling technologies—multiparty relay protocols, secure fine-grained data stores, and hardware-based TEEs—have matured only in the last few years. Also, security rarely drives business decisions, so even after the tech is available, adoption can lag.

Regulation, especially in the United States, is also lagging. What few data protections exist do not cover—or even clearly distinguish among—the three modes of decoupling. At the same time, it’s unreasonable to expect policymakers to make the first move. They can’t mandate something they don’t know is even possible. Technologists need to educate policymakers that potential solutions are in hand.

One of the challenges of trying to regulate tech is that industry incumbents push for tech-only approaches that simply whitewash bad practices. For example, when Facebook rolls out “privacy-enhancing” advertising, but still collects every move you make, has control of all the data you put on its platform, and is embedded in nearly every website you visit, that privacy technology does little to protect you. We need to think beyond minor, superficial fixes.

Decoupling might seem strange at first, but it’s built on familiar ideas. Computing’s main tricks are abstraction and indirection. Abstraction involves hiding the messy details of something inside a nice clean package: When you use Gmail, you don’t have to think about the hundreds of thousands of Google servers that have stored or processed your data. Indirection involves creating a new intermediary between two existing things, such as when Uber wedged its app between passengers and drivers.

The cloud as we know it today is born of three decades of increasing abstraction and indirection. Communications, storage, and compute infrastructure for a typical company were once run on a server in a closet. Next, companies no longer had to maintain a server closet, but could rent a spot in a dedicated colocation facility. After that, colocation facilities decided to rent out their own servers to companies. Then, with virtualization software, companies could get the illusion of having a server while actually just running a virtual machine on a server they rented somewhere. Finally, with serverless computing and most types of software as a service, we no longer know or care where or how software runs in the cloud, just that it does what we need it to do.

With each additional abstraction and layer of indirection, we’ve become further separated from true control of the underlying compute infrastructure. Meanwhile, we’ve gained operational benefits. And these operational benefits are key, even in the context of security: After all, denial of service is an attack on availability, making it a security issue even if there is no loss in confidentiality or integrity of data.

We’re now at a turning point where we can add further abstraction and indirection to improve security, turning the tables on the cloud providers and taking back control as organizations and individuals while still benefiting from what they do.

The needed protocols and infrastructure exist, and there are services that can do all of this already, without sacrificing the performance, quality, and usability of conventional cloud services.

But we cannot just rely on industry to take care of this. Self-regulation is a time-honored stall tactic: A piecemeal or superficial tech-only approach would likely undermine the will of the public and regulators to take action. We need a belt-and-suspenders strategy, with government policy that mandates decoupling-based best practices, a tech sector that implements this architecture, and public awareness of both the need for and the benefits of this better way forward.

This article appears in the December 2023 print issue.

{"imageShortcodeIds":[]}
The Conversation (2)
Bruce Anderson
Bruce Anderson23 Feb, 2024
LM

How will each decoupled party pay for the services it receives?

Bruce Wilson
Bruce Wilson31 Jan, 2024
M

A fine article. "This article appears in the December 2024 print issue." may be a typo.