Data Is Vulnerable to Quantum Computers That Don’t Exist Yet

A new spin-off from Alphabet has a plan to transition to postquantum cryptography


Future quantum computers may rapidly break modern cryptography. Now a new spin-off from Google’s parent company Alphabet warns that sensitive data is already vulnerable to quantum computers that don’t exist yet, thanks to codebreaking attacks that steal encrypted data today in order to decrypt it in the future. The company has therefore developed a road map to help businesses, governments, and other organizations begin the shift to post-quantum cryptography now.

The new startup, Sandbox AQ (which stands for AI and quantum), has already attracted clients including Mount Sinai Health System, telecommunications firm Softbank Mobile, communications technology company Vodafone Business, and Web developer Wix. It has also reeled in investors including the CIA’s venture capital arm In-Q-Tel and cybersecurity-focused investment firm Paladin Capital Group. Former Google CEO Eric Schmidt is serving as the chairman of its board of directors.

In addition, Sandbox AQ has already partnered with two of the world’s largest professional service firms, Ernst & Young and Deloitte, to help deploy post-quantum cryptography.

“These firms have the scale to educate, engage, and upgrade post-quantum cryptography for their Global 1000 clients, which represent the world’s largest and most successful companies,” says David Joseph, a research scientist at Sandbox AQ in Palo Alto, Calif. “Doing this will multiply the impact of our quantum solutions and help companies protect their customers, data, networks, and other assets today, without having to wait until error-corrected quantum computers become available.”

Quantum computers can theoretically solve problems quickly that might take classical computers untold eons. For example, much of modern cryptography depends on mathematical problems, such as factoring huge numbers, that are extremely difficult for classical computers, but quantum computers could in principle rapidly crack even highly secure RSA-2048 encryption.
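The scale of that difficulty is easy to sketch. The toy Python example below (an illustration, not a real attack) factors a small semiprime by trial division, whose work grows with the square root of the modulus, so every extra bit of key length multiplies the effort. Real classical attacks use faster sieve methods but remain superpolynomial, whereas Shor’s quantum algorithm runs in polynomial time.

```python
# Toy illustration of why factoring underpins RSA security (NOT a real
# attack): trial division does work proportional to the square root of
# the modulus, so each added bit of key length multiplies the effort.
def trial_division(n: int) -> tuple[int, int]:
    """Return the smallest nontrivial factor of n and its cofactor."""
    if n % 2 == 0:
        return 2, n // 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError(f"{n} is prime")

# A ~40-bit toy modulus falls in well under a second...
print(trial_division(999_983 * 1_000_003))  # (999983, 1000003)
# ...but RSA-2048 moduli are 2,048 bits, far beyond any classical budget.
```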

To stay ahead of quantum computers, scientists around the world have spent the past two decades designing post-quantum cryptography (PQC) algorithms. These are based on new mathematical problems that both quantum and classical computers find difficult to solve. In January, the White House issued a memorandum on transitioning to quantum-resistant cryptography, underscoring that preparations for this transition should begin as soon as possible.

However, even after organizations such as the National Institute of Standards and Technology (NIST) help decide which PQC algorithms should become the new global standards, billions of old and new devices will need to be updated. Sandbox AQ notes that such efforts could take decades.

Although quantum computers are currently in their infancy, adversaries can already steal encrypted data with the intention of cracking it once codebreaking quantum computers become a reality. Therefore, Sandbox AQ argues that governments, businesses, and other major organizations must begin the shift toward PQC now.

For example, in a store-now-decrypt-later attack, adversaries would capture precious encrypted information now, store it, and decrypt it when practical quantum computers exist. Stolen data could include medical records, national security documents, trade secrets, and more—any information that may still prove valuable even decades later.

“We know for a fact that store-now-decrypt-later attacks are happening right now, and their frequency will only increase the closer we get to delivering a fault-tolerant quantum computer,” Joseph says. “Once encrypted data has been exfiltrated, there is no way to protect it from future decryption and exploitation.”

Store-now-decrypt-later attacks do not need high-profile breaches to succeed. “They could be performed silently by first observing encrypted data on public networks, which would be very difficult to detect,” Joseph says. “Over the public Internet, encrypted data might be sent via many different nodes, and any one of these nodes could be compromised, copying and storing valuable data before forwarding it on to its intended final destination.”

The main difficulty in executing store-now-decrypt-later attacks is figuring out which data to target, “as there will be an enormous volume of encrypted data and only a finite amount of quantum computing resources,” Joseph says. “We expect the first quantum-enabled adversaries will be nation-states, and it may not be public knowledge exactly when one of them gains access to a large, fault-tolerant device capable of breaking RSA-2048.”

Shifting to post-quantum cryptography may also prove important for products and projects being designed and planned now that have life spans of decades, such as the many cars, planes, trains, and ships in production today, or critical national infrastructure. The hardware needed to implement cryptography may essentially remain immutable for the lifetime of these products and projects, so the earlier they can be protected, the better, Joseph and his colleagues note.

The inspiration to launch Sandbox AQ grew from discussions among security teams within Alphabet starting in 2016.

“It became apparent that there was a huge wealth of experience across the now–Sandbox AQ team and the Googlers across Alphabet, but most of this expertise was focused on distinct, introspective efforts that directly benefited Google’s customers,” Joseph says. “However, when we spoke with decision-makers at external organizations, it became clear that what was ‘common knowledge’ in the security community was not well known at large.”

Sandbox AQ’s efforts to explain the importance of PQC led the startup to draft a new road map for organizations to shift past traditional cryptography. The company detailed its road map on 11 May in the journal Nature.

The first recommendation Sandbox AQ makes is to figure out where the PQC transition is needed first. The workforce able to perform these upgrades is highly specialized and usually scarce, so it needs to be deployed where it will protect the most critical systems. This involves identifying the cryptographic schemes at highest risk, such as the key-exchange algorithms that often underlie secure messaging and data transfers.
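As a rough illustration of that triage step, the sketch below catalogs an organization’s cryptographic primitives and sorts quantum-vulnerable key exchanges to the front of the migration queue. The inventory entries and risk labels are hypothetical assumptions for illustration, not Sandbox AQ’s actual tooling.

```python
# Hypothetical triage sketch: catalog cryptographic primitives in use and
# migrate quantum-vulnerable key exchanges first. Entries are illustrative.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDH-P256", "DH-2048", "ECDSA-P256"}
# Symmetric ciphers and hashes are weakened by quantum search, not broken;
# larger keys and outputs generally suffice.
QUANTUM_WEAKENED = {"AES-128", "SHA-256"}

inventory = [
    {"system": "database",     "primitive": "AES-128",    "use": "encryption at rest"},
    {"system": "vpn-gateway",  "primitive": "ECDH-P256",  "use": "key exchange"},
    {"system": "code-signing", "primitive": "ECDSA-P256", "use": "signature"},
]

def triage(entries):
    """Sort entries so quantum-vulnerable key exchanges migrate first."""
    def risk(e):
        if e["primitive"] in QUANTUM_VULNERABLE:
            return 0 if e["use"] == "key exchange" else 1
        return 2 if e["primitive"] in QUANTUM_WEAKENED else 3
    return sorted(entries, key=risk)

for entry in triage(inventory):
    print(entry["system"], entry["primitive"])
```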

Rather than replacing existing algorithms outright with relatively untested PQC alternatives, the road map notes that scientists have developed hybrid algorithms that combine traditional and post-quantum algorithms. That way, even if the PQC algorithm later proves flawed, the classical algorithm still provides a measure of security.
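A minimal sketch of that hybrid idea: derive the session key from both a classical and a post-quantum shared secret at once, so an attacker must break both schemes to recover it. The secrets below are random stand-ins; in practice one would come from, say, an ECDH exchange and the other from a PQC key-encapsulation mechanism.

```python
# Minimal sketch of a hybrid key-exchange combiner. The "shared secrets"
# here are random stand-ins, not outputs of real key exchanges.
import hashlib
import secrets

def hybrid_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    """Derive one session key from both secrets via a hash-based KDF."""
    return hashlib.sha256(b"hybrid-kex-v1" + classical_secret + pq_secret).digest()

classical = secrets.token_bytes(32)     # stand-in for an ECDH shared secret
post_quantum = secrets.token_bytes(32)  # stand-in for a PQC KEM shared secret
session_key = hybrid_key(classical, post_quantum)
assert len(session_key) == 32  # breaking either input alone reveals nothing
```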

The road map notes that NIST’s PQC project is nearing the end of its third round, with standards for the selected algorithms expected no later than 2024. Sandbox AQ recommends that organizations start experimenting now with the finalist and alternate candidates. The company also suggests considering stateful hash-based signature technology for applications such as software code signing, as NIST and other bodies have already standardized it.
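Stateful hash-based schemes rest on nothing beyond the security of a hash function. The sketch below implements a Lamport one-time signature, the simplest building block behind standardized stateful schemes such as LMS and XMSS; it is an illustration, not a production implementation. “Stateful” means each key pair must sign exactly once, which is why code signing, with its small and controlled number of signatures, is a natural fit.

```python
# Lamport one-time signature: the simplest hash-based signature, and an
# illustration of the primitive behind stateful schemes like LMS/XMSS.
# Each key pair must sign EXACTLY once; reuse leaks the secret key.
import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # Two random secrets per message-digest bit; the hashes are the public key.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(s) for s in pair] for pair in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][b] for i, b in enumerate(bits)]  # reveal one secret per bit

def verify(msg: bytes, sig, pk) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits))

sk, pk = keygen()
sig = sign(b"release-v1.0.tar.gz", sk)
assert verify(b"release-v1.0.tar.gz", sig, pk)
```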

The most comprehensive repository of software implementations of the NIST schemes is liboqs, from the Open Quantum Safe project. Other resources include BoringSSL, Tink, and SUPERCOP.

Due to the current diversity of PQC alternatives, the need to change from one algorithm to another in case of a successful attack, and the desire for increasing connectedness between systems, the road map also recommends “crypto-agility,” or the ability to switch between cryptographic schemes. Sandbox AQ notes that standards bodies should make 6G wireless technologies, for example, inherently crypto-agile and PQC-compatible.
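One way to picture crypto-agility, with illustrative names rather than any real API: route all encryption through a registry keyed by algorithm identifier, and tag every ciphertext with that identifier, so a broken scheme can be swapped out by configuration instead of by rewriting every caller.

```python
# Sketch of crypto-agility: callers never name an algorithm directly; they
# go through a registry, and each ciphertext is tagged with the scheme that
# produced it. The toy XOR "cipher" is a placeholder, not real cryptography.
REGISTRY: dict[str, tuple] = {}

def register(name, encrypt, decrypt):
    REGISTRY[name] = (encrypt, decrypt)

def seal(name: str, key: bytes, plaintext: bytes) -> tuple[str, bytes]:
    encrypt, _ = REGISTRY[name]
    return name, encrypt(key, plaintext)  # ciphertext carries its scheme id

def open_sealed(tagged: tuple[str, bytes], key: bytes) -> bytes:
    name, ciphertext = tagged
    _, decrypt = REGISTRY[name]
    return decrypt(key, ciphertext)

def xor(key: bytes, data: bytes) -> bytes:
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

register("toy-xor-v1", xor, xor)  # a PQC scheme could be registered the same way

tagged = seal("toy-xor-v1", b"k3y", b"hello")
assert open_sealed(tagged, b"k3y") == b"hello"
```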

Sandbox AQ also helps organizations shift to PQC by conducting three-phase security audits. “The first phase is discovery, where we assess and catalog the organization’s cryptographic infrastructure to understand where any potential vulnerabilities lie,” Joseph says. “We then conduct a performance analysis in order to provide a quantum readiness evaluation and risk-based PQC migration plan.”

The next step “is the assessment phase, migrating selected candidates from IT infrastructure to demonstrate functional success and performance,” he says. “We catalog mitigation patterns that will become the standard for the full implementation.”

Finally comes “the implementation phase, which includes the complete transition of an organization’s IT infrastructure, in order of priority,” Joseph says. “It enables cryptographic agility throughout the network and enables full sovereignty over cryptographic usage.”

As dangerous as codebreaking quantum computers may prove, history shows that cryptography transitions need a considerable amount of time. For example, elliptic curve cryptography was proposed in the 1980s, and although it is far more efficient than RSA in terms of space and speed, it took more than two decades to gain widespread adoption.

“By comparison, the transition to PQC will be larger and more complex,” Joseph says. “From this frame of reference, it became clear that awareness needs to be increased and the transition process needs to start now.”

