What Is the Future of Quantum-Proof Encryption?

Bright, according to officials at NIST’s Post-Quantum Cryptography program


The first four algorithms that NIST has announced for post-quantum cryptography are based on structured lattices and hash functions, two families of math problems that could resist a quantum computer’s assault.

N. Hanacek/NIST

On Tuesday, the National Institute of Standards and Technology (NIST) announced its first quantum-resistant algorithms—new encryption schemes that will become the standard for guarding against attacks by quantum computers, which are not yet here. The four algorithms are CRYSTALS-Kyber, for general encryption, and three schemes for digital signatures: CRYSTALS-Dilithium, FALCON, and SPHINCS+.

Over the past few decades, NIST has managed encryption standards, introducing and vetting the schemes that protect and authenticate valuable digital information—from bank transactions to emails to your Netflix password. These encryption schemes are easy for the user to encode and decode, but hard for an attacker to break. This one-way functionality is like mixing colors: It’s easy to mix shades of blue and yellow to make green, but hard to tell, just by looking at the resulting green, which shades were used to create it.
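
To make that one-way property concrete, here is a minimal sketch (our illustration, not NIST’s) of textbook RSA with deliberately tiny, insecure numbers: encrypting and decrypting each take a single modular exponentiation, while an attacker without the private key must factor the modulus.

```python
# Textbook RSA with tiny, insecure parameters -- illustration only.
p, q = 61, 53                      # secret primes
n = p * q                          # public modulus: 3233
e = 17                             # public exponent
phi = (p - 1) * (q - 1)            # computable only if you can factor n
d = pow(e, -1, phi)                # private exponent: 2753 (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # easy: one modular exponentiation
recovered = pow(ciphertext, d, n)  # easy *with* the private key d
assert recovered == message

# Without d, an attacker must factor n to compute phi. That is trivial
# for n = 3233 but classically infeasible for a 2048-bit modulus -- and
# factoring is exactly the step Shor's algorithm would make fast on a
# large quantum computer.
```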

While these methods have been robust against classical attacks, they are known to be vulnerable to quantum algorithms. Quantum computers capable of running those algorithms against existing encryption are still a ways off, but researchers say there’s no time to wait. Post-quantum cryptography is the future, and it’s here now.

Last week, we spoke with Dustin Moody, a mathematician at NIST leading the post-quantum cryptography standardization process.

What is post-quantum cryptography, and why does NIST need a standardization process for it?

Dustin Moody

NIST

Dustin Moody: Researchers from a variety of backgrounds have been working on building what’s called a quantum computer. If a quantum computer is built that is large enough, there’s an algorithm you could run on this quantum computer that would break several of the most widely used cryptosystems that we have implemented and use today around the world. So post-quantum crypto is preparing new cryptosystems to replace those that would be vulnerable to attacks from a big enough quantum computer. And NIST is doing this because a few of the cryptosystems that we have standardized, namely the ones dealing with public-key cryptography, would be vulnerable. So we want to replace those standards with new ones that would not be vulnerable to attacks from a quantum computer.

“You can actually be at risk from a quantum computer, even though a [high-performance] quantum computer does not yet exist. This is often called ‘harvest now, decrypt later.’”
—Dustin Moody, NIST

Now, there are two separate things you’re replacing, as I understand it; there’s both a key and a signature. Can you talk about both of those—and the difference between the two?

Moody: With cryptography, you have many different kinds of tools that are needed to protect your information. One of these that people first think of is just encryption and scrambling your data, so your enemy can’t read it. That is typically enabled using public-key encryption. You then create a shared secret that you will actually use for a key for another type of cryptosystem that can do encryption a lot faster, a block cipher called AES. But this public-key encryption is used to create that initial key. So that’s the first functionality that we need to replace. The second is digital signatures. Instead of signing your name at the bottom of the letter to prove that you are the author, you can digitally do the same thing using cryptographic techniques. And that is also known to be vulnerable to a quantum computer. So it’s the second functionality that we’re trying to replace.
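
As a rough sketch of how those two pieces fit together, the Python below walks through the hybrid flow Moody describes: a key-encapsulation mechanism (KEM) establishes the shared secret, and AES-GCM then encrypts the bulk data. The pq_keygen, pq_encapsulate, and pq_decapsulate functions are hypothetical placeholders (implemented insecurely here just so the sketch runs) standing in for a real post-quantum KEM such as CRYSTALS-Kyber; the AES-GCM calls are from the real Python cryptography package.

```python
# Hybrid encryption sketch: a KEM establishes a shared secret, which then
# keys AES for fast bulk encryption. The pq_* functions are INSECURE
# placeholders standing in for a real post-quantum KEM such as Kyber.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def pq_keygen():
    # Placeholder: a real KEM returns a genuine (public, private) key pair.
    key = os.urandom(32)
    return key, key

def pq_encapsulate(public_key):
    # Placeholder XOR "encapsulation" -- NOT secure, just runnable.
    shared_secret = os.urandom(32)
    kem_ciphertext = bytes(a ^ b for a, b in zip(shared_secret, public_key))
    return kem_ciphertext, shared_secret

def pq_decapsulate(private_key, kem_ciphertext):
    return bytes(a ^ b for a, b in zip(kem_ciphertext, private_key))

# Receiver publishes a public key; sender encapsulates against it.
public_key, private_key = pq_keygen()
kem_ciphertext, shared_secret = pq_encapsulate(public_key)

# Sender: the 32-byte shared secret becomes an AES-256-GCM key.
nonce = os.urandom(12)
bulk_ciphertext = AESGCM(shared_secret).encrypt(nonce, b"the actual message", None)

# Receiver: recover the same secret from the KEM ciphertext, then decrypt.
recovered_secret = pq_decapsulate(private_key, kem_ciphertext)
assert AESGCM(recovered_secret).decrypt(nonce, bulk_ciphertext, None) == b"the actual message"
```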

The largest quantum computer is about 200-something qubits right now, and they’re not fault-tolerant. The largest number that one of these machines has factored with Shor’s algorithm is 21. There are various estimates out there for what it would take to crack RSA 2048, but you would need thousands of qubits at a minimum, and they would all need to be very, very high quality. What’s the rush? Why are we doing this now? You started this process in 2016. This is years, if not decades, ahead of time.

Moody: First, standardizing cryptosystems and then getting them into products around the world takes a surprisingly long amount of time. From our past experience, switching from one crypto algorithm to another can take 10 to 20 years. And since we need to have this done before a quantum computer comes out, we’ve got to get all that work done ahead of time. And it can’t be rushed too quickly. You don’t have confidence in a cryptosystem if it was invented two weeks ago. We need years of people analyzing these algorithms to have confidence in their security. That’s reason one: It takes time to get ready. And then reason two is you can actually be at risk from a quantum computer, even though a quantum computer does not yet exist. This is often called “harvest now, decrypt later.” It’s the idea that your enemy could copy your data, which is encrypted, and they can hold onto it right now. They can’t read it. But maybe a quantum computer comes out in 10 years, and then they can get access to your data. If the information you’re protecting is valuable enough, then you’re already in trouble because of that threat. So that’s another reason that we need to be well ahead of the time that a quantum computer is built.

How real is harvest now, decrypt later? I understand that there is the possibility, but is this really happening? Are people downloading data and squirreling it away so that when they get their quantum computers up and running, they can go through it?

Moody: So, I don’t have access to classified material. NIST doesn’t operate at that level; the NSA and other agencies handle that. But people I have talked to say that this is absolutely a threat and that it is occurring now at the nation-state level.

“We’re going to standardize a number of things so that we have a diversity of different mathematical problems to base our security on.”
—Dustin Moody, NIST

The vulnerability in public-key encryption comes from a quantum computer’s ability to factor large numbers almost exponentially faster than a classical machine. What types of strategies are being used to create these new public-key schemes? Can you run through a few of them?

Moody: In crypto, we like to use ideas that have been around for a while. Since [Peter] Shor’s algorithm was discovered back in the ’90s, researchers have been looking into this. There are three big families where a lot of the solutions are coming from. The most popular one involves what are called lattices. This is a mathematical structure. You take basis vectors; you take integral linear combinations of them. And you can do some pretty interesting things cryptographically. There are no known quantum algorithms that do better than the classical attacks on cryptosystems based on lattices. Cryptosystems based on lattices seem to be the top contenders in terms of key size and efficiency. So it wasn’t surprising that we received a lot of lattice submissions.

The second family is based on what are called error-correcting codes, or code-based cryptography. These codes have been used in information security for a long time, because data gets sent on noisy channels. If you use error-correcting codes, you can account for the error and recover the message that was originally meant to be sent. We use ideas similar to what’s used in lattices with these codes. And there, they seem to be just a half-step behind the lattices in terms of key sizes and efficiency, but they’re pretty good as well. So we saw a number of code-based submissions.

The third biggest family includes some signature schemes based on what’s called multivariate cryptography. You’ll use a multitude of variables, x1 to xn, and create a system of quadratic equations. And it’s very easy to define and evaluate one of these systems, but very hard to solve. So it turns out that works well for digital-signature schemes.
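
To give a flavor of the lattice family, here is a toy version of Regev-style encryption built on the learning-with-errors problem, the kind of problem that (in a structured form) underlies CRYSTALS-Kyber. This is our own sketch with insecurely small parameters, not the actual Kyber construction: the public key hides the secret vector behind a little added noise, and stripping that noise out without the secret is the hard lattice problem.

```python
# Toy Regev-style encryption of one bit from the learning-with-errors
# (LWE) problem. Parameters are insecurely small -- illustration only.
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 16, 64, 3329             # dimension, samples, modulus (toy sizes)

# Key generation: publish (A, b) with b = A*s + e (mod q). Recovering the
# secret s from (A, b) despite the small noise e is the LWE problem,
# believed hard for both classical and quantum computers.
s = rng.integers(0, q, n)          # secret vector
A = rng.integers(0, q, (m, n))     # public random matrix
e = rng.integers(-2, 3, m)         # small error vector, entries in [-2, 2]
b = (A @ s + e) % q                # public vector

def encrypt(bit):
    r = rng.integers(0, 2, m)      # random 0/1 combination of the samples
    u = (A.T @ r) % q
    v = (b @ r + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    x = (v - u @ s) % q                   # equals e.r + bit*(q//2), mod q
    return int(q // 4 < x < 3 * q // 4)   # near q/2 -> 1, near 0 or q -> 0

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
```

Real lattice schemes use much larger, structured parameters and encrypt many bits at once, but the hide-the-secret-behind-small-noise idea is the same.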

Sort of like if you had a big squiggly line on a graph? It’s easy to draw, but it’s hard to actually figure out what function was used to plot it.

Moody: Yeah, that’s the idea in a lot of cryptosystems. There’s a one-way thing—if you know the secret, you can do it quickly and easily. That allows you to decrypt or sign. An attacker doesn’t know that secret, and they can’t do anything.

A signature system called Rainbow was recently broken. I understand roughly how you might test a classical encryption system, because we have classical computers. But we don’t have quantum computers. So how do you verify that these methods are actually quantum secure?

Moody: There’s no guarantee that someone isn’t going to come along and come up with some new idea for an attack. That’s the case with the cryptosystems we use today on classical computers as well.

Sure, I know RSA isn’t provably hard.

Moody: When you add quantum computers to the mix, you get access to more tools, more quantum algorithms. And the best we can say is that a lot of smart people have spent time looking at these, nobody has found any attacks, and they don’t seem to be getting close. The avenues don’t appear to be useful at all. That’s how we set up our competition—make sure that we had experts in the field spend a number of years looking at these cryptosystems and see that there are no attacks. And actually, Rainbow—it wasn’t a quantum attack that broke it; it was a classical attack. So these things have to be secure against classical attacks as well as against quantum computers.

So at this point in the process, you really have narrowed it down to a couple of finalists from an initial selection. [Editor’s note: This took place prior to the July 5th announcement.]

Moody: Yeah, we started out with 82. And right now we have seven finalists and eight alternates at the end of round three, which is going to end within a very short time, probably a week or two, when we’ll announce the ones that we will standardize.

What happens then?

Moody: We’re going to begin writing standards. These will be the algorithms that will be most widely used. We will also name a few that we’ll continue on into a fourth round of evaluation so that we have a few other algorithms. Because this is a new research field, we don’t want to put all our eggs in one basket and only have lattice algorithms, and then an attack comes along and we don’t have anything else. We’re going to standardize a number of things so that we have a diversity of different mathematical problems to base our security on. We’ll start writing the standards, which will take a year or two, and there’ll be a few that continue to be evaluated and might be selected for standardization at the end of the fourth round.

What’s the difference between finalists and those getting standardized?

Moody: Once we’ve named these algorithms, people know that these are the ones that are going to be in widespread use. There are already implementations, but people will be able to focus on the standardized ones. It’s an important step on the road to adoption. The industry needs standards before adoption can really become widespread.

I understand that there are some patent disputes related to the algorithms.

Moody: In crypto, most people don’t like patents, because they tend to hinder adoption. Companies don’t like to adopt algorithms that might be patented. At the beginning of our process, all the submitters had to let us know of any patents that they had. We were aware of some patents, which were not from the submitters—they were from some third parties. NIST has been in talks with these other parties to negotiate a solution that would be beneficial to both parties. I think you’ll see with our announcement that we’re happy with the outcome. We feel that the algorithms will be able to be widely adopted by people around the world.

The Conversation (1)

Anna Johnston, 15 Jul 2022

Several questions and concerns:

1. The largest quantum computer is 127 qubits (IBM Eagle).

2. Error correction is the biggest hurdle. We haven't been able to create a single logical qubit with the time capacity needed for Shor's algorithm. In 2021 errors in qubits were found to be correlated due to gamma/cosmic ray interference, making the unsolved problem even more difficult.

3. Time frame for a QC that can attack smaller EC-based systems: no one knows when, or even if, it will happen. Yet somehow the 'if' is frequently ignored.

4. 'Harvest now, decrypt later' is a VERY old scheme (see the NSA's article on VENONA: material collected in the 1940s and decrypted decades later), independent of quantum, and a weak argument. By this argument, we should never send any sensitive data due to the chance it could be stored and broken by some as-yet-undiscovered tech.

5. As far as existing PKC goes, finite-field discrete-logarithm-based systems are the most resistant to Shor's algorithm, while elliptic-curve-based systems are the least. If NIST is so concerned about security, why aren't we 'shoring' up our security by switching to finite-field-based systems?