DEV Community

Quantum Sequrity

Posted on • Originally published at quantumsequrity.com

What is Post-Quantum Cryptography?
What Cryptography Actually Protects

Before we talk about "post-quantum" anything, it helps to understand what cryptography does for you right now, every single day, without you ever thinking about it.

When you check your bank balance on your phone, cryptography prevents the person sitting next to you at the coffee shop from seeing your account number. When your doctor sends your blood test results to a specialist, cryptography keeps that data private as it crosses the internet. When a company stores its customer database, cryptography makes sure that a hacker who steals the hard drive gets a pile of unreadable noise instead of millions of credit card numbers.

Cryptography is not just about passwords or spy movies. It protects:

  • Banking transactions: trillions of dollars per day flow through encrypted channels.
  • Medical records: governed by laws like HIPAA in the US and GDPR in Europe.
  • Government communications: military orders, diplomatic cables, intelligence reports.
  • Legal documents: attorney-client privilege relies on encrypted email and storage.
  • Personal data: your photos, messages, browsing history, and location data.

All of this protection depends on mathematical problems that are very hard for computers to solve. The entire system works because breaking the math would take a conventional computer longer than the age of the universe. Post-quantum cryptography exists because quantum computers threaten to change that equation.

Key Point: Post-quantum cryptography runs on regular computers today. You do not need a quantum computer to use it. You need it to defend against quantum computers.

What Quantum Computers Actually Change

A quantum computer is not just a faster version of your laptop. It works on fundamentally different principles. Where a classical computer stores information as bits (each bit is either 0 or 1), a quantum computer uses qubits, which can exist in a superposition of both 0 and 1 simultaneously. This allows quantum computers to explore many possible solutions at the same time, rather than checking them one by one.

For most everyday tasks (browsing the web, editing documents, playing games), quantum computers offer no advantage. They are not universally faster. But for certain very specific mathematical problems, they are devastatingly effective.

The two quantum algorithms that matter for cryptography are:

  • Shor's Algorithm (1994): This can factor large numbers and compute discrete logarithms exponentially faster than any known classical algorithm. RSA, ECDH, DSA, and every other widely used public-key algorithm deployed today rely on one of those two problems. Shor's algorithm breaks them all. An RSA-2048 key that would take a classical supercomputer 300 trillion years to break could fall in hours on a sufficiently large quantum computer.
  • Grover's Algorithm (1996): This provides a quadratic speedup for unstructured search, which effectively halves the security level of symmetric algorithms like AES. AES-256, with 256-bit security against classical computers, retains 128-bit security against a quantum attacker. Since 128-bit security is still considered strong, AES-256 does not need to be replaced. The quantum threat is specifically to public-key algorithms.
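
The effect of Grover's quadratic speedup can be checked with a few lines of arithmetic (illustrative Python, not part of any cryptographic library):

```python
import math

# Grover's algorithm finds a needle among N possibilities in roughly
# sqrt(N) steps instead of N, so a k-bit symmetric key keeps only
# about k/2 bits of security against a quantum attacker.
def quantum_security_bits(classical_bits: int) -> int:
    return classical_bits // 2

# sqrt(2^256) is exactly 2^128: brute-forcing AES-256 with Grover
# still costs ~2^128 operations, which remains far out of reach.
assert math.isqrt(2 ** 256) == 2 ** 128

for bits in (128, 192, 256):
    print(f"AES-{bits}: ~{quantum_security_bits(bits)} bits of quantum security")
```

This is why AES-128 is considered marginal in a post-quantum world (64-bit quantum security) while AES-256 is fine.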

The critical distinction: Shor's algorithm does not just speed up existing attacks. It fundamentally changes the game. Problems that were computationally impossible become easy. This is why we cannot simply use bigger RSA keys. No matter how large you make the key, Shor's algorithm scales to crack it. We need entirely different math.

The Three Families of Post-Quantum Cryptography

Post-quantum cryptography (PQC) uses mathematical problems that remain hard even for quantum computers. There are three major families, each based on different math. This diversity is important: if one family turns out to have an unexpected weakness, the others provide backup.

1. Lattice-Based Cryptography

This is the most widely adopted family and includes the primary NIST standards (ML-KEM for encryption, ML-DSA for signatures). Lattice problems involve finding short or close vectors in high-dimensional geometric structures called lattices.

Think of a lattice as a grid of evenly spaced points, like the intersections on graph paper. In two dimensions, you can easily see the pattern and find the closest point to any location. But extend this to hundreds or thousands of dimensions, and the problem becomes extraordinarily difficult. No known quantum algorithm provides a significant speedup for lattice problems.

The specific problem used by ML-KEM is called "Module Learning With Errors" (Module-LWE). You are given a system of approximate equations (equations with small random errors added) and need to recover the hidden variables. The added noise makes this problem brutally hard in high dimensions. Lattice-based algorithms are fast and produce reasonably compact keys, which is why they won the NIST competition.
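
A toy (and wildly undersized) LWE instance makes the "approximate equations" idea concrete. Real ML-KEM works over module lattices of polynomial rings with far larger parameters; the numbers below are chosen only so the structure is visible:

```python
import random

# Toy Learning-With-Errors instance. The parameters are tiny for
# readability, not security: real schemes use dimensions in the hundreds.
random.seed(1)
q, n, m = 97, 4, 8                                     # modulus, secret dim, equation count

s = [random.randrange(q) for _ in range(n)]            # hidden secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public matrix
e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small random errors

# Public material: (A, b) with b = A*s + e (mod q).
# Recovering s from (A, b) is the LWE problem. Drop the error term e
# and this is ordinary linear algebra, solvable by Gaussian elimination;
# the noise is exactly what makes the problem hard.
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
print(b)
```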

2. Code-Based Cryptography

Code-based cryptography uses error-correcting codes, the same mathematical structures used to fix transmission errors in wireless communications, satellite links, and data storage. The hard problem is "decoding a random linear code." You are given a noisy codeword and must figure out the original message, but without knowing the specific error-correcting structure that was used to encode it. This problem has been studied since the 1970s (the McEliece cryptosystem dates to 1978) and remains hard for quantum computers.

NIST selected HQC (Hamming Quasi-Cyclic) as a backup standard for key encapsulation. Code-based algorithms tend to have larger key sizes than lattice-based ones, but they offer cryptographic diversity. If lattice problems are somehow broken, code-based alternatives provide a safety net.
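
To see why knowing the code's structure matters, here is a sketch with a trivial 5x repetition code: majority-vote decoding is easy precisely because the structure is known. Code-based cryptosystems hide the structure, so only the key holder can decode efficiently (this toy code has nothing to do with the actual codes HQC or McEliece use):

```python
# Toy error-correcting code: repeat each bit 5 times.
def encode(bits: list[int]) -> list[int]:
    return [b for b in bits for _ in range(5)]

# Decoding with KNOWN structure is trivial: majority vote per block.
def decode(codeword: list[int]) -> list[int]:
    return [1 if sum(codeword[i:i + 5]) >= 3 else 0
            for i in range(0, len(codeword), 5)]

msg = [1, 0, 1, 1]
noisy = encode(msg)
noisy[2] ^= 1   # flip a bit in the first block
noisy[7] ^= 1   # flip a bit in the second block
print(decode(noisy))   # [1, 0, 1, 1]: errors corrected
```

Without knowledge of the code, an attacker faces the general decoding problem, which is NP-hard and has resisted quantum speedups.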

3. Hash-Based Signatures

Hash-based signatures are the most conservative approach. Their security depends only on the properties of hash functions (like SHA-3), which are well-understood and have been studied for decades. NIST standardized SLH-DSA (based on SPHINCS+) as FIPS 205 for exactly this reason: even if lattice math turns out to have unforeseen weaknesses, hash-based signatures remain secure.

The trade-off is that hash-based signatures are larger (tens of kilobytes per signature) and slower to generate than lattice-based signatures. They serve as a critical backup rather than a primary choice for high-volume operations.
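
The design principle is easy to demonstrate with a Lamport one-time signature, an illustrative ancestor of this family (SLH-DSA builds a many-time scheme out of similar hash-based one-time components; this sketch is not SLH-DSA itself):

```python
import hashlib
import secrets

# A signature scheme whose security rests ONLY on the hash function.
H = lambda x: hashlib.sha256(x).digest()

def keygen():
    # Two random 32-byte preimages per message bit; public key = their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def msg_bits(msg: bytes) -> list[int]:
    d = int.from_bytes(H(msg), "big")
    return [(d >> i) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one preimage per bit of the message hash. ONE-TIME only:
    # a second signature leaks preimages for forgeries.
    return [sk[i][b] for i, b in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")
print(verify(pk, b"hello", sig))   # True
print(verify(pk, b"other", sig))   # False: forged message fails
```

Note the size problem in miniature: this toy signature is 256 × 32 bytes = 8 KB, the same order as a real SLH-DSA signature.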

The NIST Standardization Process

The standards for post-quantum cryptography did not come from a single company or research lab. They emerged from one of the most thorough and transparent evaluation processes in the history of cryptography.

In 2016, NIST (the U.S. National Institute of Standards and Technology) published a call for proposals. They asked the global cryptography community to submit algorithms that could resist quantum computers. Eighty-two teams from universities, government labs, and private companies worldwide responded.

Over the next eight years, NIST ran three rounds of public evaluation. During each round, cryptographers worldwide tried to break the submitted algorithms. They published papers, presented attacks, and debated trade-offs. Algorithms that showed weaknesses were eliminated. After Round 1 (2017-2019), 26 algorithms advanced. After Round 2 (2019-2020), 15 survived. After Round 3 (2020-2022), NIST selected the winners.

In August 2024, NIST published the first three final standards:

| Standard | Algorithm (Original Name) | Family | Purpose |
|----------|---------------------------|--------|---------|
| FIPS 203 | ML-KEM (CRYSTALS-Kyber) | Lattice | Key encapsulation (establishing shared secrets) |
| FIPS 204 | ML-DSA (CRYSTALS-Dilithium) | Lattice | Digital signatures (proving identity and integrity) |
| FIPS 205 | SLH-DSA (SPHINCS+) | Hash-based | Digital signatures (conservative backup) |

Additional standards are in progress. The draft FIPS 206 will standardize FN-DSA (based on FALCON), a lattice-based signature scheme with smaller signatures. NIST also selected HQC as a backup key encapsulation mechanism from the code-based family, providing algorithm diversity. The selection of algorithms from multiple mathematical families is a deliberate strategy: if one family of math turns out to be weaker than expected, the others still provide protection.

Post-Quantum vs. Quantum Cryptography

These two terms sound similar but refer to completely different things. Confusing them is one of the most common mistakes people make.

| Term | What It Is | Hardware Required | Practical Today? |
|------|------------|-------------------|------------------|
| Post-Quantum Cryptography | Software algorithms that run on regular computers, using math that quantum computers cannot crack | Your existing laptop, phone, or server | Yes. NIST standards published. Widely deployed. |
| Quantum Cryptography (QKD) | Uses quantum physics (individual photons) to distribute keys. An eavesdropper disturbs the quantum state, alerting the parties. | Specialized quantum hardware, fiber optic links, single-photon detectors | Limited. Point-to-point only. Very expensive. Cannot protect stored data. |

Post-quantum cryptography is what you can use right now, on your existing hardware, to protect your files and communications. It is a software solution. Quantum Key Distribution (QKD) is a physics experiment that works over short distances between specialized equipment. QKD cannot protect a file sitting on your hard drive, cannot work over the general internet, and costs orders of magnitude more than software-based solutions.

How We Got Here: A Brief History

The story of post-quantum cryptography begins with a single paper. In 1994, mathematician Peter Shor, then at AT&T Bell Labs, published "Algorithms for Quantum Computation: Discrete Logarithms and Factoring." This paper demonstrated that a quantum computer could factor large integers and compute discrete logarithms in polynomial time. The implications were staggering: RSA, Diffie-Hellman, and all elliptic curve cryptography would be broken.

At the time, quantum computers were purely theoretical. No one had built one that could run Shor's algorithm on anything larger than trivially small numbers. But cryptographers recognized that if large quantum computers were ever built, the consequences for global security would be catastrophic. Every encrypted communication in the world would become retroactively vulnerable.

Research into "quantum-resistant" or "post-quantum" algorithms began almost immediately. Lattice-based cryptography, one of the primary post-quantum approaches, draws on mathematical work dating back to the 1990s. The Ajtai-Dwork cryptosystem (1997) and the NTRU encryption scheme (1998) were among the earliest practical proposals. Code-based cryptography is even older; the McEliece cryptosystem dates to 1978, predating the quantum threat by 16 years.

For the next two decades, post-quantum cryptography remained an active but niche research area. That changed in 2016, when NIST issued its formal call for proposals. The eight-year evaluation that followed brought post-quantum cryptography from the academic realm into the world of international standards. The publication of FIPS 203, 204, and 205 in August 2024 marked the transition from research to deployment.

The Hybrid Approach: Belt and Suspenders

Most security organizations recommend using post-quantum algorithms alongside classical algorithms, not as a replacement. This is called the "hybrid" approach, and it is the most conservative strategy available.

The logic is simple. Classical algorithms like X25519 have been studied for over 15 years and deployed in billions of devices. We are extremely confident they are secure against regular computers. Post-quantum algorithms like ML-KEM have passed rigorous evaluation and are believed to be secure against quantum computers, but they are newer and have seen less real-world deployment.

By combining both, you get protection that is at least as strong as the stronger of the two. If ML-KEM has a hidden flaw, X25519 still protects your data against classical attackers. If a quantum computer breaks X25519, ML-KEM still protects your data against quantum attackers. An adversary must break both to succeed.

NIST published specific guidance on hybrid approaches in SP 800-227. QNSQY implements this approach in every encryption operation: ML-KEM + X25519 for key encapsulation, ML-DSA + Ed25519 for digital signatures. Both key components are fed into a key derivation function, so the final encryption key depends on both algorithms being secure.
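
A sketch of that combining step, using a hand-rolled HKDF-SHA256 (RFC 5869) over placeholder random bytes standing in for the X25519 and ML-KEM shared secrets (QNSQY's actual key schedule may differ in its labels and structure):

```python
import hashlib
import hmac
import secrets

# HKDF-SHA256 (RFC 5869) built from the standard library's hmac module.
def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    out, block = b"", b""
    for i in range((length + 31) // 32):
        block = hmac.new(prk, block + info + bytes([i + 1]), hashlib.sha256).digest()
        out += block
    return out[:length]

# Stand-ins for the two key-exchange outputs (hypothetical values).
classical_secret = secrets.token_bytes(32)   # would come from X25519
pq_secret = secrets.token_bytes(32)          # would come from ML-KEM

# Concatenate BOTH secrets into one KDF input: the derived key stays
# secret as long as EITHER input does.
prk = hkdf_extract(b"hybrid-demo-salt", classical_secret + pq_secret)
key = hkdf_expand(prk, b"file-encryption-key")
print(len(key))  # 32
```

An attacker who breaks only X25519, or only ML-KEM, still cannot reconstruct the KDF input, so the derived key is safe.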

Common Myths About the Quantum Threat

"Quantum computers are decades away, so this is not urgent"

The quantum computer itself may be decades away, but the threat is already here. The "harvest now, decrypt later" strategy means adversaries can record encrypted data today and store it until quantum computers can decrypt it. If your data must remain secret for 15 years, and a quantum computer arrives in 15 years, you needed quantum-safe encryption yesterday. NIST began the standardization process in 2016 specifically because cryptographic transitions take many years. Waiting until quantum computers exist would leave a multi-year gap during which all encrypted data is retroactively vulnerable.

"We can just use bigger RSA keys"

Shor's algorithm breaks RSA in polynomial time. This means the attack scales efficiently regardless of key size. Doubling the RSA key size does not double the difficulty for a quantum attacker. It adds only a modest increase to the computation time. There is no RSA key size large enough to resist a quantum computer running Shor's algorithm. The mathematical structure that makes RSA work is the same structure that makes it vulnerable. The only solution is different math entirely, which is what post-quantum algorithms provide.

"AES-256 is enough"

AES-256 is quantum-safe as an encryption algorithm. Grover's algorithm only reduces its effective security from 256 bits to 128 bits, which is still strong. But AES-256 is a symmetric cipher. Both the sender and receiver must already possess the same secret key. The hard part is establishing that shared key over an insecure channel. This is where RSA, ECDH, and other public-key algorithms come in, and these are the algorithms that quantum computers break. ML-KEM replaces the quantum-vulnerable key agreement step. AES-256 handles the actual data encryption. They solve different problems and are both necessary.

"Only governments need to worry about this"

Government agencies are the most obvious targets, but healthcare organizations hold patient data that must remain confidential for decades under HIPAA. Law firms hold attorney-client privileged communications with no expiration date. Financial institutions hold transaction data and account information that criminals can monetize. Pharmaceutical companies hold drug research data worth billions. Any organization with long-lived sensitive data faces the same fundamental risk.

What Should You Do About It?

The urgency of switching to post-quantum cryptography depends on how long your data needs to stay secret. Think about three questions:

  1. How long does this data need to remain confidential? Medical records might be sensitive for 50+ years. A corporate strategy document might matter for 5 years. A credit card number expires in 3 years.
  2. How soon could a quantum computer break current encryption? Expert estimates range from 10 to 20 years, though some believe it could happen sooner.
  3. Could someone be recording this data now to decrypt later? Nation-state intelligence agencies have both the motivation and the storage capacity to do this.

If the sensitivity lifetime of your data exceeds the expected arrival of quantum computers, you should already be using post-quantum cryptography. For most sensitive data (medical records, legal documents, government secrets, long-term business plans), the math says the time to switch is now.
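
Those three questions are essentially Mosca's inequality, which reduces the decision to one comparison. The numbers below are illustrative inputs, not predictions:

```python
# Mosca's inequality: if (years the data must stay secret) +
# (years the migration will take) exceeds (years until a capable
# quantum computer), data recorded today is already at risk.
def at_risk(shelf_life_years: float,
            migration_years: float,
            quantum_eta_years: float) -> bool:
    return shelf_life_years + migration_years > quantum_eta_years

# Medical records: 25-year secrecy, 5-year migration, 15-year quantum ETA.
print(at_risk(25, 5, 15))   # True: should have migrated already
# Short-lived data: 3-year secrecy, 2-year migration.
print(at_risk(3, 2, 15))    # False: no retroactive exposure
```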

Practically, transitioning does not have to be painful. Tools like QNSQY use post-quantum algorithms by default, with no configuration required. You encrypt a file, and it automatically receives ML-KEM + X25519 hybrid protection. The free tier provides full post-quantum encryption with ML-KEM-512.

The Performance Question

A common concern about post-quantum cryptography is performance. Post-quantum algorithms generally have larger keys and signatures than their classical counterparts. An ML-KEM-768 public key is 1,184 bytes, compared to 32 bytes for X25519. An ML-DSA-65 signature is 3,309 bytes, compared to 64 bytes for Ed25519.

However, the computational speed tells a different story. ML-KEM key generation and encapsulation are faster than RSA operations at comparable security levels. On modern hardware, an ML-KEM-768 encapsulation takes approximately 40 microseconds. AES-256-GCM data encryption, which runs at several gigabytes per second on hardware with AES-NI support, dominates the total encryption time for any file larger than a few kilobytes. The post-quantum key agreement step is a one-time cost per encryption, and it is negligible compared to the data encryption itself.

For network protocols like TLS, the larger key sizes add bytes to the handshake, but real-world deployments (Google Chrome, Cloudflare) have confirmed that the impact on page load times is imperceptible to users. The additional 1-2 kilobytes in the TLS handshake are absorbed within the noise of normal network latency.

For data encryption specifically, the overhead is even less significant. QNSQY stores the ML-KEM ciphertext and X25519 public key in the file header, adding roughly 1,200 bytes to the total file size. For a 1 MB file, that is 0.12% overhead. For a 100 MB file, it rounds to zero.
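
The arithmetic behind those percentages, using the roughly 1,200-byte header figure mentioned above:

```python
# Fixed per-file header cost for hybrid encryption (ballpark figure:
# ML-KEM ciphertext plus X25519 public key stored in the file header).
HEADER_BYTES = 1_200

def overhead_pct(file_size_bytes: int) -> float:
    """Header size as a percentage of the original file size."""
    return 100 * HEADER_BYTES / file_size_bytes

print(f"{overhead_pct(1_000_000):.2f}%")     # 0.12% for a 1 MB file
print(f"{overhead_pct(100_000_000):.4f}%")   # 0.0012% for a 100 MB file
```

Because the cost is a constant number of bytes rather than a multiplier, it shrinks toward zero as files grow.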

Sources

  1. NIST FIPS 203: Module-Lattice-Based Key-Encapsulation Mechanism Standard
  2. NIST FIPS 204: Module-Lattice-Based Digital Signature Standard
  3. NIST FIPS 205: Stateless Hash-Based Digital Signature Standard
  4. NIST IR 8105: Report on Post-Quantum Cryptography
  5. NIST Post-Quantum Cryptography Standardization Project
