Quantum Sequrity

Posted on • Originally published at quantumsequrity.com

Classical vs Quantum-Safe Encryption: Full Comparison

Why This Comparison Matters Now

With NIST finalizing the first post-quantum cryptography standards on August 13, 2024 (FIPS 203, FIPS 204, and FIPS 205), organizations face a concrete decision: continue relying solely on classical cryptography, or begin transitioning to quantum-safe algorithms. Making that decision requires understanding the real differences between these two families of algorithms, not just in theoretical security, but in key sizes, performance, maturity, and practical deployment considerations.

This post provides a side-by-side comparison of classical encryption (RSA, ECC, Ed25519, X25519) and the NIST-standardized post-quantum replacements (ML-KEM, ML-DSA). If you are new to post-quantum cryptography, our introduction to what post-quantum cryptography is provides useful background.

Overview Comparison

The following table summarizes the key differences across the most important dimensions:

| Aspect | Classical (RSA / ECC) | Post-Quantum (ML-KEM / ML-DSA) |
| --- | --- | --- |
| Security basis | Integer factorization (RSA); elliptic curve discrete logarithm (ECC) | Module Learning With Errors (lattice problems) |
| Quantum resistance | No -- broken by Shor's algorithm | Yes -- no known efficient quantum attack |
| Maturity | Decades of deployment and analysis (RSA: 1977, ECC: 1990s) | Newly standardized (FIPS 203/204: August 2024); underlying math studied since ~2005 |
| Standards | FIPS 186-5 (DSA/ECDSA), PKCS#1 (RSA), RFC 7748 (X25519), RFC 8032 (Ed25519) | FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), FIPS 205 (SLH-DSA) |
| Key sizes | Small (32-512 bytes) | Larger (800-2,592 bytes) |
| Ciphertext / signature sizes | Small (64-512 bytes) | Larger (768-4,627 bytes) |
| Performance | ECC very fast; RSA verification fast, signing slower | ML-KEM encapsulation/decapsulation very fast; ML-DSA signing/verification fast |
| Widely deployed | Yes -- virtually all internet infrastructure | Early adoption phase; growing library and protocol support |

Security Basis: What Makes Them Hard to Break

Classical: Factoring and Discrete Logarithms

RSA's security relies on the difficulty of factoring the product of two large prime numbers. Given a 2048-bit modulus n, finding its prime factors p and q is computationally infeasible for classical computers. The best classical factoring algorithms (General Number Field Sieve) run in sub-exponential time.
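To make the asymmetry concrete, here is a toy sketch (not real cryptography): trial division cracks a textbook-sized modulus instantly, but the search space grows exponentially with the bit length of n, which is why 2048-bit moduli remain out of classical reach even for far better algorithms like GNFS.

```python
# Toy illustration of the factoring problem. Trial division works for tiny
# moduli but scales exponentially in the bit length of n; real RSA moduli
# (2048+ bits) are infeasible even for sub-exponential methods like GNFS.

def trial_division(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n, assuming n is an odd semiprime."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("no odd factor found")

# 3233 = 53 * 61, the classic textbook RSA modulus
p, q = trial_division(3233)
print(p, q)  # 53 61
```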

ECC-based schemes (ECDH, ECDSA, Ed25519, X25519) rely on the elliptic curve discrete logarithm problem (ECDLP). Given a point on an elliptic curve that is the result of multiplying a base point by an unknown scalar, finding that scalar is computationally hard. ECC achieves equivalent security to RSA with much smaller keys: 256-bit ECC is roughly equivalent to 3072-bit RSA.

Both problems are broken in polynomial time by Shor's algorithm on a quantum computer. For a deeper explanation of how this works, see our post on why quantum computers threaten classical encryption.

Post-Quantum: Lattice Problems

ML-KEM and ML-DSA are based on the Module Learning With Errors (MLWE) problem, a variant of the Learning With Errors (LWE) problem applied to polynomial modules over lattices. The underlying hardness assumption is that finding short vectors in high-dimensional lattices is computationally intractable, even for quantum computers. Lattice problems have been studied by mathematicians since the 19th century, and no quantum algorithm is known to solve them significantly faster than the best classical algorithms.
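A toy LWE instance shows the structure of the problem (parameters here are illustrative and far smaller than anything ML-KEM uses): given the matrix A and the noisy products b, recovering the secret s is easy at this scale but becomes intractable as the dimension grows.

```python
import random

# Toy LWE instance: b = A @ s + e (mod q), where e is small noise.
# Without the noise this is linear algebra; with it, recovering s from
# (A, b) at cryptographic dimensions is believed hard even for quantum
# computers. ML-KEM uses a structured (module) variant of this problem.

random.seed(0)
q, n, m = 97, 4, 6             # modulus, secret dimension, sample count

s = [random.randrange(q) for _ in range(n)]                   # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]             # small noise

b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
print(len(b), all(0 <= x < q for x in b))
```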

SLH-DSA (FIPS 205) takes a different approach: its security is based solely on the properties of hash functions (specifically, the security of the hash function family used). This makes SLH-DSA extremely conservative, as it relies on one of the most well-understood building blocks in all of cryptography.

Key Size Comparison

One of the most visible differences between classical and post-quantum algorithms is key and ciphertext/signature sizes. Post-quantum algorithms generally require larger keys. Here are the exact sizes from the NIST standards:

Key Encapsulation (Key Exchange)

| Algorithm | Public Key | Ciphertext | Shared Secret | Security Level |
| --- | --- | --- | --- | --- |
| X25519 | 32 bytes | 32 bytes | 32 bytes | ~128-bit classical |
| RSA-2048 | 256 bytes | 256 bytes | Varies | ~112-bit classical |
| RSA-4096 | 512 bytes | 512 bytes | Varies | ~140-bit classical |
| ML-KEM-512 | 800 bytes | 768 bytes | 32 bytes | 128-bit quantum (NIST Level 1) |
| ML-KEM-768 | 1,184 bytes | 1,088 bytes | 32 bytes | 192-bit quantum (NIST Level 3) |
| ML-KEM-1024 | 1,568 bytes | 1,568 bytes | 32 bytes | 256-bit quantum (NIST Level 5) |
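These numbers are not arbitrary: they fall directly out of the ML-KEM parameter sets in FIPS 203. A quick sketch recomputing the public-key and ciphertext sizes from the module rank k and the compression parameters d_u and d_v:

```python
# Recompute ML-KEM sizes from the FIPS 203 parameter sets.
# Public key: k packed degree-255 polynomials (384 bytes each) + 32-byte seed.
# Ciphertext: compressed u vector (d_u bits/coeff, k polys) + compressed v.

PARAMS = {  # name: (k, d_u, d_v)
    "ML-KEM-512":  (2, 10, 4),
    "ML-KEM-768":  (3, 10, 4),
    "ML-KEM-1024": (4, 11, 5),
}

def mlkem_sizes(k: int, d_u: int, d_v: int) -> tuple[int, int]:
    pk = 384 * k + 32             # encapsulation key bytes
    ct = 32 * (d_u * k + d_v)     # ciphertext bytes (256 coeffs = 32 * d bytes)
    return pk, ct

for name, (k, du, dv) in PARAMS.items():
    print(name, mlkem_sizes(k, du, dv))
```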

Digital Signatures

| Algorithm | Public Key | Signature | Security Level |
| --- | --- | --- | --- |
| Ed25519 | 32 bytes | 64 bytes | ~128-bit classical |
| RSA-2048 | 256 bytes | 256 bytes | ~112-bit classical |
| ML-DSA-44 | 1,312 bytes | 2,420 bytes | 128-bit quantum (NIST Level 2) |
| ML-DSA-65 | 1,952 bytes | 3,309 bytes | 192-bit quantum (NIST Level 3) |
| ML-DSA-87 | 2,592 bytes | 4,627 bytes | 256-bit quantum (NIST Level 5) |

Key Size Context
While ML-KEM and ML-DSA keys are larger than ECC keys, they are the same order of magnitude as RSA keys at comparable security levels: ML-KEM-768's 1,184-byte public key is roughly twice the size of an RSA-4096 public key, and smaller than a typical DER-encoded RSA-4096 private key. The size increase is a manageable trade-off for quantum resistance.

Performance

Performance is often cited as a concern when discussing post-quantum algorithms, but the reality is more nuanced than many expect:

Key Encapsulation Performance

ML-KEM is fast. In benchmarks, ML-KEM-768 key generation, encapsulation, and decapsulation each complete in microseconds on modern hardware. This is dramatically faster than RSA key generation (which involves searching for large random primes) and comparable to or faster than X25519 for the encapsulation/decapsulation operations. ML-KEM's arithmetic consists of matrix-vector products of polynomials in a small ring, accelerated by the number-theoretic transform (NTT), which maps efficiently to modern CPU instructions.

Digital Signature Performance

ML-DSA signing and verification are also fast, completing in microseconds on modern CPUs. Signing speed is comparable to Ed25519, and verification is similarly quick. RSA is the outlier: verification with a small public exponent is fast, but signing requires a full private-key exponentiation over a large modulus and is noticeably slower than either Ed25519 or ML-DSA.

The performance difference you are most likely to notice in practice is not in computation but in bandwidth and storage: larger keys and signatures mean more data transmitted and stored. For network protocols like TLS, the larger handshake sizes add some latency, particularly on constrained or high-latency connections. For data encryption (the primary use case for QNSQY), the overhead is negligible relative to the file data itself.
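A back-of-envelope calculation makes the file-encryption case concrete. The header layout below is hypothetical (real formats add framing, salts, and metadata), but it uses the sizes from the tables above and the magnitudes hold:

```python
# Rough storage overhead of a hybrid-encrypted file: one ML-KEM-768
# ciphertext, one X25519 ephemeral public key, and a hybrid signature pair.
# Layout is illustrative, not any specific tool's actual file format.

MLKEM768_CT, X25519_PK = 1088, 32        # key-establishment material
MLDSA65_SIG, ED25519_SIG = 3309, 64      # hybrid signature pair

header = MLKEM768_CT + X25519_PK + MLDSA65_SIG + ED25519_SIG
file_size = 1 * 1024 * 1024              # a 1 MiB file

print(header, f"{header / file_size:.2%}")  # 4493 0.43%
```

Under half a percent on a 1 MiB file, and proportionally less on anything larger.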

Maturity and Confidence

Classical: Decades of Battle-Testing

RSA has been in production use since the 1980s. Diffie-Hellman was published in 1976. ECC adoption grew through the 2000s and 2010s. These algorithms have been scrutinized by thousands of researchers over decades. Vulnerabilities that have been found (such as padding oracle attacks on RSA PKCS#1 v1.5, or weak curve parameters) have been addressed through improved implementations and updated standards. The mathematical hardness assumptions have held up against classical attacks.

This depth of analysis provides high confidence in classical algorithms against classical threats. The problem, of course, is that this confidence does not extend to quantum threats.

Post-Quantum: New Standards, Established Math

ML-KEM and ML-DSA are newly standardized (August 2024), but the underlying mathematical problems are not new. The Learning With Errors problem was introduced by Oded Regev in 2005 and has been extensively studied. The CRYSTALS-Kyber submission (which became ML-KEM) was submitted in response to NIST's 2016 call for proposals and underwent roughly eight years of public review, analysis, and multiple rounds of evaluation before standardization.

No practical attack has been found against ML-KEM or ML-DSA at their specified parameter sets. However, the shorter track record compared to RSA or ECC means there is inherently less accumulated confidence. This is precisely why the hybrid approach is recommended during the transition period.

Why Hybrid Is the Best Approach

Given the trade-offs between classical maturity and post-quantum resistance, combining both in a hybrid scheme provides the strongest overall security guarantee:

  • If classical algorithms are broken by quantum computers (expected), the post-quantum component (ML-KEM or ML-DSA) provides protection.
  • If a post-quantum algorithm is found to have a weakness (unlikely but possible with newer algorithms), the classical component (X25519 or Ed25519) provides protection against non-quantum attackers.
  • An attacker must defeat both algorithms simultaneously to compromise the data. The overall security level is at least as strong as the stronger of the two components.
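The combiner behind this guarantee can be as simple as concatenating the two shared secrets and running them through a key derivation function. Here is a minimal sketch, assuming both key agreements have already produced 32-byte secrets (the placeholder inputs stand in for real X25519 and ML-KEM-768 outputs):

```python
import hashlib
import hmac

# Hybrid combiner sketch: HKDF-SHA256 (RFC 5869) over the concatenation of
# both shared secrets. The derived key stays secure as long as EITHER input
# remains secret. Input secrets below are placeholders, not real outputs.

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()            # extract
    okm, block = b"", b""
    counter = 1
    while len(okm) < length:                                      # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

ss_classical = b"\x01" * 32   # placeholder: X25519 shared secret
ss_pq        = b"\x02" * 32   # placeholder: ML-KEM-768 shared secret

session_key = hkdf_sha256(ss_classical + ss_pq, salt=b"demo-salt", info=b"session")
print(len(session_key))  # 32
```

An attacker who learns one of the two input secrets but not the other still cannot reconstruct the HKDF input, so the derived key survives.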

QNSQY implements hybrid encryption for all operations: ML-KEM + X25519 for key encapsulation, and ML-DSA + Ed25519 for digital signatures. For how hybrid encryption works, see why hybrid encryption matters. For the ML-KEM side of the hybrid, see ML-KEM explained.

Migration Considerations

Moving from classical to post-quantum cryptography involves practical considerations beyond algorithm selection:

Protocol and Format Changes

Larger keys and signatures affect protocol message sizes. TLS handshakes become larger. Certificate chains grow. File headers expand. Systems with hard-coded buffer sizes or strict payload limits may need updates. Testing interoperability across updated and non-updated systems is essential.

Key Management

Post-quantum keys are larger and may require updates to key storage systems, hardware security modules (HSMs), and key distribution protocols. Organizations using smart cards or constrained devices should evaluate whether those devices can handle the larger key sizes.

Compliance and Standards Alignment

FIPS 203 and FIPS 204 are now official NIST standards. Organizations subject to FIPS compliance (government agencies, contractors, healthcare, financial services) should incorporate these into their cryptographic policies. NIST has also published transition guidance recommending a phased approach.

Incremental Adoption

Full infrastructure migration takes years. A practical approach is to start with hybrid mode: deploy post-quantum algorithms alongside classical ones. This provides immediate quantum resistance for new data while maintaining backward compatibility and avoiding a disruptive cutover.

For data encryption, the migration path is more straightforward. Tools like QNSQY allow you to encrypt individual files with hybrid PQC today, without changing your entire infrastructure. Files encrypted now with ML-KEM + X25519 will remain secure against both classical and quantum attacks for the foreseeable future.

Real-World Deployment Status

Post-quantum algorithms are no longer theoretical. Real-world deployment is already underway across major platforms and protocols.

TLS and Web Browsers

Google Chrome and Cloudflare began experimental deployments of hybrid key exchange using ML-KEM (then called Kyber) in TLS 1.3 connections during 2023 and 2024. These experiments demonstrated that hybrid handshakes with ML-KEM-768 + X25519 complete successfully at scale, with the primary impact being a modest increase in handshake size (roughly 1,100 additional bytes for the KEM ciphertext). Connection latency impact was minimal on broadband connections but measurable on high-latency mobile networks.
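The arithmetic behind those handshake-size figures is straightforward. Exact on-the-wire numbers vary with TLS extension framing; this sketch counts only the raw cryptographic payloads from the key-size table:

```python
# Per-direction size deltas for a hybrid X25519 + ML-KEM-768 key share in
# TLS 1.3: the client sends both public values, the server sends both
# responses. Extension framing adds a few more bytes in practice.

X25519_SHARE = 32
MLKEM768_PK, MLKEM768_CT = 1184, 1088

classical_client = classical_server = X25519_SHARE
hybrid_client = X25519_SHARE + MLKEM768_PK      # 1216 bytes in ClientHello
hybrid_server = X25519_SHARE + MLKEM768_CT      # 1120 bytes in ServerHello

extra = (hybrid_client + hybrid_server) - (classical_client + classical_server)
print(extra)  # 2272
```

The server-side delta of 1,088 bytes is the "roughly 1,100 additional bytes" figure for the KEM ciphertext; the round trip as a whole grows by about 2.2 KB.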

Signal Protocol

The Signal messaging application updated its X3DH key agreement protocol to include a post-quantum component (PQXDH) using ML-KEM-768 in September 2023. This was one of the first major consumer-facing deployments of post-quantum cryptography, protecting billions of messages against future quantum decryption.

Government Adoption

The U.S. government, through NSA CNSA 2.0, has mandated ML-KEM-1024 and ML-DSA-87 for all National Security Systems, with phased deadlines beginning in 2025. Federal agencies are actively conducting cryptographic inventories under OMB Memorandum M-23-02 and beginning migration of their highest-priority systems.

Library Support

Major cryptographic libraries have added PQC support. The Open Quantum Safe (liboqs) project provides C implementations of all NIST-standardized algorithms. OpenSSL 3.x includes PQC provider support. BoringSSL (used in Chrome and Android) has integrated ML-KEM. AWS, Azure, and Google Cloud have begun offering PQC-enabled TLS endpoints for their services. For organizations seeking to protect files at rest without infrastructure changes, standalone tools like QNSQY provide hybrid PQC encryption that can be deployed immediately alongside existing systems.

Summary: Which Should You Use?

| Scenario | Recommendation |
| --- | --- |
| New systems or applications | Use hybrid: ML-KEM + X25519 (key exchange), ML-DSA + Ed25519 (signatures) |
| Data that must remain confidential 10+ years | Encrypt now with post-quantum or hybrid algorithms |
| Legacy systems that cannot be updated | Prioritize migration planning; use PQC at the boundaries where possible |
| Short-lived data (session keys, ephemeral tokens) | Classical ECC is acceptable today, but hybrid adds low-cost future-proofing |
| Regulatory/compliance-driven environments | Follow NIST FIPS 203/204 guidance; begin hybrid deployment |

The bottom line: classical cryptography served us well for decades but does not survive the quantum transition. Post-quantum algorithms are standardized, performant, and ready for deployment. Hybrid mode combines the strengths of both. There is no technical reason to delay adoption.


Start Using Quantum-Safe Encryption Today

QNSQY provides hybrid ML-KEM + X25519 and ML-DSA + Ed25519 encryption across all tiers.

