Some days you open Hacker News and find a post that makes you feel like you know almost nothing about something you thought you had a decent grasp on. This week was one of those days.
A quantum cryptographer posted a technical analysis on the current state of quantum computing applied to cryptography. 289 points. 340 comments. I understood maybe 30% of the thread. And I'd already spent over an hour trying to follow it.
The question that kept rattling around in my head isn't abstract: when should I — a full-stack developer who deploys on Railway, thinks in milliseconds of response time, and last week optimized a Next.js app from 3 seconds down to 300ms — actually start worrying in practical terms?
I don't have the answer. But that's exactly why this post is worth writing.
The popular analogy goes: quantum computing is like a locksmith who, instead of trying keys one by one, somehow tries all possible keys at the same time. That's an oversimplification (Shor's algorithm exploits mathematical structure in the problem rather than brute-forcing in parallel), but the consequence holds: what today would take millions of years to crack could take hours.
Once you see it that way, you understand why cryptographers get nervous.
The Quantum Computing Timeline for Web Developers: The Real State of Things in 2025
First important clarification: I'm not talking about something that happens tomorrow. The quantum computing that exists today is noisy, unstable, and doesn't scale well. IBM's and Google's machines have tens or hundreds of "real" qubits, but with error rates that make most cryptographically relevant algorithms impossible to run.
To break RSA-2048 — the standard protecting a huge chunk of HTTPS today — you'd need approximately 4,000 stable logical qubits. Logical qubits are different from physical ones; you need a lot of physical qubits to make one reliable logical qubit because of error correction overhead.
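To get a feel for why that's hard, here's a back-of-the-envelope sketch. The 1,000x overhead per logical qubit is an illustrative assumption; published estimates vary wildly with hardware error rates and the error-correction code used.

```typescript
// Back-of-the-envelope: how many physical qubits for N logical ones?
// The 1,000x overhead is an illustrative assumption, not a measured figure.
const physicalQubitsNeeded = (
  logicalQubits: number,
  overheadPerLogical = 1_000,
): number => logicalQubits * overheadPerLogical

// ~4,000 logical qubits for RSA-2048 implies millions of physical qubits
console.log(physicalQubitsNeeded(4_000)) // 4000000
```

Today's largest machines have on the order of hundreds or low thousands of physical qubits, which is why the gap still looks so large.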
Right now, depending on who you ask, we're somewhere between 10 and 20 years away from machines with that kind of capability. Some say 15 years. Others say we'll never get there. Others say nation-state actors are already doing things we can't see.
That range of uncertainty is exactly the problem.
What That HN Post Made Me Understand
The thread I mentioned revolved around something called "harvest now, decrypt later" (HNDL). The idea: even if you can't break the encryption today, you can intercept encrypted traffic right now and store it for when you eventually have the quantum capacity to decrypt it in the future.
That changes the equation dramatically. If someone is capturing your 2025 HTTPS traffic to decrypt it in 2035, the "10 to 20 years" timeline becomes right now.
Who does that? Primarily nation-state actors. Does that matter to my Next.js recipe API? Probably not. Does it matter to a healthcare system, defense communications, or long-term financial transactions? Absolutely yes.
So the first practical answer is: it depends on what you're building.
What You Should Actually Know About Post-Quantum Cryptography as a Full-Stack Dev
Here's what I learned trying to understand the thread without a PhD in physics:
NIST Already Made Its Decisions
In 2024, NIST (National Institute of Standards and Technology) finalized the first post-quantum cryptography standards. The algorithms that made the cut are:
- ML-KEM (formerly CRYSTALS-Kyber): for key exchange
- ML-DSA (formerly CRYSTALS-Dilithium): for digital signatures
- SLH-DSA (formerly SPHINCS+): for digital signatures
This matters because the standardization work is done. We're not waiting for mathematicians to agree — they already did.
TLS 1.3 Is Already Getting Ready
Chrome, Firefox, and some servers are already experimenting with hybrid key exchange — combining the classical algorithm (X25519) with a post-quantum one (ML-KEM) in the same handshake. If one fails, the other keeps working. If the quantum algorithm turns out to have undiscovered vulnerabilities, the classical one has your back.
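Conceptually, the hybrid approach boils down to deriving the session key from both shared secrets at once. A minimal sketch with Node's built-in crypto follows; note that Node has no ML-KEM primitive I'd rely on yet, so random bytes stand in for the KEM's shared secret here.

```typescript
import { generateKeyPairSync, diffieHellman, hkdfSync, randomBytes } from 'crypto'

// Classical half: a real X25519 exchange
const alice = generateKeyPairSync('x25519')
const bob = generateKeyPairSync('x25519')
const classicalSecret = diffieHellman({
  privateKey: alice.privateKey,
  publicKey: bob.publicKey,
})

// Post-quantum half: 32 random bytes STAND IN for the ML-KEM
// decapsulated shared secret, which Node doesn't ship yet
const pqSecret = randomBytes(32)

// Hybrid idea: derive the session key from BOTH secrets at once.
// An attacker has to break both exchanges, not just one.
const sessionKey = Buffer.from(
  hkdfSync('sha256', Buffer.concat([classicalSecret, pqSecret]), Buffer.alloc(0), 'hybrid-demo', 32)
)
console.log(sessionKey.length) // 32
```

The real TLS key schedule is more involved than a single HKDF call, but the "concatenate both secrets, then derive" shape is the core of the hybrid designs.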
As a web dev, this will probably reach you transparently via updates to OpenSSL, nginx, or Node.js. You don't have to do anything... yet.
What You DO Need to Think About Actively
```typescript
// This is what MANY projects do today
// and what could be problematic on a post-quantum horizon
import { sign } from 'jsonwebtoken' // assuming the jsonwebtoken package
import { publicEncrypt, createHash, createCipheriv } from 'crypto'

declare const payload: object
declare const rsaPrivateKeyPem: string // RS256 needs an RSA private key, not a shared secret
declare const rsaPublicKey: string
declare const data: Buffer
declare const key: Buffer // 32 bytes for AES-256
declare const iv: Buffer

// ❌ Asymmetric algorithms that will eventually be vulnerable (Shor)
const jwt = sign(payload, rsaPrivateKeyPem, { algorithm: 'RS256' }) // RSA
const encrypted = publicEncrypt(rsaPublicKey, data) // RSA

// ✅ Symmetric algorithms — these are relatively fine
// AES-256 remains secure in a quantum world
// (Grover's algorithm weakens it but doesn't break it: 256 bits -> ~128 effective bits)
const hash = createHash('sha256').update(data).digest('hex') // OK for now
const cipher = createCipheriv('aes-256-gcm', key, iv) // Still solid

// 🤔 The real question: do your secrets/tokens need to last decades?
// If a JWT expires in 1 hour, post-quantum risk is basically zero.
// If you're signing contracts that need to be valid in 2040,
// that's where you need to think differently.
```
The practical takeaway I got: risk scales with the lifetime of what you're signing or encrypting. A 15-minute access token and a legal document signing certificate carry completely different risks.
The Most Common Framing Errors When Reading About Quantum Computing
Here's what I think most popular posts on this topic get wrong — including probably this one:
Error 1: Confusing "quantum advantage" with "quantum supremacy" with "cryptographically relevant"
When Google or IBM announce a quantum milestone, the media frames it as "they can now break encryption." Almost never. Quantum advantage means they solved some specific problem faster than a classical computer. That specific problem is usually contrived and designed to make the quantum computer look good.
Cryptographically relevant quantum computing — the kind that actually matters — is a much higher bar.
Error 2: Thinking bcrypt or Argon2 Are Dead
Password hashing like bcrypt, scrypt, or Argon2 uses symmetric hash functions. Grover's algorithm (the one that applies quantum computing to search) effectively cuts their security in half — but Argon2 with modern parameters has plenty of margin. You don't need to change your authentication system right now.
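The "cut in half" intuition is simple enough to write down. This is a toy model that ignores the enormous practical constant factors of actually running Grover at scale:

```typescript
// Toy model of Grover's quadratic speedup: searching 2^n keys takes
// on the order of 2^(n/2) quantum operations, so effective strength halves.
const effectiveBitsVsGrover = (classicalBits: number): number => classicalBits / 2

console.log(effectiveBitsVsGrover(128)) // 64, uncomfortable for long-lived secrets
console.log(effectiveBitsVsGrover(256)) // 128, still a comfortable margin
```

This is also why "prefer 256-bit symmetric keys" keeps coming up as the cheap, boring mitigation.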
Error 3: Ignoring It Completely Because "It's Far Away"
This is the error I find most dangerous for devs building long-term infrastructure. If you're building something that will handle sensitive data for decades, the timeline matters.
Think about everything I built when I pivoted to software development in 2020: that infrastructure could still be running in production in 2035. When you pick an authentication or encryption stack today, you're choosing for that time horizon too.
Error 4: Thinking This Is Only Ops' or the Sysadmin's Problem
In the 2025 ecosystem I described when I put together my stack for juanchi.dev, full-stack developers make architecture decisions that used to belong to ops. That includes which crypto library you use, how you sign tokens, what kind of certificates you request.
You can't fully delegate this.
Code: What to Audit in Your Project Today
```typescript
// Quick post-quantum attack surface audit
// Check for these patterns in your codebase

// 1. ASYMMETRIC ALGORITHMS — the most vulnerable
// Search for: RSA, ECDH, ECDSA, DH. Where do they show up?
import { generateKeyPairSync, randomBytes, createCipheriv } from 'crypto'

// ❓ RSA — vulnerable to Shor's algorithm
// If this data needs to be valid past 2035, think hard about it
const { privateKey, publicKey } = generateKeyPairSync('rsa', {
  modulusLength: 2048, // This eventually won't be enough
})

// ❓ ECDSA — also vulnerable, though more efficient today
const ecKey = generateKeyPairSync('ec', {
  namedCurve: 'prime256v1',
})

// 2. SYMMETRIC ALGORITHMS — relatively OK
// AES-256, ChaCha20-Poly1305, SHA-256/384/512
// Grover's algorithm halves their effective strength,
// but with 256 bits you still have plenty of headroom

const encryptLocalData = (data: Buffer, key: Buffer): Buffer => {
  // AES-256-GCM — this remains secure post-quantum
  const iv = randomBytes(12) // 12 bytes is the recommended GCM nonce size
  const cipher = createCipheriv('aes-256-gcm', key, iv)
  return Buffer.concat([
    iv,
    cipher.update(data),
    cipher.final(),
    cipher.getAuthTag(), // The auth tag matters: verify it on decryption
  ])
}

// 3. JWT — the most common practical case
// Short answer: if it expires in hours, don't sweat it
// If you're signing something permanent with JWT... question the design
const evaluateJWTRisk = (expiresInSeconds: number): string => {
  const years = expiresInSeconds / (365 * 24 * 3600)
  if (years < 1) return 'Post-quantum risk is practically zero'
  if (years < 5) return 'Low risk, monitor the timeline'
  if (years < 15) return 'Moderate risk, consider migration'
  return 'High risk — redesign this component'
}

console.log(evaluateJWTRisk(3600)) // "Post-quantum risk is practically zero"
console.log(evaluateJWTRisk(10 * 365 * 24 * 3600)) // "Moderate risk, consider migration"
```
The evaluateJWTRisk function is an obvious simplification, but it captures the central point: the lifetime of what you're signing is the most important variable.
Including explicit types for security context in the TypeScript patterns you use in production — how long a token lives, what sensitivity level the data carries — is exactly the kind of design that helps when you have to audit this stuff down the road.
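A sketch of what explicit security-context types could look like. Every name here is hypothetical, invented for illustration, not from any library:

```typescript
// Hypothetical types: names are illustrative, not a standard
type Sensitivity = 'public' | 'internal' | 'long-lived-sensitive'

interface SecurityContext {
  sensitivity: Sensitivity
  maxLifetimeSeconds: number // how long must this stay valid or confidential?
}

const sessionToken: SecurityContext = {
  sensitivity: 'internal',
  maxLifetimeSeconds: 3600, // 1 hour: post-quantum risk ~zero
}

const signedContract: SecurityContext = {
  sensitivity: 'long-lived-sensitive',
  maxLifetimeSeconds: 15 * 365 * 24 * 3600, // 15 years: flag for review
}

// An audit can now iterate over declared contexts instead of grepping
const needsPQReview = (ctx: SecurityContext): boolean =>
  ctx.maxLifetimeSeconds > 5 * 365 * 24 * 3600

console.log(needsPQReview(sessionToken)) // false
console.log(needsPQReview(signedContract)) // true
```

The point isn't the 5-year threshold, which is arbitrary here; it's that lifetime becomes a declared, searchable property instead of tribal knowledge.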
What to Actually Do Today (Without Panicking)
This is my personal, honest list, no risk inflation:
Right now (regardless of project type):
- Use TLS 1.3 — it already implements security improvements and will receive post-quantum updates
- Prefer AES-256 over AES-128 for symmetric encryption
- Keep your dependencies updated — the post-quantum migration will arrive via library updates
- Don't roll your own crypto — seriously, never, quantum or not
In the next 1-2 years (if you handle sensitive long-lived data):
- Inventory what in your system uses asymmetric cryptography and how long that data needs to live
- Start reading about the libraries that will adopt NIST algorithms — Open Quantum Safe already has implementations
- Consider designing for "crypto-agility": making it so your system can swap algorithms without a full rewrite
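Crypto-agility can be as simple as hiding the algorithm behind an interface. Here's a sketch using Ed25519 via Node's built-ins, with a registry entry reserved for a future post-quantum signer; the interface and names are my own invention, not a standard:

```typescript
import { generateKeyPairSync, sign, verify } from 'crypto'

// Hypothetical interface: callers depend on it, not on the algorithm
interface Signer {
  algorithm: string
  sign(data: Buffer): Buffer
  verify(data: Buffer, signature: Buffer): boolean
}

const makeEd25519Signer = (): Signer => {
  const { privateKey, publicKey } = generateKeyPairSync('ed25519')
  return {
    algorithm: 'ed25519',
    sign: (data) => sign(null, data, privateKey),
    verify: (data, signature) => verify(null, data, publicKey, signature),
  }
}

// Registry keyed by algorithm name: this is the swap point
const signers: Record<string, () => Signer> = {
  ed25519: makeEd25519Signer,
  // 'ml-dsa': makeMlDsaSigner, // add when your libraries ship it
}

const signer = signers['ed25519']()
const msg = Buffer.from('rotate me without a rewrite')
const sig = signer.sign(msg)
console.log(signer.verify(msg, sig)) // true
```

Swapping algorithms later means adding one registry entry and re-signing, not hunting RSA calls across the codebase.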
For the stack I'd choose in 2025:
Today I'd choose actively maintained libraries from organizations that already have documented post-quantum migration plans. Node.js and OpenSSL are on that path. It's one more criterion to add to the evaluation.
FAQ: Quantum Computing and Web Development
Will HTTPS stop being secure because of quantum computing?
Not in the short term, and probably not all at once. TLS is already being updated with post-quantum algorithms (hybrid key exchange in TLS 1.3). The browser you're using today is already receiving these updates gradually. What IS a real threat is the "harvest now, decrypt later" attack for highly sensitive data — but that applies to nation-state actors, not average web traffic.
Do I need to change my app's login/password system?
Not urgently. bcrypt, Argon2, and scrypt use symmetric hash functions that are far more resistant to quantum computing than asymmetric algorithms. Argon2id with modern parameters has enough security margin. The recommendation is to stick with current best practices and stay up to date.
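For reference, a hedged sketch of password hashing with Node's built-in scrypt. The cost parameters are illustrative only; tune N upward (raising maxmem accordingly) based on your hardware and current guidance:

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from 'crypto'

// Illustrative cost parameters: N=2^14, r=8, p=1 fits Node's default
// maxmem; production guidance generally suggests higher N with maxmem raised
const SCRYPT_PARAMS = { N: 2 ** 14, r: 8, p: 1 }

const hashPassword = (password: string): { salt: Buffer; hash: Buffer } => {
  const salt = randomBytes(16) // unique per password
  const hash = scryptSync(password, salt, 64, SCRYPT_PARAMS)
  return { salt, hash }
}

const verifyPassword = (password: string, salt: Buffer, expected: Buffer): boolean =>
  // constant-time comparison to avoid leaking where the mismatch happens
  timingSafeEqual(scryptSync(password, salt, 64, SCRYPT_PARAMS), expected)

const { salt, hash } = hashPassword('hunter2')
console.log(verifyPassword('hunter2', salt, hash)) // true
```

The same "memory-hard function plus per-user salt" shape applies to Argon2id via a library; nothing about it needs to change for quantum reasons.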
Are JWTs becoming obsolete?
Depends on how you use them. If you're signing access tokens that expire in 15 minutes or an hour, the risk is practically zero — by the time relevant quantum computing exists, those tokens have been expired for years. If you're using JWTs to sign something that needs to be valid for years (documents, contracts), that's where you need to start thinking about alternatives.
What's "post-quantum cryptography" and how is it different from "quantum cryptography"?
Post-quantum cryptography (PQC) consists of classical algorithms — running on normal computers — designed to be resistant to attacks from quantum computers. Quantum cryptography (QKD, quantum key distribution) uses quantum principles for the communication itself. As a web dev, what you care about is PQC — it's what you'll actually implement in your stack. QKD requires specialized hardware and is an entirely different field.
When should I start using post-quantum libraries in production?
For most web applications: when they arrive via updates to your existing dependencies, which is probably what's going to happen anyway. Node.js, OpenSSL, and TLS providers will implement the NIST standards gradually. If you handle critical long-lived data, it's worth exploring Open Quantum Safe today and starting to test. For everyone else, stay updated and don't panic.
Does quantum computing affect blockchain and crypto too?
Yes, significantly. Cryptocurrencies use ECDSA to sign transactions, which is vulnerable to Shor's algorithm. Bitcoin and Ethereum would have to migrate their signing schemes before quantum computing becomes cryptographically relevant. It's one of the most active debates in those communities. Fun fact: wallets that have spent funds are more exposed than ones that haven't, because spending reveals the public key on-chain; until then, typically only a hash of it is visible.
On Honest Ignorance as a Valid Position
When I started writing this post I wasn't sure what I'd conclude. I still don't know if in 10 years we'll be re-encrypting the entire internet or if quantum computing keeps being a promise that never quite arrives.
What I did come away with:
The timeline matters more than the topic itself. It's not a binary "worry or don't worry." It's a function of your data's lifetime and how sensitive it is.
The industry is already moving. NIST finalized standards. TLS is already experimenting with hybrid algorithms. You won't have to do everything by hand — it'll come through the ecosystem.
Crypto-agility is the best investment. Designing your systems so they can swap algorithms without a total rewrite is good practice with or without quantum computing. The history of cryptography is the history of algorithms getting broken and replaced.
For most projects: stay updated and don't panic. If your app handles user credentials that expire, normal session tokens, and data that doesn't need to be valid for decades — the risk today is low. Apply current best practices.
What I still have to do is keep reading. The HN thread that triggered all this had responses from people with decades of cryptography experience who couldn't agree. That tells me epistemic humility is the right posture here.
If you're the kind of developer who, like me, comes from diagnosing networks at 11pm in a dingy server room or from taking down a production server with rm -rf in your first week on the job — you know that the best preparation for big problems isn't panicking when they show up. It's building systems that can adapt.
That applies here too.