I Wrote a JWT Verifier in 150 Lines. You Probably Shouldn't — But Here's What I Learned.
A zero-dependency TypeScript CLI that decodes and verifies JWTs using only Node's built-in crypto module. HS256/384/512, RS256/384/512, ES256/384, EdDSA. 46 tests. The whole verifier is ~150 lines. The article is what I learned by refusing to reach for jose.
📦 GitHub: https://github.com/sen-ltd/jwt-inspect
The recurring dev chore
If you work on anything that talks to a backend, you look at JWTs constantly. They come out of localStorage in the browser, out of Authorization: Bearer … headers in network tabs, out of .env files during local development, out of curl responses when you're debugging a third-party API. The standard answer to "what's in this token?" is https://jwt.io. It's fine. It's also a website, which means:
- Opening a new tab and tabbing away from your terminal
- Pasting a production-ish credential into a form field served by someone else's JavaScript
- Waiting for the page to paint
- Clicking into the payload box to read a date
- Not being able to pipe it into jq or diff it against another token
A CLI fixes every one of those. You type jwt-inspect $TOKEN, you get colorized header + payload + computed expiry, and you can hand the same tool --verify --secret "$SECRET" to prove the signature matches. That's the whole pitch.
The secondary pitch is that I didn't want to depend on jsonwebtoken or jose. Not because those are bad — they're fine; jose is excellent — but because I wanted to know what was inside them, and "read the source" is always less effective for me than "rewrite it small enough to fit in your head." This article is the rewrite. It is ~150 lines of real, tested, production-quality-but-not-production-scoped TypeScript, and by the end of it I understood JOSE signing in a way I absolutely did not by reading specs.
The design in one paragraph
jwt-inspect <token> splits on ., base64url-decodes each segment, pretty-prints the header and payload as colorized JSON, computes exp/iat/nbf deltas against the wall clock, and exits. If you pass --verify --secret s (or --public-key file.pem) it runs the signing input — the string header.payload exactly as it appears in the token — through Node's crypto module and compares the result to the signature segment. The algorithm dispatch is a dictionary. The alg: none case is a hard throw. There is no allowlist confusion, no JWK rotation, no JWE, no clock skew tolerance, no hooks for your custom curve. This is on purpose. The project is a teaching example and a daily tool; for production, use jose.
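The exp/iat/nbf delta computation is plain arithmetic on NumericDate claims (seconds since the Unix epoch). A minimal sketch of that step, using a hypothetical helper name (describeTiming is not the project's actual API):

```typescript
// Sketch, not the project's source: turn the registered time claims into
// human-readable deltas against the wall clock.
type TimeClaims = { exp?: number; iat?: number; nbf?: number };

function describeTiming(claims: TimeClaims, nowMs = Date.now()): string[] {
  const now = Math.floor(nowMs / 1000); // NumericDate is in seconds
  const lines: string[] = [];
  if (claims.iat !== undefined) {
    lines.push(`iat: issued ${now - claims.iat}s ago`);
  }
  if (claims.nbf !== undefined) {
    lines.push(
      claims.nbf > now
        ? `nbf: not valid for another ${claims.nbf - now}s`
        : `nbf: valid (became active ${now - claims.nbf}s ago)`,
    );
  }
  if (claims.exp !== undefined) {
    lines.push(
      claims.exp > now
        ? `exp: expires in ${claims.exp - now}s`
        : `exp: EXPIRED ${now - claims.exp}s ago`,
    );
  }
  return lines;
}
```

Note there is deliberately no skew tolerance here: the tool reports the raw delta and leaves the judgment call to you.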
base64url, from scratch
The first surprise when you decode JWTs by hand is that the segments are not base64. They're base64url (RFC 7515 §2, Appendix C), which differs in three specific ways:
- `+` becomes `-`
- `/` becomes `_`
- trailing `=` padding is stripped
Node's Buffer didn't ship a built-in 'base64url' encoding until Node 15.7 (so Node 16 is the first LTS release that has it), and even on modern Node I wanted to write it out to see it:
// src/base64url.ts
export function base64urlEncode(input: Uint8Array | string): string {
  const buf =
    typeof input === 'string' ? Buffer.from(input, 'utf8') : Buffer.from(input);
  return buf
    .toString('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

export function base64urlDecode(input: string): Buffer {
  if (!/^[A-Za-z0-9_-]*$/.test(input)) {
    throw new Error('invalid base64url: contains non-base64url characters');
  }
  const pad = input.length % 4;
  if (pad === 1) {
    // 4k+1 encoded characters can never represent a whole number of bytes
    throw new Error('invalid base64url: impossible length');
  }
  const padded = pad === 0 ? input : pad === 2 ? input + '==' : input + '=';
  const b64 = padded.replace(/-/g, '+').replace(/_/g, '/');
  return Buffer.from(b64, 'base64');
}
The subtlety on decode is the padding. Because encoders strip =, a valid input length is one of 4k, 4k+2, or 4k+3 — never 4k+1, which can't correspond to a whole number of bytes. On decode you add the = back before feeding the string to Buffer.from(…, 'base64'); Node's base64 decoder is actually lenient about missing padding, but restoring it keeps the decode well-defined instead of relying on that leniency. The regex guards against any character outside the base64url alphabet, which catches a common class of "I accidentally included a newline or a space" bugs early.
Vitest round-trip tests for all four byte-count cases live in tests/base64url.test.ts — including the edge case where the bytes are 0xfb 0xff (which hits both the +→- and /→_ replacements).
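A self-contained sketch of that round-trip property, with the codec inlined and cross-checked against Node's built-in 'base64url' Buffer encoding (available since Node 15.7):

```typescript
// Sketch: round-trip base64url over every legal padding case, cross-checked
// against Node's own 'base64url' encoder. Codec inlined for self-containment.
function base64urlEncode(input: Uint8Array): string {
  return Buffer.from(input)
    .toString('base64')
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

function base64urlDecode(input: string): Buffer {
  const pad = input.length % 4;
  if (pad === 1) throw new Error('invalid base64url length'); // 4k+1 is impossible
  const padded = pad === 0 ? input : input + '='.repeat(4 - pad);
  return Buffer.from(padded.replace(/-/g, '+').replace(/_/g, '/'), 'base64');
}

// Byte counts 1..4 hit every legal encoded length: 4k+2, 4k+3, 4k, 4k+2.
for (const len of [1, 2, 3, 4]) {
  const bytes = Buffer.from([0xfb, 0xff, 0x00, 0x41].slice(0, len));
  const enc = base64urlEncode(bytes);
  // Matches Node's built-in encoder...
  console.assert(enc === bytes.toString('base64url'));
  // ...and decodes back to the original bytes.
  console.assert(base64urlDecode(enc).equals(bytes));
}
```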
What the signing input actually is
This one caught me. A JWT has three segments joined by dots: H.P.S. I naively assumed the signature S was a signature over the concatenated decoded bytes of the header and payload JSON. It isn't. The signing input is the ASCII string H.P — the base64url-encoded header, a literal dot, the base64url-encoded payload — fed to the signing algorithm as bytes of that string. The signature is then base64url-encoded and becomes S.
This matters because it means you do not re-encode the JSON before verifying. You use the exact bytes you received. If you tried to be clever and re-serialize the header through JSON.stringify after parsing it, you'd corrupt the signing input the moment any encoder reordered keys or changed whitespace, and the signature would never match. The decoder in this project returns both the parsed JSON and the exact original string pair, for this reason:
// src/decoder.ts (excerpt)
return {
header,
payload,
signature: s,
signatureBytes: s === '' ? Buffer.alloc(0) : base64urlDecode(s),
signingInput: `${h}.${p}`, // <-- bytes we hand to the verifier
raw: trimmed,
};
signingInput is the string of base64url-encoded header and payload, exactly as received. Verification uses Buffer.from(signingInput, 'ascii').
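The re-serialization trap is easy to demonstrate: any header containing whitespace (or keys in an unexpected order) survives JSON.parse but not JSON.stringify. A small self-contained illustration:

```typescript
// Sketch: why verification must use the received bytes, not re-encoded JSON.
// A perfectly valid header with whitespace, as some issuers emit:
const receivedHeaderJson = '{"alg": "HS256", "typ": "JWT"}';
const b64url = (s: string) =>
  Buffer.from(s, 'utf8').toString('base64')
    .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

const received = b64url(receivedHeaderJson);

// Round-tripping through JSON.parse/JSON.stringify drops the spaces...
const reSerialized = JSON.stringify(JSON.parse(receivedHeaderJson));
const reEncoded = b64url(reSerialized);

// ...so the signing input no longer matches what was actually signed.
console.log(received === reEncoded); // false: the signature would never verify
```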
alg: none is the only part that really matters
If you remember one thing about JWTs, it should be this: an attacker who can hand your server a token can change the declared alg. They can write {"alg":"none"}, strip the signature, and ship the token. A library that trusts the token header's declaration of its own algorithm — "oh, it says none, I guess I won't verify" — will cheerfully accept it. This actually happened, to real libraries, repeatedly, for years. It's CVE after CVE. It's why the JOSE specs say, over and over, never let the token tell you what algorithm you expected.
So the verifier has two defenses. First, none is refused unconditionally at verify time:
// src/verifier.ts (excerpt)
export function verify(token: DecodedJwt, options: VerifyOptions): boolean {
const alg = token.header.alg;
if (alg === 'none' || alg === 'None' || alg === 'NONE') {
throw new VerifyError(
'alg "none" is refused on verify (classic JWT bypass)',
'alg_none',
);
}
const allowed = options.allowedAlgs;
if (allowed && allowed.length > 0 && !allowed.includes(alg as VerifyAlg)) {
throw new VerifyError(
`alg "${alg}" is not in the allowed list [${allowed.join(', ')}]`,
'alg_not_allowed',
);
}
// ...dispatch on alg family...
}
Second, --alg RS256 on the CLI sets allowedAlgs: ['RS256'], which rejects any token that doesn't honestly claim that algorithm. Why does this matter? Because of the HS/RS alg-confusion attack: the server expects RS256 and holds an RSA public key. The attacker takes that public key, uses it as the HMAC secret, signs a modified token with HS256, and ships the result. If the server runs "whatever alg the header says" through a verifier that routes HS256 to HMAC and happens to use the RSA public key bytes as the HMAC key input, the signature matches. The defense is: pin the expected alg in your code, not in the token. --alg RS256 does exactly that. The vitest suite includes a test that tries the alg-confusion pattern and confirms it gets rejected.
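The mechanics of that attack are easy to sketch outside the project (this is an illustration, not the actual test file): mint an RSA keypair, HMAC-sign a forged token using the public key's PEM text as the secret, and note that an algorithm allowlist rejects it before any crypto runs.

```typescript
// Sketch of the HS/RS confusion attack setup. The "server" holds an RSA
// public key; the attacker HMAC-signs a forged token with that PEM as secret.
import { generateKeyPairSync, createHmac } from 'node:crypto';

const { publicKey } = generateKeyPairSync('rsa', { modulusLength: 2048 });
const pem = publicKey.export({ type: 'spki', format: 'pem' }) as string;

const b64url = (s: string | Buffer) =>
  Buffer.from(s).toString('base64')
    .replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

const forgedInput =
  `${b64url('{"alg":"HS256","typ":"JWT"}')}.${b64url('{"sub":"admin"}')}`;
const forgedSig = createHmac('sha256', pem).update(forgedInput).digest();

// A naive verifier that dispatches on the header's alg and reuses the PEM
// bytes as an HMAC key would accept this. Pinning the alg in code, not in
// the token, kills the attack before any signature check happens:
const allowedAlgs = ['RS256'];
const headerAlg = 'HS256'; // what the forged token claims
console.assert(!allowedAlgs.includes(headerAlg)); // rejected up front
```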
Node's crypto covers 95% of real JWTs
Here's the quietly delightful thing I discovered: you do not need a JWT library to verify JWTs. Node's crypto module already speaks every algorithm in the JOSE suite — it just expects them in slightly different shapes. The dispatch table is tiny:
// src/verifier.ts (excerpt)
const HMAC_NAMES: Record<string, string> = {
HS256: 'sha256', HS384: 'sha384', HS512: 'sha512',
};
const RSA_NAMES: Record<string, string> = {
RS256: 'RSA-SHA256', RS384: 'RSA-SHA384', RS512: 'RSA-SHA512',
};
// HMAC: createHmac, digest, timingSafeEqual
const mac = createHmac(HMAC_NAMES[alg], secret).update(signingInput).digest();
return mac.length === token.signatureBytes.length &&
timingSafeEqual(mac, token.signatureBytes);
// RSA PKCS#1 v1.5: createVerify with the OpenSSL name
const v = createVerify(RSA_NAMES[alg]);
v.update(signingInput); v.end();
return v.verify(key, token.signatureBytes);
// ECDSA: createVerify, but pass dsaEncoding: 'ieee-p1363'
const v = createVerify('SHA256'); // 'SHA384' for ES384
v.update(signingInput); v.end();
return v.verify({ key, dsaEncoding: 'ieee-p1363' }, token.signatureBytes);
// EdDSA: one-shot crypto.verify (imported here as edVerify); the algorithm
// is null because Ed25519 defines its own hashing, and the data must be bytes
return edVerify(null, Buffer.from(signingInput, 'ascii'), key, token.signatureBytes);
Four different shapes, four different Node APIs, and all of it is already there. The one place you have to earn your understanding is ECDSA. OpenSSL (and therefore Node's default createVerify path) expects ECDSA signatures in ASN.1 DER encoding. JOSE encodes them as the raw r || s concatenation — 64 bytes for P-256, 96 bytes for P-384, no ASN.1 wrapper. Early versions of this project did the DER conversion by hand, which was hilariously error-prone. Then I learned about dsaEncoding: 'ieee-p1363', which tells Node "the signature is in JOSE's raw shape, do the conversion yourself." One option, fifty lines of ASN.1 code deleted.
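The option works in both directions. A self-contained sketch of signing and verifying in the raw r || s shape ES256 uses (illustrative, not the project's source):

```typescript
// Sketch: dsaEncoding 'ieee-p1363' on both the sign and verify paths.
import { generateKeyPairSync, createSign, createVerify } from 'node:crypto';

const { publicKey, privateKey } = generateKeyPairSync('ec', {
  namedCurve: 'prime256v1', // P-256, the curve ES256 requires
});

const signingInput = 'eyJhbGciOiJFUzI1NiJ9.eyJzdWIiOiJ4In0'; // any H.P string
const signer = createSign('SHA256');
signer.update(signingInput);
signer.end();

// Ask for JOSE's raw r || s shape: always exactly 64 bytes for P-256,
// instead of the variable-length ASN.1 DER that OpenSSL emits by default.
const sig = signer.sign({ key: privateKey, dsaEncoding: 'ieee-p1363' });
console.assert(sig.length === 64);

const v = createVerify('SHA256');
v.update(signingInput);
v.end();
console.assert(v.verify({ key: publicKey, dsaEncoding: 'ieee-p1363' }, sig));
```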
The HMAC comparison uses crypto.timingSafeEqual, not === or Buffer.equals. This matters: a naive byte-by-byte comparison leaks timing information that an attacker can use to learn the correct MAC one byte at a time. timingSafeEqual runs in constant time and, as a bonus, throws if the lengths differ, which is why the code checks mac.length === token.signatureBytes.length explicitly first — you get false for wrong-length signatures instead of a throw.
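Both behaviors are quick to confirm (a small illustrative snippet, not project code):

```typescript
// Sketch: timingSafeEqual compares equal-length buffers in constant time,
// and throws (rather than returning false) when the lengths differ.
import { timingSafeEqual } from 'node:crypto';

const a = Buffer.from('aaaa');
const b = Buffer.from('aaab');
const short = Buffer.from('aa');

console.log(timingSafeEqual(a, b)); // false, via constant-time compare

let threw = false;
try {
  timingSafeEqual(a, short); // unequal lengths are a hard error
} catch {
  threw = true;
}
console.log(threw); // true, hence the explicit length check in the verifier
```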
The test that convinced me it works
The test I trust most is the one that generates a real RSA keypair, signs a JWT with Node's crypto.createSign, and hands the result to my verifier:
// tests/verifier.test.ts (excerpt)
const { publicKey, privateKey } = generateKeyPairSync('rsa', { modulusLength: 2048 });
it('verifies an RS256 JWT end-to-end', () => {
const header = { alg: 'RS256', typ: 'JWT' };
const payload = { sub: 'rsa-user', iat: 1_700_000_000 };
const h = base64urlEncode(JSON.stringify(header));
const p = base64urlEncode(JSON.stringify(payload));
const signingInput = `${h}.${p}`;
const signer = createSign('RSA-SHA256');
signer.update(signingInput);
signer.end();
const sig = signer.sign(privateKey);
const token = `${signingInput}.${base64urlEncode(sig)}`;
const pem = publicKey.export({ type: 'spki', format: 'pem' }) as string;
expect(verify(decode(token), { publicKey: pem })).toBe(true);
});
No fixtures, no pre-baked tokens, no leaks. Every test run generates a fresh keypair, mints a token, verifies it, and asserts. The test adjacent to it flips a single byte in the signature segment and asserts the verifier returns false. Same pattern for ES256 with P-256 and for EdDSA with Ed25519. If any of those break, something is wrong with either my verifier or Node itself, and both outcomes are worth knowing about.
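The tamper pattern is easy to sketch on its own, using HS256 for brevity (illustrative, not the actual test file):

```typescript
// Sketch: mint an HMAC signature, flip one bit, confirm verification fails.
import { createHmac, timingSafeEqual } from 'node:crypto';

const b64url = (b: Buffer) =>
  b.toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

const secret = 'test-secret';
const signingInput = `${b64url(Buffer.from('{"alg":"HS256","typ":"JWT"}'))}.${
  b64url(Buffer.from('{"sub":"x"}'))}`;
const mac = createHmac('sha256', secret).update(signingInput).digest();

// Same shape as the verifier: length check first, then constant-time compare.
const hmacOk = (sig: Buffer) =>
  mac.length === sig.length && timingSafeEqual(mac, sig);

console.assert(hmacOk(Buffer.from(mac))); // intact signature verifies
const tampered = Buffer.from(mac);
tampered[0] ^= 0x01;                      // flip a single bit
console.assert(!hmacOk(tampered));        // tampered signature fails
```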
Total test count: 46 across five files. They run in about 800 ms locally, 1.1 s inside the Docker builder stage. No network, no database, no sockets.
Tradeoffs
Scope I deliberately did not implement:
- JWE (RFC 7516): encrypted tokens. Out of scope because you can't inspect encrypted things without the key, and at that point you have a different problem.
- JWKS rotation: jose has createRemoteJWKSet, which fetches and caches a JWKS URI, handles key rollover, and deals with kid-based lookup. Real production systems need this. This tool takes one PEM.
- Clock skew tolerance: jose lets you say "accept tokens that expired up to 30 seconds ago, because our servers' clocks drift." This tool reports the raw delta and lets you decide.
- Custom curves / RSA-PSS / RSA-OAEP / A128CBC-HS256: all doable with Node's crypto, all not in my dispatch table. The 95% line was a deliberate cut.
- Clipboard on Linux: requires xclip, xsel, or wl-paste on PATH. macOS pbpaste is preinstalled. Windows uses PowerShell Get-Clipboard. This is the least portable bit of the tool, and it's documented as optional.
If any of those matter to your use case, you need jose, not this.
Try it in 30 seconds
git clone https://github.com/sen-ltd/jwt-inspect.git
cd jwt-inspect
docker build -t jwt-inspect .
# canonical jwt.io example, HS256 with secret "your-256-bit-secret"
TOKEN="eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c"
docker run --rm jwt-inspect "$TOKEN"
docker run --rm jwt-inspect "$TOKEN" --verify --secret "your-256-bit-secret"
docker run --rm jwt-inspect "$TOKEN" --format json | jq .
Then alias it:
alias jwi='docker run --rm jwt-inspect'
jwi $(pbpaste)
Takeaway
The thing I'll remember from this is that the JWT "standard" is actually pretty small. A JWT is three base64url segments. The signing input is the ASCII bytes of header.payload. Node's crypto module already speaks every algorithm you're likely to meet. The only genuinely tricky parts are (a) don't trust the header's self-declaration of its algorithm, and (b) ECDSA signatures come in two different shapes and you need to tell OpenSSL which one. Everything else is dispatching on a short lookup table.
Use jose in production. Write jwt-inspect in an afternoon when you want to stop copy-pasting tokens into a web form, and read the source of jose after. You'll know what to look at.

Top comments (1)
Great writeup — the "signing input is the ASCII string H.P, not the decoded JSON" thing catches everyone the first time. The re-serialization trap is so real; I spent like 2 hours debugging that before I figured out what was happening.
Your point about alg: none is the one I keep coming back to. The root problem is that the JWT carries metadata about how to verify itself, and the verifier is just expected to trust that metadata from an untrusted source. Your --alg RS256 pin is the right fix on the verification side.
I've been working on the opposite side of this problem — a daemon that signs JWTs for AI coding agents (GCP service accounts right now). The design decision we landed on was making the algorithm vault-authoritative: the algorithm gets stored alongside the private key at import time, and there's no parameter to set or override it at signing time. The agent requests a token, the daemon reads the algorithm from its own encrypted storage, signs, exchanges the JWT for an access token at Google's OAuth endpoint, and returns only the access token. The agent never sees the JWT or the private key.
Your observation that Node's crypto covers 95% of real JWTs lines up with what I found on the Rust side — ring's RSA_PKCS1_SHA256 and ECDSA_P256_SHA256_FIXED_SIGNING basically cover the algorithms that GCP actually uses in practice. The dispatch table is just as small lol.
The ECDSA encoding gotcha deserves its own post, honestly — FIXED_SIGNING in ring and dsaEncoding: 'ieee-p1363' in Node are solving the exact same problem (JWS wants raw R||S, OpenSSL wants ASN.1 DER). I spent way too much time debugging that before discovering the right constant. Would have saved a week if I'd known it earlier.
Solid tool for the inspection use case. The no-dependencies constraint is a really good forcing function for actually learning what's going on under the hood.