Alex Spinov
The Fastest Way to Parse JSON in Every Language (Benchmark)

I benchmarked JSON parsing across 5 languages

JSON parsing is something every developer does daily. But the performance difference between languages — and between libraries in the same language — is massive.


Test: Parse a 10MB JSON file

| Language | Library            | Time   | Memory |
|----------|--------------------|--------|--------|
| C        | simdjson           | 12 ms  | 11 MB  |
| Rust     | serde_json         | 45 ms  | 24 MB  |
| Go       | encoding/json      | 180 ms | 85 MB  |
| Python   | orjson             | 95 ms  | 35 MB  |
| Python   | json (stdlib)      | 450 ms | 120 MB |
| Python   | ujson              | 210 ms | 65 MB  |
| Node.js  | JSON.parse         | 120 ms | 90 MB  |
| Node.js  | simdjson-js        | 55 ms  | 30 MB  |
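The 10MB test file itself isn't included in the post; if you want to reproduce the numbers, here's a hedged sketch of how a fixture of roughly that size could be generated (the record shape and file name are my own, not the original benchmark's):

```python
import json
import os

def make_fixture(path, target_bytes=10 * 1024 * 1024):
    """Write a JSON array of repeated records until roughly target_bytes."""
    record = {"id": 0, "name": "user", "tags": ["a", "b"], "score": 0.5}
    # Each record serializes to ~60 bytes; repeat enough copies to pass the target.
    n = target_bytes // len(json.dumps(record))
    records = [dict(record, id=i) for i in range(n)]
    with open(path, "w") as f:
        json.dump(records, f)
    return os.path.getsize(path)

size = make_fixture("large.json", target_bytes=64 * 1024)  # small demo run
```

Bump `target_bytes` back to 10MB for a run comparable to the table above.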

Surprises

  1. Python's orjson is 4.7x faster than stdlib json. If you parse large JSON in Python, pip install orjson is the single biggest performance win.

  2. simdjson is alien technology. It uses SIMD CPU instructions to parse JSON. 12ms for 10MB. That's parsing at 830MB/s.

  3. Go's stdlib is surprisingly slow. It's correct and safe, but for large payloads, consider a third-party parser like jsoniter or the experimental encoding/json/v2.
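The 830MB/s figure in surprise #2 follows directly from the table; a quick sanity check:

```python
# Sanity-check the throughput claim: 10 MB parsed in 12 ms.
size_mb = 10
time_s = 0.012
throughput = size_mb / time_s  # MB per second
print(f"{throughput:.0f} MB/s")  # ≈ 833 MB/s
```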


Python: orjson vs json

```python
import json
import time

import orjson  # pip install orjson

with open('large.json', 'rb') as f:
    data = f.read()

# stdlib (perf_counter is the right clock for benchmarking)
start = time.perf_counter()
json.loads(data)
print(f'json: {time.perf_counter() - start:.3f}s')

# orjson
start = time.perf_counter()
orjson.loads(data)
print(f'orjson: {time.perf_counter() - start:.3f}s')
```
```
json: 0.450s
orjson: 0.095s  ← 4.7x faster
```

For parsing, orjson.loads is a drop-in replacement for json.loads: same call, dramatically faster.
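One caveat to "drop-in": orjson.dumps returns bytes rather than str. A minimal compatibility shim (my own sketch, falling back to stdlib json when orjson isn't installed) keeps call sites uniform:

```python
import json

try:
    import orjson  # pip install orjson

    def loads(data):
        # orjson.loads accepts bytes or str, like json.loads
        return orjson.loads(data)

    def dumps(obj):
        # orjson.dumps already returns bytes
        return orjson.dumps(obj)
except ImportError:
    def loads(data):
        return json.loads(data)

    def dumps(obj):
        # encode so both backends return bytes
        return json.dumps(obj).encode()

payload = dumps({"ok": True, "n": 3})
assert isinstance(payload, bytes)
assert loads(payload) == {"ok": True, "n": 3}
```

Normalizing on bytes means the rest of your code never cares which backend is active.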


Node.js: simdjson binding

```javascript
const { lazyParse } = require('simdjson'); // npm install simdjson
const fs = require('fs');

// lazyParse expects a string, so read with an encoding
const data = fs.readFileSync('large.json', 'utf8');

console.time('JSON.parse');
JSON.parse(data);
console.timeEnd('JSON.parse');

console.time('simdjson');
lazyParse(data);
console.timeEnd('simdjson');
```

When to optimize JSON parsing

  • Processing >1MB JSON responses → switch to orjson/simdjson
  • Parsing thousands of small JSON objects → use streaming (ijson for Python)
  • Building a data pipeline → orjson is mandatory
  • Parsing <100KB → stdlib is fine, don't optimize
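For the "thousands of small JSON objects" case, you don't always need ijson: if the data is newline-delimited (NDJSON), line-by-line parsing with stdlib json keeps memory flat. A minimal sketch (file name and record shape are hypothetical):

```python
import json
from typing import Any, Iterator

def iter_ndjson(path: str) -> Iterator[Any]:
    """Yield one parsed object per line; never holds the whole file in memory."""
    with open(path, "rb") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                yield json.loads(line)

# Demo with a small hypothetical file
with open("events.ndjson", "w") as f:
    f.write('{"event": "click", "x": 1}\n{"event": "scroll", "x": 2}\n')

events = list(iter_ndjson("events.ndjson"))
```

ijson is still the tool when you have one giant JSON document (a single huge array or object) rather than many small ones.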

My setup

For my 77 web scrapers, I use orjson everywhere. When you're parsing thousands of API responses per run, the 4.7x speedup compounds.

What's your go-to JSON library? Have you benchmarked it?


Follow for more benchmarks and Python performance tips.
