JSON.parse() gives you an error at "position 42". Great. Position 42 of a 500-line config file. Let me just count characters manually, I guess.
I hit this so often that at some point I stopped counting and built a JSON Linter that shows errors with actual line numbers. But more on that in a second. First, let's talk about why JSON keeps breaking in the same ways.
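For the record, mapping "position 42" to a line and column is all a linter has to do to make that error useful. A quick sketch in Python (whose json module already reports line and column natively, unlike JSON.parse) — the `line_col` helper is mine, not from any library:

```python
import json

def line_col(text: str, pos: int):
    """Convert a 0-based character offset into 1-based (line, column)."""
    line = text.count("\n", 0, pos) + 1
    col = pos - (text.rfind("\n", 0, pos) + 1) + 1
    return line, col

doc = '{\n  "name": "Alice",\n  "age": oops\n}'
try:
    json.loads(doc)
except json.JSONDecodeError as e:
    # Python hands you e.lineno / e.colno directly; the helper shows
    # what that mapping looks like for a raw offset like JS's "position 42"
    assert line_col(doc, e.pos) == (e.lineno, e.colno)
    print(e.lineno, e.colno)  # prints: 3 10
```

Two string scans per error. That's the entire gap between a useless error message and a useful one.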
It's Always the Same Five Things
Trailing commas. You delete the last field, forget the comma on the one above it. Or you copy from JavaScript, where trailing commas are fine. JSON disagrees.
```json
{
  "name": "Alice",
  "age": 32,
}
```
Single quotes. A Python habit. Or an "I'll just type this real quick" habit. JSON wants double quotes, always.
Comments. JSON has no comments. None. This is honestly a spec failure. tsconfig.json and VS Code settings use JSONC precisely because people need comments in config files. But standard JSON? Nope.
```json
{
  // this breaks everything
  "url": "https://api.example.com"
}
```
Unquoted keys. Valid JavaScript, invalid JSON. { name: "Alice" } fails.
Missing brackets. Usually from truncated logs or half-pasted API responses.
These five cover maybe 90% of all JSON parse errors I've ever seen.
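To give a feel for what a repair pass does, here's a naive sketch of the first four fixes using regexes. This is nothing like the real jsonrepair, which properly tokenizes the input (regexes will mangle comments and apostrophes inside strings); `naive_repair` is a made-up name for illustration:

```python
import json
import re

def naive_repair(text: str) -> str:
    # 1. Strip full-line // comments (naive: would also eat "//" inside strings)
    text = re.sub(r'^\s*//.*$', '', text, flags=re.MULTILINE)
    # 2. Quote bare keys:  name:  ->  "name":
    text = re.sub(r'([{,]\s*)([A-Za-z_][A-Za-z0-9_]*)\s*:', r'\1"\2":', text)
    # 3. Convert single-quoted strings to double-quoted (naive: breaks on apostrophes)
    text = re.sub(r"'([^']*)'", r'"\1"', text)
    # 4. Drop trailing commas before a closing bracket or brace
    text = re.sub(r',\s*([}\]])', r'\1', text)
    return text

broken = """{
  // user record
  name: 'Alice',
  "age": 32,
}"""
print(json.loads(naive_repair(broken)))  # {'name': 'Alice', 'age': 32}
```

The fifth error, missing brackets, is the one regexes can't touch — you need an actual parser that tracks nesting depth, which is exactly why a real repair library exists.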
So I Built a Fixer
The JSON Linter and Fixer does two things:
Paste broken JSON, hit "Fix", and it auto-repairs trailing commas, single quotes, comments, and unquoted keys. It uses jsonrepair under the hood. It won't magically reconstruct corrupted data, but if the JSON is almost right (and it usually is), it'll clean it up.
Hit "Lint" and you get properly formatted output with syntax highlighting. Useful even when the JSON isn't broken, just minified into an unreadable blob.
```json
{"users":[{"name":"Alice","age":32,"addresses":[{"city":"NYC","zip":"10001"}]},{"name":"Bob","age":25,"addresses":[{"city":"LA","zip":"90001"}]}]}
```
Nobody can read that. One click and it's formatted.
Everything runs client-side. No backend, nothing uploaded anywhere.
The Other Half: jq Without Installing Anything
Once the JSON is valid, you usually need to pull something out of it. "Give me all the user names." "Which items have status failed?" "Group these by category."
You could write a quick script. Or you could use jq, which does this kind of thing in one line. Except now you need to install jq, and you're on a machine where you can't, or you don't feel like it.
So the same site has a jq playground. Paste JSON, write a filter, see the result. Browser-only, same as the linter.
Some examples that come up all the time, given this data:
```json
[
  { "name": "Alice", "age": 32, "role": "admin", "active": true },
  { "name": "Bob", "age": 25, "role": "user", "active": false },
  { "name": "Carol", "age": 28, "role": "admin", "active": true }
]
```
Pull all names:
```jq
map(.name)
```
Filter active admins over 30:
```jq
.[] | select(.role == "admin" and .age > 30)
```
Reshape into a different format:
```jq
map({ id: .name, isAdmin: (.role == "admin") })
```
Group by role and count:
```jq
group_by(.role) | map({ role: .[0].role, count: length })
```
That last one gives you [{"role": "admin", "count": 2}, {"role": "user", "count": 1}]. The kind of thing you'd normally open Python for.
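And to make that comparison concrete, here's what the same group-and-count looks like in plain Python against the sample array above — everything the one-line jq filter replaces:

```python
import json
from collections import Counter

users = json.loads("""[
  { "name": "Alice", "age": 32, "role": "admin", "active": true },
  { "name": "Bob", "age": 25, "role": "user", "active": false },
  { "name": "Carol", "age": 28, "role": "admin", "active": true }
]""")

# Count users per role, then reshape to match jq's group_by output
counts = Counter(u["role"] for u in users)
result = [{"role": role, "count": n} for role, n in sorted(counts.items())]
print(result)
# [{'role': 'admin', 'count': 2}, {'role': 'user', 'count': 1}]
```

(`sorted` mirrors jq's `group_by`, which sorts by the grouping key before grouping.)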
Other one-liners I use constantly:
```jq
[.[] | .category] | unique            # unique values from an array
max_by(.price)                        # most expensive item
[.users[].addresses[].city] | unique  # flatten and dedupe nested arrays
```
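That last filter, run against the minified blob from earlier, is another good jq-versus-Python comparison — a sketch of the Python side:

```python
import json

data = json.loads(
    '{"users":[{"name":"Alice","age":32,"addresses":[{"city":"NYC","zip":"10001"}]},'
    '{"name":"Bob","age":25,"addresses":[{"city":"LA","zip":"90001"}]}]}'
)

# jq: [.users[].addresses[].city] | unique
# A set comprehension dedupes; sorted() matches unique's sorted output
cities = sorted({addr["city"] for user in data["users"] for addr in user["addresses"]})
print(cities)  # ['LA', 'NYC']
```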
You get the idea. Try it with your own data.
The Workflow
I keep the linter bookmarked. API returns garbage, paste, fix, lint. Need to dig into the response, switch to the jq tab. It handles maybe 80% of my "what's in this JSON?" moments without opening a terminal.
The whole thing is open source on GitHub if you want to look at the code or contribute.
Part of echoValue. Free, no signups, no tracking.