JSON is everywhere. APIs, config files, databases, logs. Yet developers still waste hours debugging malformed JSON. Missing commas, trailing commas, single quotes instead of double quotes — these tiny errors break everything.
Here's everything you need to know about JSON, from basics to advanced tooling.
JSON Basics Everyone Gets Wrong
Trailing Commas Are Invalid
// WRONG — trailing comma after "blue"
{
"colors": ["red", "green", "blue",]
}
// CORRECT
{
"colors": ["red", "green", "blue"]
}
JavaScript allows trailing commas. JSON does not. This catches everyone at least once.
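The same failure is easy to reproduce outside the browser: Python's parser rejects a trailing comma too, and its error object tells you exactly where. A quick sketch:

```python
import json

# json.loads rejects the trailing comma that JavaScript would accept
try:
    json.loads('{"colors": ["red", "green", "blue",]}')
except json.JSONDecodeError as e:
    print(f"Invalid JSON: {e.msg} at line {e.lineno}, column {e.colno}")
```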
Only Double Quotes
// WRONG — single quotes
{'name': 'John'}
// CORRECT — double quotes only
{"name": "John"}
No Comments
// WRONG — JSON has no comments
{
"port": 3000 // development port
}
// CORRECT — use a separate field or remove
{
"port": 3000,
"_port_note": "development port"
}
If you need comments, use JSONC (JSON with Comments) or JSON5. VS Code supports JSONC for settings files.
Numbers Don't Have Quotes
// WRONG
{"age": "25", "price": "9.99"}
// CORRECT — numbers are unquoted
{"age": 25, "price": 9.99}
This matters for APIs. If your backend expects a number and gets a string, comparisons and arithmetic can fail silently instead of raising an obvious error.
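A quick way to see the difference in Python, where the quoted values come back as strings you have to coerce by hand:

```python
import json

payload = json.loads('{"age": "25", "price": "9.99"}')
assert isinstance(payload["age"], str)          # not a number
total = float(payload["price"]) + 0.01          # manual coercion required

fixed = json.loads('{"age": 25, "price": 9.99}')
assert isinstance(fixed["age"], int)            # arithmetic just works
print(fixed["price"] + 0.01)
```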
Command-Line JSON Tools
jq — The JSON Swiss Army Knife
# Pretty print
echo '{"name":"John","age":30}' | jq '.'
# Extract a field
echo '{"user":{"name":"John","email":"j@test.com"}}' | jq '.user.email'
# Output: "j@test.com"
# Filter arrays
echo '[{"name":"a","active":true},{"name":"b","active":false}]' | \
jq '.[] | select(.active == true)'
# Transform
echo '{"first":"John","last":"Doe"}' | \
jq '{full_name: (.first + " " + .last)}'
python -m json.tool
Already installed on most systems:
echo '{"compact":"json"}' | python -m json.tool
Node.js one-liner
echo '{"a":1}' | node -e "process.stdin.on('data',d=>console.log(JSON.stringify(JSON.parse(d),null,2)))"
Validating JSON
In the Terminal
# jq exits nonzero (code 2) on a parse error
echo '{"broken":}' | jq '.' 2>/dev/null || echo "Invalid JSON"
# Python approach
echo '{"test": true}' | python -c "import sys,json; json.load(sys.stdin); print('Valid')"
In Code
function isValidJSON(str: string): boolean {
try {
JSON.parse(str);
return true;
} catch {
return false;
}
}
With Schema Validation (Zod)
import { z } from 'zod';
const UserSchema = z.object({
name: z.string(),
email: z.string().email(),
age: z.number().min(0).max(150),
});
// Throws descriptive error if invalid
const user = UserSchema.parse(JSON.parse(jsonString));
Zod gives you TypeScript types AND runtime validation from the same schema.
Common JSON Mistakes in APIs
Mistake 1: Inconsistent Naming
// Don't mix conventions
{
"firstName": "John",
"last_name": "Doe",
"Email-Address": "john@test.com"
}
// Pick one and stick with it
{
"first_name": "John",
"last_name": "Doe",
"email_address": "john@test.com"
}
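If you're ingesting a payload that already mixes conventions, a small normalizer can rename keys on the way in. A sketch in Python; to_snake is a hypothetical helper, not a library function:

```python
import re

def to_snake(name: str) -> str:
    """Normalize camelCase, kebab-case, and spaced keys to snake_case."""
    s = re.sub(r"[-\s]+", "_", name)                  # kebab/spaces -> underscore
    s = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", s)     # camelCase boundary
    return s.lower()

raw = {"firstName": "John", "last_name": "Doe", "Email-Address": "john@test.com"}
print({to_snake(k): v for k, v in raw.items()})
```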
Mistake 2: Dates as Random Strings
// Ambiguous — is this March 7 or July 3?
{"date": "3/7/2026"}
// Use ISO 8601 — always
{"date": "2026-03-07T00:00:00Z"}
Mistake 3: Null vs. Missing vs. Empty String
// These all mean different things
{"name": null} // explicitly no value
{"name": ""} // empty value
{} // field doesn't exist
Document which one your API uses. Inconsistency here causes bugs that are painful to debug.
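The distinction is easy to demonstrate in Python, and the sketch also shows why dict.get quietly collapses two of the three cases:

```python
import json

record = json.loads('{"name": null}')

# Three distinct cases a consumer must handle differently
assert "name" in record and record["name"] is None   # explicitly no value
assert json.loads('{"name": ""}')["name"] == ""      # empty value
assert "name" not in json.loads('{}')                # field doesn't exist

# dict.get returns None for both "null" and "missing" -- a classic bug source
assert json.loads('{}').get("name") is None
assert json.loads('{"name": null}').get("name") is None
```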
Extracting JSON from Unstructured Text
Sometimes you have data trapped in plain text — receipts, emails, logs — and you need it as JSON. Regex works for simple patterns but breaks on real-world messy text.
StructureAI does this with one API call:
curl -X POST https://api-service-wine.vercel.app/api/extract \
-H "Content-Type: application/json" \
-H "X-API-Key: YOUR_KEY" \
-d '{
"text": "Meeting with Sarah tomorrow at 3pm in Conference Room B to discuss Q1 budget. Bring the revenue projections.",
"schema": "custom",
"custom_fields": ["participants", "datetime", "location", "topic", "action_items"]
}'
Returns:
{
"participants": ["Sarah"],
"datetime": "tomorrow at 3pm",
"location": "Conference Room B",
"topic": "Q1 budget discussion",
"action_items": ["Bring revenue projections"]
}
No regex. No parsing. Just clean JSON from messy text.
JSON Performance Tips
Streaming Large Files
Don't load a 2GB JSON file into memory:
# Stream with jq
cat huge.json | jq -c '.items[]' | while read -r item; do
echo "$item" | jq '.id'
done
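If you'd rather stream in Python without third-party libraries, json.JSONDecoder.raw_decode parses one value at a time and reports where it stopped. A sketch, assuming the input is concatenated JSON objects rather than one giant array:

```python
import json

def iter_json_values(buf: str):
    """Yield each top-level JSON value from a concatenated stream."""
    dec = json.JSONDecoder()
    idx = 0
    while idx < len(buf):
        obj, end = dec.raw_decode(buf, idx)  # returns (value, end_index)
        yield obj
        idx = end
        while idx < len(buf) and buf[idx].isspace():  # skip separators
            idx += 1

stream = '{"id": 1} {"id": 2}\n{"id": 3}'
print([v["id"] for v in iter_json_values(stream)])  # [1, 2, 3]
```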
Minify for Production
# Remove whitespace
cat config.json | jq -c '.' > config.min.json
NDJSON for Log Processing
Newline-delimited JSON (one object per line) is easier to process than arrays:
{"event":"login","user":"john","ts":"2026-03-07T10:00:00Z"}
{"event":"purchase","user":"john","amount":9.99,"ts":"2026-03-07T10:05:00Z"}
Process with:
cat events.ndjson | jq -c 'select(.event == "purchase") | .amount'
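The same filter in Python is just a loop over lines. The lines are inlined here to keep the sketch self-contained; in practice you'd iterate over open("events.ndjson"):

```python
import json

lines = [
    '{"event":"login","user":"john","ts":"2026-03-07T10:00:00Z"}',
    '{"event":"purchase","user":"john","amount":9.99,"ts":"2026-03-07T10:05:00Z"}',
]

# One json.loads per line -- constant memory, no giant array in RAM
events = (json.loads(line) for line in lines)
amounts = [e["amount"] for e in events if e["event"] == "purchase"]
print(amounts)  # [9.99]
```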
Quick Reference
| Need | Tool/Command |
|---|---|
| Pretty print | jq '.' |
| Validate | jq '.' > /dev/null |
| Extract field | jq '.field' |
| Filter array | jq '.[] \| select(.active)' |
| Minify | jq -c '.' |
| Schema validate | Zod, Joi, Ajv |
| Text to JSON | StructureAI API |
Master these tools and you'll never waste time on JSON formatting again.
Built by Avatrix LLC. Extract structured JSON from any text with StructureAI — $2 for 100 requests.