DEV Community

Khaja Hussain

I Compared TOON vs Minified JSON Using OpenAI’s Tokenizer

Recently I noticed a lot of developers talking about TOON:

TOON GitHub repository

The idea behind TOON is interesting. Instead of sending traditional JSON, TOON tries to reduce token usage and make data more compact for LLMs.

Since token costs are becoming a real concern for AI products, I wanted to test it myself.

Not theoretically.
Not with benchmarks from slides.
Just a simple real-world comparison.

I used OpenAI’s tokenizer tool and compared:

TOON format
Minified JSON

For the conversion process, I used:

JSON to TOON Converter
JSON Minifier

Minified JSON:

```json
{"user":{"id":1001,"name":"Khaja","email":"khaja@example.com","roles":["admin","developer"],"settings":{"theme":"dark","notifications":true}}}
```

TOON version:

```
user:
  id: 1001
  name: Khaja
  email: khaja@example.com
  roles[2]: admin,developer
  settings:
    theme: dark
    notifications: true
```
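Out of curiosity, here is roughly how such a conversion could look in Python. This is a minimal sketch of a TOON-style serializer, not the official converter (the real format also handles tabular arrays, quoting, and escaping):

```python
import json

def to_toon(value, indent=0):
    # Minimal TOON-style rendering: nested keys use indentation, primitive
    # arrays are inlined with a length marker. A sketch, not the full spec.
    pad = "  " * indent
    lines = []
    for key, val in value.items():
        if isinstance(val, dict):
            lines.append(f"{pad}{key}:")
            lines.append(to_toon(val, indent + 1))
        elif isinstance(val, list):
            lines.append(f"{pad}{key}[{len(val)}]: " + ",".join(map(str, val)))
        elif isinstance(val, bool):
            lines.append(f"{pad}{key}: {'true' if val else 'false'}")
        else:
            lines.append(f"{pad}{key}: {val}")
    return "\n".join(lines)

data = json.loads(
    '{"user":{"id":1001,"name":"Khaja","email":"khaja@example.com",'
    '"roles":["admin","developer"],"settings":{"theme":"dark","notifications":true}}}'
)
print(to_toon(data))
```

Running it reproduces the TOON output above from the original JSON.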

TOON: 37 tokens
Minified JSON: 38 tokens


That means TOON saved only 1 token in this example.

Honestly, that surprised me.

So… Is TOON Useful?

I think the answer is: yes, but with nuance.

TOON is not “bad.”
In fact, I actually like the direction.

It makes developers think seriously about:

token efficiency
AI-friendly data formats
prompt optimization
serialization overhead

Those are important conversations.

But after testing it, I’m not convinced that TOON alone will dramatically reduce costs for most companies.

In many cases:

removing whitespace already gives huge savings
gzip/brotli already compress JSON payloads extremely well for transport (though compression doesn't reduce the tokens a model actually sees)
AI models are already heavily trained on JSON structures
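To illustrate the compression point, here's a quick stdlib-only sketch. The payload is made-up illustrative data, not anything from the test above:

```python
import gzip
import json

# Hypothetical payload: 500 similar user records (illustrative data only)
records = [{"id": i, "name": f"user{i}", "roles": ["admin", "developer"]}
           for i in range(500)]
payload = json.dumps(records, separators=(",", ":")).encode()

compressed = gzip.compress(payload)
print(f"raw: {len(payload)} bytes, gzipped: {len(compressed)} bytes")
# Repetitive JSON shrinks dramatically on the wire -- but this helps
# bandwidth, not the token count billed by the model.
```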

So the practical gains may be smaller than the hype suggests.

But Small Savings Can Still Matter at Scale

Here’s the interesting part.

Even tiny optimizations matter when companies process millions of requests.

Imagine:

10 million API calls
large AI prompts
multiple agents
long conversation histories

Saving even 1–2% of tokens at that scale could add up to hundreds or thousands of dollars over time.
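A back-of-envelope calculation shows why. Every number here is an illustrative assumption, including the price per million tokens:

```python
# All figures are hypothetical assumptions, not real usage or pricing
calls = 10_000_000          # API calls
tokens_per_call = 2_000     # prompt tokens per call
savings_rate = 0.015        # a 1.5% token reduction
price_per_million = 2.50    # assumed $ per 1M input tokens

tokens_saved = calls * tokens_per_call * savings_rate
dollars_saved = tokens_saved / 1_000_000 * price_per_million
print(f"{tokens_saved:,.0f} tokens saved ~= ${dollars_saved:,.2f}")
```

With these assumptions the saving lands in the hundreds of dollars; larger prompts or higher prices push it into the thousands.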

So I do understand why people are excited about TOON.

The Bigger Challenge: Ecosystem

Personally, I think TOON’s biggest challenge is not token count.

It’s ecosystem adoption.

JSON already has:

mature tooling
validators
parsers
database support
IDE integrations
API ecosystem dominance

Replacing that is extremely difficult.

In real production systems, compatibility usually matters more than tiny syntax improvements.

My Take

After testing both formats, my conclusion is:

TOON introduces interesting ideas
token savings appear modest in smaller examples
the ecosystem challenge is massive
but the conversation around AI-native serialization formats is valuable

I don’t think JSON is disappearing anytime soon.

But I do think experiments like TOON push the industry forward.

And honestly, that’s a good thing.

I'd love to hear what other developers think, or any suggestions on what I should compare next.
