Benchmarking Agent Communication: JSON vs Binary Encoding
If you're building multi-agent systems, your agents are probably talking in JSON. And for complex messages, they may be wasting up to 95% of their bandwidth doing it.
I ran benchmarks comparing standard JSON messaging against fixed-size binary encoding for agent-to-agent communication. The results changed how I build agent systems.
The Problem
A typical agent observation message in JSON:
```json
{
  "action": "observe",
  "domain": "market",
  "asset": "BTC",
  "confidence": 0.85,
  "timeframe": "present",
  "source": "exchange_feed",
  "metadata": {
    "price": 70588,
    "volume_24h": 28500000000,
    "change_24h": -2.3
  }
}
```
That's ~250 bytes for a simple observation. Complex messages with nested data, arrays, and multiple fields easily hit 5,000-50,000 bytes.
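You can sanity-check those sizes yourself. A quick sketch measuring the example message above in both pretty-printed and compact form (the payload values are copied from the example; everything else is stdlib):

```python
import json

msg = {
    "action": "observe",
    "domain": "market",
    "asset": "BTC",
    "confidence": 0.85,
    "timeframe": "present",
    "source": "exchange_feed",
    "metadata": {"price": 70588, "volume_24h": 28_500_000_000, "change_24h": -2.3},
}

pretty = json.dumps(msg, indent=2)                # how it often travels in dev setups
compact = json.dumps(msg, separators=(",", ":"))  # whitespace stripped
print(len(pretty), len(compact))  # pretty lands near the ~250-byte figure above
```

Compact encoding shaves the whitespace but keeps every repeated key name, quote, and brace on the wire.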
In a 10-agent system where each agent broadcasts one message per second to its peers, that's on the order of 500KB/sec of JSON being serialized, transmitted, parsed, and garbage collected. Scale to 100 agents and the full-mesh traffic grows quadratically: roughly 50MB/sec of redundant text.
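For concreteness, one reading of those traffic numbers is a full-mesh topology with ~5.5KB average messages (my assumption, chosen to make the arithmetic line up); the quadratic growth falls out directly:

```python
AVG_MSG_BYTES = 5_500  # assumed average JSON message size

def mesh_bandwidth(n_agents: int, msgs_per_sec: float = 1.0) -> float:
    """Bytes/sec when every agent broadcasts to every other agent."""
    return n_agents * (n_agents - 1) * msgs_per_sec * AVG_MSG_BYTES

print(mesh_bandwidth(10))   # 495,000 bytes/sec (~500KB/s)
print(mesh_bandwidth(100))  # 54,450,000 bytes/sec (~50MB/s)
```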
The Alternative: Fixed-Size Binary
What if every message was exactly 2,556 bytes, regardless of complexity?
```python
from bytepack import encode

# Simple message: 2,556 bytes
result = encode({"action": "observe", "domain": "market"})

# Complex message: still 2,556 bytes
result = encode({
    "action": "correlate",
    "domain": "energy",
    "asset": "crude_oil",
    "confidence": "high",
    "timeframe": "4h",
    "related_domain": "geopolitics",
    "signal_strength": 0.92,
})
```
Both encode to exactly 2,556 bytes. The encoding uses hyperdimensional computing — concepts are mapped to high-dimensional binary vectors, combined via bind/bundle/permute operations, and compressed to a fixed-size output.
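The bind/bundle/permute primitives aren't exotic. Here's a toy sketch with binary vectors: bind is XOR, bundle is a bitwise majority vote, permute is rotation. The dimension, vocabulary, and compression step of the real encoder are assumptions I haven't verified; this only shows the mechanism.

```python
import random

D = 10_000                      # dimensionality; HDC systems typically use ~10k bits
rng = random.Random(42)

def rand_hv():
    return [rng.randint(0, 1) for _ in range(D)]

def bind(a, b):                 # XOR: associate two vectors (self-inverse)
    return [x ^ y for x, y in zip(a, b)]

def bundle(*vs):                # bitwise majority vote: superpose several vectors
    return [1 if sum(bits) * 2 > len(vs) else 0 for bits in zip(*vs)]

def permute(v, k=1):            # rotation: encodes sequence position
    return v[-k:] + v[:-k]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Encode three role/filler pairs from the complex message above
vocab = {w: rand_hv() for w in
         ["action", "correlate", "domain", "energy", "asset", "crude_oil"]}
msg = bundle(bind(vocab["action"], vocab["correlate"]),
             bind(vocab["domain"], vocab["energy"]),
             bind(vocab["asset"], vocab["crude_oil"]))

# Query: unbinding the "action" role leaves something close to "correlate"
probe = bind(msg, vocab["action"])
print(hamming(probe, vocab["correlate"]))  # well below D/2: a match
print(hamming(probe, vocab["crude_oil"]))  # near D/2: unrelated noise
```

Decoding is a nearest-neighbor search over the vocabulary, which is also where the noise tolerance below comes from.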
Benchmark Setup
- Messages: 10,000 random structured observations across 9 domains
- Fields per message: 3-12 (randomized)
- Machine: 4 vCPU, 8GB RAM (standard cloud instance)
- Measured: size, throughput, accuracy, noise tolerance
Results
| Metric | JSON | Binary Encoding | Improvement |
|---|---|---|---|
| Avg message size | 12,400 bytes | 2,556 bytes | 4.9x smaller |
| Max message size | 48,200 bytes | 2,556 bytes | 18.9x smaller |
| Size variance | High (σ = 8,100) | Zero | Fixed size |
| Encode throughput | N/A | 2,614 msg/sec | — |
| Noise tolerance | 0% | 25% | Survives corruption |
| Decode accuracy | 100% | 97.2% | Acceptable |
The key insight: the compression ratio improves with message complexity. Simple messages get ~5x compression. Complex messages with nested objects get 15-20x compression. The encoding is fixed-size, so more complex input = better ratio.
Noise Tolerance
This was unexpected. Binary encoded messages tolerate 25% bit corruption and still decode correctly. You can flip 5,000 of the 20,000 payload bits and the message reconstructs.
This matters for real-world agent systems — unreliable networks, UDP transport, edge devices with spotty connectivity. JSON corrupted by a single bit is unreadable. Binary encoding degrades gracefully.
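The reason a nearest-match decoder can shrug off that much corruption: flipping 25% of the bits moves a vector only a quarter of the way toward randomness, so it's still unambiguous which codeword it started as. A pure-Python sketch (not the actual decoder):

```python
import random

D = 20_000                       # roughly the payload size in bits
rng = random.Random(7)

original = [rng.randint(0, 1) for _ in range(D)]
unrelated = [rng.randint(0, 1) for _ in range(D)]

# Flip 25% of the bits, as in the corruption test
noisy = original[:]
for i in rng.sample(range(D), D // 4):
    noisy[i] ^= 1

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

print(hamming(noisy, original))   # exactly D/4: still far closer...
print(hamming(noisy, unrelated))  # ...than to any unrelated vector (~D/2)
```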
Fixed Size = Simpler Systems
When every message is exactly 2,556 bytes:
- Buffer allocation is trivial — no dynamic sizing
- Bandwidth planning is exact — N agents × rate × 2,556
- Transport-agnostic — works over HTTP, WebSocket, UDP, raw TCP
- Stream processing is simple — read 2,556 bytes = one complete message
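That last point fits in a few lines: with no length prefix or delimiter, framing collapses to a fixed-size read. A sketch, where the 2,556-byte size is the only detail carried over from above:

```python
import io

MSG_SIZE = 2556  # every encoded message is exactly this many bytes

def read_messages(stream):
    """Yield complete messages; no length prefix or delimiter needed."""
    while True:
        chunk = stream.read(MSG_SIZE)
        if len(chunk) < MSG_SIZE:
            break            # end of stream: a short read means no full message
        yield chunk

# Simulate a wire carrying three back-to-back messages
wire = io.BytesIO(b"\x01" * MSG_SIZE + b"\x02" * MSG_SIZE + b"\x03" * MSG_SIZE)
messages = list(read_messages(wire))
print(len(messages))  # 3
```

The same loop works unchanged over a socket, a file, or a pipe.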
Integration
The bytepack library wraps the encoding service with integrations for major frameworks:
```bash
pip install bytepack
```

```python
# CrewAI
from bytepack.integrations.crewai import BinaryEncodeTool
agent = Agent(tools=[BinaryEncodeTool()])

# LangGraph
from bytepack.integrations.langgraph import encode_node
graph.add_node("compress", encode_node)

# AutoGen
from bytepack.integrations.autogen import register_bytepack_tools
register_bytepack_tools(agent)
```
Also available as an MCP tool (any Claude/GPT agent can call it) and an A2A agent (discoverable via standard agent card).
When to Use Binary Encoding
✅ Multi-agent systems with high message volume
✅ Real-time streaming between agents
✅ Bandwidth-constrained environments (edge, mobile, IoT)
✅ Cross-framework communication (crewAI ↔ LangGraph ↔ AutoGen)
✅ When you need transport-agnostic messaging
❌ Human-readable logging (use JSON for that, encode for wire)
❌ Messages that need to be inspected in transit
❌ Systems where 97.2% accuracy isn't sufficient
Try It
```python
from bytepack import encode, health

# Check service
print(health())  # {"up": true, ...}

# Encode something
result = encode({"action": "observe", "domain": "market", "asset": "BTC"})
print(f"{result['s']} bytes")  # 2556
```
The service is free and open. Source: github.com/wirepack/bytepack
Measured March 2026. Benchmarks may vary with payload complexity and network conditions.