Every month there's a new "unified AI API" — one SDK to rule them all. We looked at all of them. We built something different. Here's why.
The wrapper problem
API wrappers are convenient until they're not. You're still depending on the same 3-4 providers. If OpenAI has an outage, your wrapper goes down with it. If Anthropic raises prices, you eat it. If a provider decides your use case violates their ToS, you're out. You traded one lock-in for a slightly more polished version of the same lock-in.
We wanted something that actually couldn't be shut down or deplatformed. That meant going peer-to-peer.
What we built
Antseed is a P2P AI services network. Think TCP/IP for AI inference: a protocol, not a platform. You run a local daemon that proxies requests on localhost, and your apps only ever talk to that. The protocol routes each request to whoever can serve it best, based on price, latency, and reputation.
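To make that concrete, here's a minimal sketch of what a price/latency/reputation routing decision could look like. The field names, weights, and sample providers are all illustrative assumptions, not Antseed's actual scoring function:

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    price_per_1k_tokens: float  # USD, set by the provider
    latency_ms: float           # recent p50 latency to this node
    reputation: float           # 0.0-1.0, earned from past completions

def score(p: Provider) -> float:
    # Lower is better: cheap, fast, trustworthy.
    # The weights here are made up for illustration.
    return p.price_per_1k_tokens * 10 + p.latency_ms / 100 - p.reputation * 5

def pick_provider(providers: list[Provider]) -> Provider:
    # Route to the best-scoring provider for this request.
    return min(providers, key=score)

providers = [
    Provider("gpu-gamer", 0.10, 400, 0.92),
    Provider("mac-mini", 0.05, 900, 0.88),
    Provider("farm-01", 0.20, 120, 0.99),
]
print(pick_provider(providers).name)  # → farm-01
```

In practice you'd want the weights tunable per request (a batch job cares about price, an interactive agent cares about latency), but the shape of the decision is the same.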
Providers can be anyone: a gamer with a spare GPU, a dev with a Mac Mini, a dedicated inference farm, or a TEE node for privacy-sensitive workloads. They register, set their price, and compete on merit.
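A provider's registration might carry a record along these lines. Every field name and value here is a hypothetical sketch, not the protocol's real schema:

```python
# Illustrative provider registration record (fields are assumptions).
provider_record = {
    "pubkey": "ed25519-public-key-here",  # identity used to sign served results
    "models": ["llama-3.1-8b"],           # what this node can serve
    "price_per_1k_tokens": 0.05,          # USD, chosen by the provider
    "tee_attestation": None,              # attestation quote if this is a TEE node
}
```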
Why this matters more than another wrapper
A centralized router can ban you, throttle you, or just go down. A protocol can't. When we route a request, there's no single server that can fail. If a provider drops off, the network reroutes automatically.
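The failover logic is simple to sketch: try providers in ranked order, and a dead node just means moving to the next one. The stubbed-out network call and provider names below are hypothetical stand-ins:

```python
class ProviderDown(Exception):
    """Raised when a provider can't be reached."""

def call_provider(name: str, prompt: str) -> str:
    # Stand-in for a real network call; here the first provider "fails"
    # so we can exercise the reroute path.
    if name == "gpu-gamer":
        raise ProviderDown(name)
    return f"response from {name}"

def route_with_failover(providers: list[str], prompt: str) -> str:
    # Walk the ranked provider list; any failure triggers a reroute.
    last_err = None
    for name in providers:
        try:
            return call_provider(name, prompt)
        except ProviderDown as e:
            last_err = e  # provider dropped off, try the next one
    raise RuntimeError("no providers available") from last_err

print(route_with_failover(["gpu-gamer", "mac-mini"], "hi"))
# prints "response from mac-mini"
```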
The economics are also different. In a genuinely competitive marketplace, inference prices get driven toward actual cost, with no margin stacking from middlemen.
Where we are
Phase 1 is live: commodity inference, price/latency routing, automatic failover. We're running on our own network (dogfooding it hard). Phase 2 is differentiated services: providers with specialized capabilities. Phase 3 is agent-to-agent commerce: machines hiring machines.
If you're building on AI infrastructure and tired of being one ToS change away from a bad day, check out antseed.com. We're early but the protocol is real.
Happy to answer questions about the routing logic, provider reputation system, or TEE integration in the comments.