
Bridge ACE

The WebSocket Architecture That Makes Multi-Agent AI Actually Work

Most multi-agent frameworks coordinate through files, function calls, or HTTP requests. Bridge ACE uses a dedicated WebSocket server as the primary communication bus between AI agents. Here is why, and how.

Why Not Files?

Claude Code coordinates its Agent Teams through JSON files in ~/.claude/teams/. Agent A writes a message to a JSON file. Agent B polls the directory for new files. This works for simple coordination but has problems:

  • Latency: Polling intervals add 1-5 seconds of delay per message
  • Scalability: 10 agents polling a directory = filesystem pressure
  • Ordering: No guaranteed message ordering across agents
  • Discovery: Agents discover each other through file naming conventions
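
To make the latency point concrete, here is a minimal sketch of the polling loop a file-based scheme implies (the directory layout and message format are illustrative, not Claude Code's actual schema):

```python
import json
import os

def poll_inbox(inbox_dir: str, seen: set) -> list:
    """Scan a directory once for new JSON message files.

    A real agent would call this in a loop, sleeping between scans --
    that sleep interval is the 1-5 second latency floor described above,
    and every idle scan still hits the filesystem.
    """
    messages = []
    for name in sorted(os.listdir(inbox_dir)):  # sort for stable ordering
        if name.endswith(".json") and name not in seen:
            seen.add(name)
            with open(os.path.join(inbox_dir, name)) as f:
                messages.append(json.load(f))
    return messages
```

Multiply that loop by ten agents and the filesystem pressure and ordering problems above follow directly.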

Why Not HTTP?

You could run a REST API and have agents POST messages. But:

  • Still requires polling: Agents must repeatedly check for new messages
  • Connection overhead: Each request opens a new TCP connection
  • No push: The server cannot proactively notify agents

Why WebSocket?

Bridge ACE runs a WebSocket server on port 9112. Every agent maintains a persistent connection. Messages are pushed instantly.

Agent A ──ws──┐
Agent B ──ws──┤── WebSocket Server (:9112)
Agent C ──ws──┘
              │
         HTTP API (:9111) ←── Browser UI

Benefits:

  • Sub-millisecond delivery: Messages arrive instantly, no polling
  • Bidirectional: Server pushes to agents, agents push to server
  • Persistent connection: No connection setup overhead
  • Broadcast: One message can reach all agents simultaneously
  • Presence: Server knows immediately when an agent disconnects
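
The push model can be sketched in-process with asyncio queues standing in for connections (a simplification: the real server speaks the WebSocket protocol on :9112, but the delivery semantics are the same):

```python
import asyncio

class Bus:
    """In-memory stand-in for the WebSocket server: one queue per agent."""

    def __init__(self):
        self.agents: dict[str, asyncio.Queue] = {}

    def register(self, name: str) -> asyncio.Queue:
        # An agent "connects" and announces its identity.
        self.agents[name] = asyncio.Queue()
        return self.agents[name]

    async def send(self, to: str, msg: dict):
        if to == "all":  # broadcast reaches every agent at once
            for q in self.agents.values():
                await q.put(msg)
        else:            # direct agent-to-agent delivery
            await self.agents[to].put(msg)

async def demo():
    bus = Bus()
    inbox = bus.register("backend")
    bus.register("frontend")
    await bus.send("backend", {"from": "planner", "content": "API spec ready"})
    return await inbox.get()  # receiver is woken on delivery, no polling loop
```

The receiver awaits its queue and is resumed the moment a message lands, which is exactly what a persistent WebSocket connection buys over polling.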

Implementation Details

The WebSocket server (websocket_server.py) handles:

  1. Agent registration: Agents connect and announce their identity
  2. Heartbeat: 30-second ping/pong to detect dead connections
  3. Message routing: Direct messages (agent-to-agent) and broadcast
  4. Channel separation: Control, work, and scope channels
  5. UI relay: Browser clients receive all messages for real-time display
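
The routing decision itself (items 3 and 5) is simple enough to sketch as a pure function. This is not the actual code in websocket_server.py; the `ui:` name prefix for browser clients is an assumption made for illustration:

```python
def route(msg: dict, connections: dict) -> list:
    """Pick which connections receive a message.

    `connections` maps a client name to its connection object. The
    special recipient "all" broadcasts to every agent; clients whose
    name starts with "ui:" (assumed convention) receive every message
    so the browser can render the live feed.
    """
    recipients = set()
    if msg.get("to") == "all":
        recipients.update(n for n in connections if not n.startswith("ui:"))
    elif msg.get("to") in connections:
        recipients.add(msg["to"])
    # UI relay: browsers see all traffic regardless of addressing
    recipients.update(n for n in connections if n.startswith("ui:"))
    return [connections[n] for n in sorted(recipients)]
```

Channel separation is omitted here for brevity; in practice each message would also carry a `channel` field that receivers filter on.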

The MCP server (bridge_mcp.py) wraps this in clean tool calls:

# Agent sends a message
bridge_send(to='backend', content='API endpoint ready', channel='work')

# Agent receives messages (pushed via WebSocket, no polling)
bridge_receive()

# Agent broadcasts to all
bridge_send(to='all', content='Build complete', channel='work')

Real-Time Coordination Patterns

This architecture enables patterns that file-based systems cannot match:

Reactive chains: Agent A finishes → immediately notifies Agent B → Agent B starts → notifies Agent C. Total latency: milliseconds.

Parallel consensus: Three agents analyze the same problem. Each posts their assessment. A coordinator agent reads all three and makes a decision. All within seconds.

Live monitoring: The Fleet Management UI receives every message in real-time. You watch your agents coordinate like a team chat.
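
The reactive-chain pattern can be sketched with the same push primitive. Agent names and messages here are illustrative; the point is that each downstream agent is already waiting when its predecessor finishes:

```python
import asyncio

async def agent(name: str, inbox: asyncio.Queue, outbox, log: list):
    """Wait for a handoff message, do work, notify the next agent."""
    msg = await inbox.get()               # woken the instant the message arrives
    log.append(f"{name} starts after: {msg}")
    if outbox is not None:
        await outbox.put(f"{name} done")  # push the handoff, no polling

async def chain():
    a_out, b_out = asyncio.Queue(), asyncio.Queue()
    log: list = []
    # B and C are blocked on their inboxes before A even finishes
    tasks = [
        asyncio.create_task(agent("B", a_out, b_out, log)),
        asyncio.create_task(agent("C", b_out, None, log)),
    ]
    await a_out.put("A done")             # Agent A completes its step
    await asyncio.gather(*tasks)
    return log
```

End-to-end latency is bounded by scheduling, not by anyone's polling interval, which is why the whole chain settles in milliseconds.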

Open Source

The entire WebSocket architecture is in the repo. No external message broker needed.

git clone https://github.com/Luanace-lab/bridge-ide.git
cd bridge-ide && ./install.sh && ./Backend/start_platform.sh

Apache 2.0. Self-hosted.

GitHub: github.com/Luanace-lab/bridge-ide
