Every OpenClaw installation has one core process running in the background: the Gateway. It's the engine your AI agent runs on — the WebSocket server, the routing layer, the session manager, the channel hub, and the control plane all rolled into one persistent process.
Most people set it up once and forget it exists. That's by design. But understanding how the Gateway actually works gives you real leverage: better config, faster debugging, more reliable deployments, and the ability to tune performance when it matters.
This is the deep dive I wish existed when I first set up OpenClaw.
What the Gateway Actually Is
The Gateway is a long-lived Node.js daemon that listens on a single multiplexed port (default: 18789). On that single port it handles:
- WebSocket control plane — the primary protocol for CLI, web UI, and mobile nodes
- HTTP APIs — OpenAI-compatible chat completions, Responses API, tools invoke
- Control UI — the built-in web dashboard at /openclaw
- Webhooks — incoming HTTP hooks from Gmail, Stripe, or any custom source
- Channel connections — Slack, Discord, Telegram, WhatsApp, Signal, iMessage, and more
Everything flows through this one process. Your CLI talks to it. Your phone node connects to it. Your cron jobs run inside it. Your channels receive messages through it. The agent itself runs as a session managed by it.
Start it with:
openclaw gateway --port 18789
# Verbose mode — streams debug logs to stdout
openclaw gateway --port 18789 --verbose
# Force-kill any existing listener on the port before starting
openclaw gateway --force
Check health at any time:
openclaw gateway status
openclaw gateway status --deep
openclaw status
A healthy Gateway returns: Runtime: running and RPC probe: ok.
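Before reaching for the CLI, you can also confirm that something is listening on the Gateway port at all. This is a minimal sketch using only the Python standard library; it assumes the default port 18789 from above and only checks TCP reachability, not Gateway health:

```python
import socket

def gateway_port_open(host="127.0.0.1", port=18789, timeout=1.0):
    """Return True if something is accepting TCP connections on the Gateway port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

print(gateway_port_open())
```

openclaw gateway status remains the authoritative check (it probes the RPC layer, not just the socket); this is only useful for quick scripting or container health probes.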
The Runtime Model
The Gateway operates with a clean separation between the control plane (how clients connect and issue commands) and the data plane (how messages and sessions flow).
At the core is the WebSocket protocol. All clients — CLI, web UI, mobile nodes, headless automation — connect over WebSocket and identify themselves at handshake time. The handshake flow:
- Gateway sends a challenge with a nonce
- Client signs the nonce with its device keypair
- Client sends a connect request declaring its role, scopes, and capabilities
- Gateway issues a hello-ok with a device token if auth passes
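The four steps above can be sketched as plain message-passing. Everything here is illustrative: the real protocol signs the nonce with a device keypair and has its own message shapes, so the field names (type, nonce, signature, deviceToken) and the HMAC shared-secret stand-in are assumptions made to keep the sketch self-contained:

```python
import hashlib
import hmac
import secrets

# Stand-in for the device keypair: a shared secret plus HMAC.
DEVICE_SECRET = b"example-device-secret"

def gateway_challenge():
    # Step 1: Gateway sends a challenge with a nonce.
    return {"type": "challenge", "nonce": secrets.token_hex(16)}

def client_connect(challenge, role="operator", scopes=("sessions",)):
    # Step 2: client signs the nonce.
    sig = hmac.new(DEVICE_SECRET, challenge["nonce"].encode(),
                   hashlib.sha256).hexdigest()
    # Step 3: client declares its role, scopes, and capabilities.
    return {"type": "connect", "nonce": challenge["nonce"], "signature": sig,
            "role": role, "scopes": list(scopes), "capabilities": []}

def gateway_verify(connect_msg):
    # Step 4: Gateway verifies the signature and issues hello-ok with a token.
    expected = hmac.new(DEVICE_SECRET, connect_msg["nonce"].encode(),
                        hashlib.sha256).hexdigest()
    if hmac.compare_digest(expected, connect_msg["signature"]):
        return {"type": "hello-ok", "deviceToken": secrets.token_hex(8)}
    return {"type": "hello-rejected"}

reply = gateway_verify(client_connect(gateway_challenge()))
print(reply["type"])  # hello-ok
```

The important property the sketch preserves: the client never sends a long-lived credential over the wire, only a signature over a one-time nonce, and the Gateway mints the device token only after verification.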
Two types of clients connect:
- Operators — control plane clients (CLI, web UI, automation scripts). They issue commands and receive events.
- Nodes — capability hosts (iOS/Android apps, headless node processes). They declare capabilities like camera, screen, canvas, and voice, and respond to commands from the Gateway.
Configuration: The Minimum You Need
The Gateway's config lives in ~/.openclaw/openclaw.json. The absolute minimum to get an agent running is:
{
"agent": { "workspace": "~/.openclaw/workspace" },
"channels": { "whatsapp": { "allowFrom": ["+15555550123"] } }
}
That's it. One workspace path and one channel allowlist. The Gateway fills in sensible defaults for everything else.
A more production-ready starter:
{
"identity": {
"name": "Clawd",
"theme": "helpful assistant",
"emoji": "🦞"
},
"agent": {
"workspace": "~/.openclaw/workspace",
"model": { "primary": "anthropic/claude-sonnet-4-5" }
},
"channels": {
"whatsapp": {
"allowFrom": ["+15555550123"],
"groups": { "*": { "requireMention": true } }
}
}
}
Hot Reload: No Restarts Required (Usually)
One of the Gateway's underrated features is hot config reload. When you edit openclaw.json and save, the Gateway detects the change and applies it — without dropping sessions or disconnecting channels.
The reload behavior is controlled by gateway.reload.mode:
- off — no reload; changes require a manual restart
- hot — apply only changes that are safe to hot-apply
- restart — restart the full process on any change
- hybrid (default) — hot-apply safe changes, restart only when required
hybrid is the right default for most setups. The Gateway knows which config sections are hot-safe (routing rules, message formatting, agent model) and which require a restart (port changes, auth mode, channel connection strings).
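In openclaw.json that is a single nested key; pinning the default explicitly looks like this:

```json
{
  "gateway": {
    "reload": { "mode": "hybrid" }
  }
}
```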
Auth and Security
By default, the Gateway binds to loopback only (gateway.bind: "loopback") — meaning it's only reachable from the same machine. This is the safe default.
When you expose it to the network (for Tailscale, remote nodes, or multi-device setups), auth is required:
{
"gateway": {
"mode": "local",
"port": 18789,
"bind": "loopback",
"auth": {
"mode": "token",
"token": "your-gateway-token",
"allowTailscale": true
},
"tailscale": { "mode": "serve", "resetOnExit": false }
}
}
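For token mode, the check a gateway typically performs on each incoming request is a constant-time comparison of the Bearer token. This is a hypothetical sketch, not OpenClaw's actual implementation; the header format follows the HTTP API example later in this article:

```python
import hmac

GATEWAY_TOKEN = "your-gateway-token"  # corresponds to gateway.auth.token

def authorized(header_value):
    """Validate an Authorization header against the configured token.

    Uses hmac.compare_digest, a constant-time comparison, so the check
    does not leak how many leading characters of a guess were correct.
    """
    prefix = "Bearer "
    if not header_value or not header_value.startswith(prefix):
        return False
    return hmac.compare_digest(header_value[len(prefix):], GATEWAY_TOKEN)

print(authorized("Bearer your-gateway-token"))  # True
print(authorized("Bearer wrong-token"))         # False
```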
Multi-Gateway Setups
You can run multiple Gateways for different workspaces, teams, or projects. Each binds to a different port and has its own config.
A remote Gateway setup lets you run the Gateway on a server and connect from anywhere:
{
"gateway": {
"remote": {
"url": "ws://gateway.tailnet:18789",
"token": "remote-token"
}
}
}
This is how you get truly persistent AI agents that run on a VPS or home server — the agent lives remotely, but you control it from your laptop.
Cron, Sessions, and the Agent Runtime
Session Store
Sessions are persisted to disk. Each conversation gets its own session key. Session maintenance is configurable:
{
"session": {
"reset": { "mode": "daily", "atHour": 4, "idleMinutes": 60 },
"maintenance": {
"mode": "warn",
"pruneAfter": "30d",
"maxEntries": 500
}
}
}
Cron Scheduler
The Gateway includes a built-in cron scheduler. Jobs survive restarts:
{
"cron": {
"enabled": true,
"store": "~/.openclaw/cron/cron.json",
"maxConcurrentRuns": 2,
"sessionRetention": "24h"
}
}
Heartbeat
The heartbeat system fires on a schedule and keeps your agent proactive:
{
"agent": {
"heartbeat": {
"every": "30m",
"model": "anthropic/claude-sonnet-4-5",
"target": "last",
"to": "+15555550123",
"prompt": "HEARTBEAT"
}
}
}
Logging and Diagnostics
Logging is configured in the same openclaw.json:
{
"logging": {
"level": "info",
"file": "/tmp/openclaw/openclaw.log",
"consoleLevel": "info",
"consoleStyle": "pretty",
"redactSensitive": "tools"
}
}
Live log tailing:
openclaw logs --follow
openclaw logs --follow --level debug
For deeper diagnostics: openclaw doctor
The OpenAI-Compatible HTTP API
The Gateway exposes an OpenAI-compatible HTTP API. Any tool or library that talks to OpenAI can talk to your OpenClaw agent instead:
POST http://localhost:18789/v1/chat/completions
Authorization: Bearer your-gateway-token
{
"model": "anthropic/claude-sonnet-4-5",
"messages": [{"role": "user", "content": "What's on my plate today?"}]
}
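To see what "any OpenAI client works" means in practice, here is the same request built with only the Python standard library. The URL, token placeholder, and payload come straight from the snippet above; sending it requires a running Gateway, so this sketch stops at constructing the request:

```python
import json
import urllib.request

def build_chat_request(base_url, token, prompt):
    """Build an OpenAI-style chat completion request for the Gateway."""
    body = json.dumps({
        "model": "anthropic/claude-sonnet-4-5",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("http://localhost:18789", "your-gateway-token",
                         "What's on my plate today?")
print(req.full_url)  # http://localhost:18789/v1/chat/completions

# With the Gateway running, send it like any other HTTP request:
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-shaped, the official openai client library also works by pointing its base URL at the Gateway instead of api.openai.com.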
Putting It Together
The Gateway isn't just a server — it's the operating environment for your AI agent. Sessions, memory, channels, cron, webhooks, sub-agents, approval workflows — it all runs inside this one process.
Understanding how it works means you can:
- Debug connectivity issues without guessing
- Configure multi-device and remote setups correctly
- Tune session management for your usage pattern
- Use the HTTP API to integrate with existing tools
- Hot-reload config without losing sessions mid-conversation
Most people treat the Gateway as a black box. That's fine — it's designed to be set-and-forget. But when you need more, the full control plane is there.
Originally published at openclawplaybook.ai. Get The OpenClaw Playbook — $9.99