Aashish Chaurasiya for Vultr

Posted on • Originally published at docs.vultr.com

Deploying OpenClaw on Ubuntu 26.04

OpenClaw is a self-hosted autonomous AI agent platform that connects to WhatsApp, Telegram, Slack, and Discord through a unified Gateway control plane. It maintains persistent memory across sessions and supports tool execution with any OpenAI-compatible model provider, including Vultr Serverless Inference. This guide deploys OpenClaw using the interactive setup wizard and exposes the control UI securely over HTTPS with Caddy. By the end, you'll have OpenClaw running with a live gateway and the control interface accessible at your domain.


Install Docker

The official Docker repository gives you the most current Docker Engine builds.

1. Install dependency packages:

$ sudo apt install apt-transport-https ca-certificates curl git -y

2. Create the APT keyrings directory and add Docker's GPG key:

$ sudo install -m 0755 -d /etc/apt/keyrings
$ sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
$ sudo chmod a+r /etc/apt/keyrings/docker.asc

3. Register the Docker repository:

$ echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

4. Refresh APT and install Docker:

$ sudo apt update
$ sudo apt install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin -y

5. Add your user to the docker group so you can run Docker commands without sudo:

$ sudo usermod -aG docker $USER
$ newgrp docker

Note that newgrp applies the new group only in the current shell; log out and back in for it to take effect in all future sessions.
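To confirm the installation worked, print the client version and run the standard hello-world image (this assumes the group change above has taken effect in your shell):

```shell
# Verify the installation: the version command checks the CLI, and the
# hello-world container checks that the daemon is reachable without sudo.
docker --version || echo "docker CLI not found"
docker run --rm hello-world || echo "daemon not reachable; log out and back in first"
```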

Set Up OpenClaw

1. Create the project directory:

$ mkdir -p ~/openclaw-assistant
$ cd ~/openclaw-assistant

2. Clone the OpenClaw repository:

$ git clone https://github.com/openclaw/openclaw.git
$ cd openclaw

3. Run the interactive setup wizard:

$ ./docker-setup.sh

The wizard guides you through:

  • Model provider selection (Anthropic, OpenAI, Google, or Vultr Serverless Inference)
  • Channel configuration (Slack, Discord, Telegram, or WhatsApp)
  • Channel allowlists and skill setup
  • Gateway startup

4. Verify the gateway is running:

$ docker compose logs openclaw-gateway

A line containing [gateway] listening on ws://0.0.0.0:18789 confirms the gateway is active.
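If you are scripting the deployment, a small polling helper can block until that log line appears. This is a sketch: wait_for_gateway is a hypothetical name, and it assumes the compose service is called openclaw-gateway as shown above.

```shell
# Poll the gateway logs until the listening line appears, giving up
# after roughly a minute (30 attempts, 2 seconds apart).
wait_for_gateway() {
  for _ in $(seq 1 30); do
    if docker compose logs openclaw-gateway 2>/dev/null | grep -q 'listening on ws://'; then
      echo "gateway is up"
      return 0
    fi
    sleep 2
  done
  echo "gateway did not start in time" >&2
  return 1
}
```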


Access the Control UI

Option A: SSH Tunnel (development)

1. Allow insecure local auth in the configuration:

$ nano ~/.openclaw/openclaw.json

Find the gateway block and add:

"controlUi": {
    "allowInsecureAuth": true
}
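After the edit, the surrounding gateway block might look like the following. This is only a sketch: the port field here mirrors the default 18789 noted earlier and is illustrative, and any other fields the setup wizard generated should be left in place.

```json
{
    "gateway": {
        "port": 18789,
        "controlUi": {
            "allowInsecureAuth": true
        }
    }
}
```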

2. Restart the gateway:

$ docker compose restart openclaw-gateway

3. Open an SSH tunnel from your local machine:

$ ssh -N -L 18789:127.0.0.1:18789 linuxuser@YOUR-SERVER-IP

Open http://localhost:18789/?token=YOUR-TOKEN in a browser.


Option B: HTTPS with Caddy (production)

1. Install Caddy:

$ curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' | sudo gpg --dearmor -o /usr/share/keyrings/caddy-stable-archive-keyring.gpg
$ curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' | sudo tee /etc/apt/sources.list.d/caddy-stable.list
$ sudo apt update
$ sudo apt install caddy -y

2. Configure the Caddyfile:

$ sudo nano /etc/caddy/Caddyfile

Replace the default contents with the following, substituting your own domain:

openclaw.example.com {
    reverse_proxy localhost:18789
}
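Before reloading, you can optionally check the file for syntax errors with Caddy's built-in validator (this assumes caddy is on your PATH, as installed above):

```shell
# Validate the Caddyfile; validation exits non-zero on syntax errors
if command -v caddy >/dev/null 2>&1; then
  caddy validate --config /etc/caddy/Caddyfile
else
  echo "caddy not installed"
fi
```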

3. Open the firewall and reload Caddy:

$ sudo ufw allow 80/tcp
$ sudo ufw allow 443/tcp
$ sudo systemctl reload caddy

Open https://openclaw.example.com/?token=YOUR-TOKEN in a browser.


Configure Vultr Serverless Inference

To use Vultr's OpenAI-compatible inference endpoint as the model backend:

1. Edit the OpenClaw configuration:

$ nano ~/.openclaw/openclaw.json

Add the Vultr provider block under models.providers:

"vultr": {
    "baseUrl": "https://api.vultrinference.com/v1",
    "apiKey": "YOUR-VULTR-API-KEY",
    "api": "openai-completions",
    "models": [
        { "id": "moonshotai/Kimi-K2.5", "name": "Kimi-K2.5" }
    ]
}
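Because the endpoint is OpenAI-compatible, you can smoke-test it with curl before restarting the gateway. This is a sketch: it assumes your key is exported as VULTR_API_KEY (a variable name chosen here for illustration), and it skips the request when the key is not set so the snippet is safe to paste as-is.

```shell
# Send a one-message chat completion to the Vultr inference endpoint,
# using the same model id registered in openclaw.json above.
PAYLOAD='{"model":"moonshotai/Kimi-K2.5","messages":[{"role":"user","content":"ping"}]}'
if [ -n "${VULTR_API_KEY:-}" ]; then
  curl -s https://api.vultrinference.com/v1/chat/completions \
    -H "Authorization: Bearer $VULTR_API_KEY" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"
else
  echo "VULTR_API_KEY not set; skipping request"
fi
```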

2. Restart the gateway:

$ docker compose restart openclaw-gateway

3. Select the model from a connected channel:

/model vultr/moonshotai/Kimi-K2.5

Next Steps

OpenClaw is running and accessible over HTTPS. From here you can:

  • Connect additional messaging channels through the control UI
  • Use /compact and /think commands to manage session memory
  • Back up persistent memory with tar -czvf openclaw-backup.tar.gz ~/.openclaw
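The backup command can be paired with a restore step on a new server. A minimal sketch, assuming all persistent state lives under ~/.openclaw:

```shell
# Archive the state directory relative to $HOME so it restores cleanly
# to any user's home directory.
if [ -d "$HOME/.openclaw" ]; then
  tar -czvf openclaw-backup.tar.gz -C "$HOME" .openclaw
else
  echo "no ~/.openclaw directory found"
fi
# On the new server, unpack into the home directory:
# tar -xzvf openclaw-backup.tar.gz -C "$HOME"
```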

If you prefer a one-click setup, OpenClaw is also available as a Vultr Marketplace app with no manual configuration required. See the Marketplace deployment guide for details.

For the full guide with additional tips, visit the original article on Vultr Docs.
