Saguaro Prole

Get OpenClaw working with Ollama Cloud (no server management)

The core issue was that OpenClaw’s built-in “Ollama” provider is designed specifically for local servers.
When you try to point it at a cloud URL, it tries to “discover” models using local-only commands,
which causes the Gateway service to crash.

The Solution: Use the OpenAI Provider

Instead of using the ollama provider type, we used the openai-completions type.
This tells OpenClaw to treat Ollama Cloud as a standard cloud API, bypassing the local discovery logic.
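Why that works: openai-completions just means plain Bearer-auth JSON POSTs to /chat/completions on the base URL — no local model discovery. Here's a rough stdlib sketch of the kind of request OpenClaw ends up sending (the model id and key are placeholders from the config below; this builds the request but doesn't send it):

```python
import json
import urllib.request

# Placeholders -- swap in your real key before actually sending anything.
BASE_URL = "https://ollama.com/v1"
API_KEY = "YOUR_OLLAMA_API_KEY"

payload = {
    "model": "kimi-k2.5:cloud",
    "messages": [{"role": "user", "content": "testing 123"}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)  # https://ollama.com/v1/chat/completions
# urllib.request.urlopen(req) would fire it for real -- needs a valid key.
```

Note the /v1 prefix: that's the OpenAI-compatible surface, as opposed to Ollama's native /api endpoints.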

Correct Configuration (~/.openclaw/openclaw.json)

Ensure your models.providers section looks like this:

"models": {
  "providers": {
    "ollama-cloud": {
      "baseUrl": "https://ollama.com/v1",
      "apiKey": "YOUR_OLLAMA_API_KEY",
      "api": "openai-completions",
      "models": [
        {
          "id": "kimi-k2.5:cloud",
          "name": "Kimi K2.5 Cloud",
          "contextWindow": 128000,
          "maxTokens": 4096
        },
        {
          "id": "minimax-m2.5:cloud",
          "name": "MiniMax M2.5 Cloud",
          "contextWindow": 128000,
          "maxTokens": 4096
        },
        {
          "id": "glm-5:cloud",
          "name": "GLM-5 Cloud",
          "contextWindow": 128000,
          "maxTokens": 4096
        }
      ]
    }
  }
}

Two things to note:

  • /v1 is what makes the endpoint OpenAI-compatible, as opposed to /api.
  • It's ollama-cloud instead of ollama because ollama is reserved for the local provider. Does it have to say ollama-cloud? No, write butts-cloud for all I care.
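If the gateway crashes right after you edit the file, a malformed config is the usual suspect. A quick parse-check of the fragment (trimmed to one model here — paste your own in) catches JSON typos and the two settings that matter most before you restart anything:

```python
import json

# The provider fragment from above, trimmed to one model for brevity.
fragment = """
{
  "models": {
    "providers": {
      "ollama-cloud": {
        "baseUrl": "https://ollama.com/v1",
        "apiKey": "YOUR_OLLAMA_API_KEY",
        "api": "openai-completions",
        "models": [
          {"id": "kimi-k2.5:cloud", "name": "Kimi K2.5 Cloud",
           "contextWindow": 128000, "maxTokens": 4096}
        ]
      }
    }
  }
}
"""
provider = json.loads(fragment)["models"]["providers"]["ollama-cloud"]
assert provider["baseUrl"].endswith("/v1"), "use the /v1 endpoint, not /api"
assert provider["api"] == "openai-completions", "api must be openai-completions"
print([m["id"] for m in provider["models"]])  # ['kimi-k2.5:cloud']
```

json.loads will throw on trailing commas and unclosed braces, which is exactly the kind of thing that takes the gateway down silently.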

Key Troubleshooting Tips

  • The “76B5208” Code: This is just the version number of OpenClaw (2026.1.30), not an error code. If you see it, the program is running!
  • The “Invalid Input” Error: This happens if the api field is set to anything other than openai-completions.
  • Gateway acting weird after the change: Stop it with openclaw gateway stop, then start it again with openclaw gateway so it picks up the new config.
  • Starting a Chat: Use the --session-id flag to give your conversation a name so OpenClaw knows where to send the message.

Example commands to start openclaw:

  • openclaw agent --message "testing 123" --session-id test-cloud (You can replace test-cloud with whatever)
  • openclaw tui

Sample Config

{
  "meta": {
    ... Bunch of shit you don't need to touch ...
  },
  "wizard": {
    ... More autogenerated stuff ...
  },
  "models": {
    "providers": {
      "ollama-cloud": {
        "baseUrl": "https://ollama.com/v1", 
        "apiKey": "<GET YOUR OLLAMA API KEY AND PUT IT HERE>",
        "api": "openai-completions",
        "models": [
          {
            "id": "kimi-k2.5:cloud",
            "name": "Kimi K2.5 Cloud",
            "contextWindow": 128000,
            "maxTokens": 4096
          },
          {
            "id": "minimax-m2.5:cloud",
            "name": "MiniMax M2.5 Cloud",
            "contextWindow": 128000,
            "maxTokens": 4096
          },
          {
            "id": "glm-5:cloud",
            "name": "GLM-5 Cloud",
            "contextWindow": 128000,
            "maxTokens": 4096
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollama-cloud/kimi-k2.5:cloud"
      },
      ...Whatever OpenClaw had already put here, keep it...
    }
  },
  "messages": {
    ... No touchy ...
  },
  "commands": {
    ... Look I just want you to know where your shit is supposed to go...
  },
  "hooks": {
   ...
  },
  "channels": {
    ... Some of these depend on the things you chose during onboarding ...
  },
  "gateway": {
    ...
  },
    "tailscale": {
      ...
    }
  },
  "skills": {
    ...
  },
  "plugins": {
   ...
  }
}
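One detail worth calling out in agents.defaults: primary is a provider/model-id reference. Assuming OpenClaw splits it on the first slash (an assumption based on the shape above, not documented behavior), the part before the slash has to match your provider key and the rest has to match an id in that provider's models list — so butts-cloud works fine as long as both sides agree:

```python
primary = "ollama-cloud/kimi-k2.5:cloud"

# Split on the FIRST slash only -- model ids contain ':' and could in
# principle contain '/' themselves.
provider_key, model_id = primary.split("/", 1)
print(provider_key)  # ollama-cloud
print(model_id)      # kimi-k2.5:cloud

# Ids configured under models.providers["ollama-cloud"].models above.
configured = ["kimi-k2.5:cloud", "minimax-m2.5:cloud", "glm-5:cloud"]
assert model_id in configured, "primary points at a model you never configured"
```

A mismatch on either side of the slash is a likely cause of "Invalid Input"-style errors at startup.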

crossposted on GitHub: https://gist.github.com/S4GU4R0/f7fc15eb5deeb6bd0b84459b22f7fc23 by Saguaro Prole
