The core issue was that OpenClaw’s built-in “Ollama” provider is designed specifically for local servers.
When you point it at a cloud URL, it tries to “discover” models using local-only commands,
which crashes the Gateway service.
The Solution: Use the OpenAI Provider
Instead of using the `ollama` provider type, we used the `openai-completions` type.
This tells OpenClaw to treat Ollama Cloud as a standard cloud API, bypassing the local discovery logic.
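To see what “treating it as a standard cloud API” means in practice, here is a minimal sketch of the kind of request an OpenAI-compatible client sends. The `/chat/completions` path and the `Bearer` auth header are assumptions based on the OpenAI-compatible convention, not something pulled from OpenClaw's source:

```python
import json
import urllib.request

def chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request against an
    OpenAI-compatible base URL (e.g. https://ollama.com/v1)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",  # /v1 lives in base_url, as in the config below
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = chat_request("https://ollama.com/v1", "YOUR_OLLAMA_API_KEY",
                   "kimi-k2.5:cloud", "testing 123")
# urllib.request.urlopen(req) would actually send it; omitted so the sketch stays offline.
```

No model “discovery” step, no local-only commands: just a normal authenticated HTTPS request, which is exactly what the local `ollama` provider type fails to do.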
Correct Configuration (`~/.openclaw/openclaw.json`)
Ensure your models.providers section looks like this:
- `/v1` is what makes it OpenAI-compatible, as opposed to `/api`.
- `ollama-cloud` instead of `ollama`, because `ollama` is reserved. Does it have to say `ollama-cloud`? No, write `butts-cloud` for all I care.
"models": {
"providers": {
"ollama-cloud": {
"baseUrl": "https://ollama.com/v1",
"apiKey": "YOUR_OLLAMA_API_KEY",
"api": "openai-completions",
"models": [
{
"id": "kimi-k2.5:cloud",
"name": "Kimi K2.5 Cloud",
"contextWindow": 128000,
"maxTokens": 4096
},
{
"id": "minimax-m2.5:cloud",
"name": "MiniMax M2.5 Cloud",
"contextWindow": 128000,
"maxTokens": 4096
},
{
"id": "glm-5:cloud",
"name": "GLM-5 Cloud",
"contextWindow": 128000,
"maxTokens": 4096
}
]
}
}
}
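Before restarting the gateway, it can save a round of “Invalid Input” errors to sanity-check the provider block first. This is a hypothetical helper, not part of OpenClaw; the field names are the ones from the config above:

```python
import json
from pathlib import Path

REQUIRED_FIELDS = {"baseUrl", "apiKey", "api"}

def check_provider(cfg: dict, name: str = "ollama-cloud") -> list[str]:
    """Return a list of problems with a cloud provider entry; empty means it looks OK."""
    provider = cfg.get("models", {}).get("providers", {}).get(name)
    if provider is None:
        return [f"provider {name!r} not found under models.providers"]
    problems = [f"missing field {f!r}" for f in sorted(REQUIRED_FIELDS - provider.keys())]
    if provider.get("api") != "openai-completions":
        problems.append('api must be "openai-completions", anything else is rejected')
    if not provider.get("baseUrl", "").endswith("/v1"):
        problems.append("baseUrl should end in /v1 (the OpenAI-compatible endpoint), not /api")
    return problems

# Typical use, assuming the config lives at the path from this post:
# check_provider(json.loads(Path.home().joinpath(".openclaw/openclaw.json").read_text()))
```

An empty list means the provider entry matches the shape shown above; anything else tells you which field to fix before you bother restarting.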
Key Troubleshooting Tips
- The “76B5208” Code: This is just the version number of OpenClaw (2026.1.30), not an error code. If you see it, the program is running!
- The “Invalid Input” Error: This happens if the `api` field is set to anything other than `openai-completions`.
- Using `openclaw` after this may act weird: Turn off the gateway with `openclaw gateway stop` and restart it with `openclaw gateway`.
- Starting a Chat: Use the `--session-id` flag to give your conversation a name so OpenClaw knows where to send the message.
Example commands to start `openclaw`:
- `openclaw agent --message "testing 123" --session-id test-cloud` (you can replace `test-cloud` with whatever)
- `openclaw tui`
Sample Config
```json
{
  "meta": {
    ... Bunch of shit you don't need to touch ...
  },
  "wizard": {
    ... More autogenerated stuff ...
  },
  "models": {
    "providers": {
      "ollama-cloud": {
        "baseUrl": "https://ollama.com/v1",
        "apiKey": "<GET YOUR OLLAMA API KEY AND PUT IT HERE>",
        "api": "openai-completions",
        "models": [
          {
            "id": "kimi-k2.5:cloud",
            "name": "Kimi K2.5 Cloud",
            "contextWindow": 128000,
            "maxTokens": 4096
          },
          {
            "id": "minimax-m2.5:cloud",
            "name": "MiniMax M2.5 Cloud",
            "contextWindow": 128000,
            "maxTokens": 4096
          },
          {
            "id": "glm-5:cloud",
            "name": "GLM-5 Cloud",
            "contextWindow": 128000,
            "maxTokens": 4096
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollama-cloud/kimi-k2.5:cloud"
      },
      ... Whatever OpenClaw had already put here, keep it ...
    }
  },
  "messages": {
    ... No touchy ...
  },
  "commands": {
    ... Look I just want you to know where your shit is supposed to go ...
  },
  "hooks": {
    ...
  },
  "channels": {
    ... Some of these depend on the things you chose during onboarding ...
  },
  "gateway": {
    ...
  },
  "tailscale": {
    ...
  },
  "skills": {
    ...
  },
  "plugins": {
    ...
  }
}
```
crossposted on GitHub: https://gist.github.com/S4GU4R0/f7fc15eb5deeb6bd0b84459b22f7fc23 by Saguaro Prole