Sushan
Running Claude Code with Ollama models (Local / Cloud)


This guide shows how to run Claude Code using Ollama, allowing you to use local models, cloud models, or any Ollama-supported model directly from your terminal.


Prerequisites

Make sure the following tools are installed:

  • Ollama
  • Claude Code

Install Ollama

If Ollama is not installed, you can install it using the commands below.

You can also follow this guide:

https://dev.to/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d

Windows (PowerShell)

irm https://ollama.com/install.ps1 | iex

macOS / Linux

curl -fsSL https://ollama.com/install.sh | sh

Verify installation:

ollama --version

Install Claude Code

Windows (PowerShell)

irm https://claude.ai/install.ps1 | iex

macOS / Linux

curl -fsSL https://claude.ai/install.sh | bash

Verify installation:

claude --version
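Before launching, it can help to confirm that both CLIs are actually on your PATH. The `check_tool` helper below is a hypothetical convenience function, not part of either tool:

```shell
# Hypothetical preflight helper: reports whether a CLI is on the PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: missing"
  fi
}

check_tool ollama
check_tool claude
```

If either line prints "missing", revisit the corresponding install step above before continuing.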

Running Claude Code with Ollama

Once both tools are installed, you can start Claude Code through Ollama.

The commands work the same on Windows, macOS, and Linux.


Option 1: Launch and Select a Model

Run the command:

ollama launch claude

This will open a model selection menu where you can choose a model using the arrow keys.

[Screenshot: Ollama model selection menu]


Option 2: Launch with a Specific Model

You can also specify the model directly.

Example:

ollama launch claude --model kimi-k2.5:cloud

Other examples:

ollama launch claude --model llama3
ollama launch claude --model qwen2.5

Replace the model name with any model available in your Ollama environment.
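If you are not sure which names are valid on your machine, Ollama's standard model-management commands can show what is installed and fetch new models before you launch:

```shell
ollama list            # show models already downloaded locally
ollama pull llama3     # download a model so it is available to launch
```

Local models must be pulled before use; cloud models (the `:cloud` suffix) run on Ollama's servers instead of your machine.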


Grant Folder Access

When Claude Code starts, it will ask for permission to access the current folder.

Select Yes to allow Claude to read and modify files in the directory.
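Because access is scoped to the directory you launch from, start Claude Code from your project root (the path below is only an example):

```shell
cd ~/projects/my-app    # example path; use your own project directory
ollama launch claude
```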

[Screenshot: granting Claude folder access]


Done

Claude Code will now start and connect to the selected model.

You can start interacting with your codebase immediately.

[Screenshot: Claude Code running]


Official Documentation

For more details, see the official docs:

https://docs.ollama.com/integrations/claude-code
