Want to use a Claude-like coding assistant without paying API costs?
In this guide, I’ll show you, step by step, how to run one locally using Ollama and Claude Code.
🧠 What You’ll Build
- Local LLM setup (no API cost)
- Claude-style coding assistant
- Fully offline workflow (optional)
⚙️ Step-by-Step Setup
1. Install Ollama
Ollama is the engine that runs LLMs locally.
curl -fsSL https://ollama.com/install.sh | sh
Verify:
ollama -v
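Beyond the version check, you can confirm the Ollama server itself is up. The port (11434) and the /api/tags endpoint below are Ollama's defaults; this is just a quick sketch of a health check:

```shell
# Check that the Ollama server responds on its default port (11434).
# /api/tags lists the models you have pulled locally.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama server is running"
else
  echo "Ollama server is not reachable"
fi
```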
2. Install Claude CLI
Follow:
https://code.claude.com/docs/en/quickstart
If you see a "claude is not installed" error later, you missed this step.
3. Install Claude Code
npm install -g @anthropic-ai/claude-code
4. Pull a Coding Model
Run:
ollama pull qwen2.5-coder:7b
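Once the pull finishes, you can verify the model is actually available locally. This check is a sketch that degrades gracefully if ollama isn't on your PATH yet:

```shell
# Verify the coder model was pulled (guarded so it fails with a clear message)
if command -v ollama >/dev/null 2>&1; then
  if ollama list | grep -q "qwen2.5-coder"; then
    echo "model present"
  else
    echo "model missing"
  fi
else
  echo "ollama not installed"
fi
```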
5. Configure Environment
Add to your ~/.bashrc or ~/.zshrc:
export PATH="$HOME/.local/bin:$PATH"
export ANTHROPIC_BASE_URL=http://localhost:11434/v1
Reload:
source ~/.bashrc
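You can confirm the override took effect in the reloaded shell. The expected value below assumes the exact export from this step:

```shell
# Confirm Claude Code will be pointed at the local Ollama server
export ANTHROPIC_BASE_URL=http://localhost:11434/v1   # same value as above
echo "Base URL: $ANTHROPIC_BASE_URL"
```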
6. Launch Claude
ollama launch claude
Now Claude will use your local Ollama model.
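If you want to sanity-check the wiring without launching the full CLI, you can hit Ollama's OpenAI-compatible chat endpoint directly. This is a rough sketch that assumes the server and the qwen2.5-coder:7b model from the earlier steps:

```shell
# Send one chat request straight to Ollama's OpenAI-compatible API.
# Guarded so it prints a clear message when the server isn't up.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  curl -s http://localhost:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "qwen2.5-coder:7b", "messages": [{"role": "user", "content": "Write hello world in Go"}]}'
else
  echo "Ollama server is not reachable"
fi
```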
Free Claude Ready to Use
During my exploration, the model sometimes got confused and its answers were not what I wanted.
🔄 Try Alternative Models
ollama pull glm-5:cloud
Then:
ollama launch claude
Choose the new model. In my testing, it was more precise and answered questions well.
💻 Example Prompts
Based on the current project's purpose, which is to explain the Claude playground, I want you to trigger the technical architect skill here to create a Go boilerplate that is simple, concise, and easy to learn. Please also include a README file for this project. Before you start implementing, create a terminal wireframe in ASCII art so I can see exactly what this playground will look like.
If you want to see the conversation history and the project created by Free Claude, you can check the GitHub link below.