
Mehmet Can

Run Ollama Models on Google Colab (Free, No Local GPU)

If you don’t have a local GPU but still want to experiment with LLMs, this project might help.

I built a minimal setup to run Ollama models directly on Google Colab with almost zero friction.

What this repo does
• Installs Ollama inside Colab
• Runs models like Llama, Qwen, DeepSeek, CodeLlama
• Exposes the API so you can connect external tools
• Keeps the setup simple and reproducible
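Under the hood, those steps boil down to installing Ollama, starting its server, and talking to its HTTP API. Here is a minimal Python sketch of that flow; the install command is Ollama's official script and `/api/generate` is Ollama's standard endpoint, but the port constant and helper names are my own assumptions, not taken from the repo:

```python
import json
import subprocess
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # Ollama's default port; assumed, not read from the repo

def start_ollama() -> None:
    """Install Ollama via the official script and start the server in the background."""
    subprocess.run("curl -fsSL https://ollama.com/install.sh | sh", shell=True, check=True)
    subprocess.Popen(["ollama", "serve"])  # leave the server running for the notebook session

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Assemble a POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local server and return the model's full reply."""
    with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Splitting request construction from sending makes it easy to inspect exactly what goes over the wire before pointing real tools at it.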

Why this exists

Most tutorials for running Ollama in Colab suffer from at least one of these problems:
• Overcomplicated
• Broken or outdated
• Missing key steps (like tunneling or API access)

This repo removes that friction and gives you a working setup in minutes.

Use cases
• Testing coding models
• Building quick AI tools
• Running agents
• Prompt engineering experiments
• Connecting Ollama to external apps via tunnel
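For that last use case, once the Colab API is tunneled to a public URL (e.g. via ngrok or cloudflared), an external app only needs to swap the base URL. A hedged sketch, with a placeholder hostname and helper names of my own:

```python
import json
import urllib.request

def endpoint(base_url: str) -> str:
    """Turn a tunnel base URL into Ollama's generate endpoint."""
    return base_url.rstrip("/") + "/api/generate"

def ask(base_url: str, model: str, prompt: str) -> str:
    """Query a tunneled Ollama instance from outside Colab."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        endpoint(base_url),
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Placeholder URL -- use whatever address your tunnel actually prints:
# print(ask("https://example.ngrok-free.app", "qwen2.5-coder", "Explain recursion"))
```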

How to use

Open the notebook and run the cells step by step.

That’s it.
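If you want a quick sanity check after the cells finish, Ollama's standard `/api/tags` endpoint lists the models the server has pulled. A small sketch (the parsing helper is mine, not from the repo):

```python
import json
import urllib.request

def model_names(tags_body: bytes) -> list:
    """Parse the JSON body of Ollama's /api/tags response into model names."""
    return [m["name"] for m in json.loads(tags_body)["models"]]

def list_models(base_url: str = "http://127.0.0.1:11434") -> list:
    """Ask a running Ollama server which models are available."""
    with urllib.request.urlopen(base_url + "/api/tags") as resp:
        return model_names(resp.read())
```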

Repo

https://github.com/0x1881/collama

If you have suggestions or improvements, feel free to contribute.
