Introducing Google Colab + Ollama Setup — Run LLMs with a Free GPU and Access Them from Anywhere.
Turn Google Colab into your personal cloud LLM server and connect to it from your local machine.
Features:
- Run Ollama in the Cloud — Use Google Colab's free GPU
- Public Access — Expose your Ollama server to the internet
- 128K Context Length — Handle massive prompts
- Background Execution — Runs inside tmux session
- SSH Tunnel (Pinggy) — Instant public URL access
- Simple Setup — Just run notebook cells step-by-step
How It Works:
- Install dependencies + Ollama
- Start Ollama server in background
- Expose via SSH tunnel → Get public URL
- Connect from anywhere by pointing OLLAMA_HOST at the public URL
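Once the tunnel is up, any machine can talk to the server through Ollama's standard REST API. Here is a minimal sketch of the last step: it reads the public URL from the OLLAMA_HOST environment variable (the Pinggy URL printed by the notebook; the fallback and model name below are placeholders) and builds a request to the `/api/generate` endpoint, raising `num_ctx` to 131072 for the 128K context window.

```python
import json
import os
import urllib.request


def build_generate_request(host, model, prompt, num_ctx=131072):
    """Build a (url, body) pair for Ollama's /api/generate endpoint.

    `host` is the server's base URL (what you would put in OLLAMA_HOST);
    `num_ctx` raises the context window (128K tokens = 131072).
    """
    url = host.rstrip("/") + "/api/generate"
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a stream
        "options": {"num_ctx": num_ctx},
    }
    return url, json.dumps(payload).encode("utf-8")


if __name__ == "__main__":
    # OLLAMA_HOST would be the public Pinggy URL from the notebook output;
    # localhost is only a placeholder fallback.
    host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    url, body = build_generate_request(host, "llama3", "Hello!")
    print(url)
    # To actually send it (requires the server to be reachable):
    # req = urllib.request.Request(
    #     url, data=body, headers={"Content-Type": "application/json"}
    # )
    # print(json.load(urllib.request.urlopen(req))["response"])
```

The same effect can be had from the CLI with `OLLAMA_HOST=<public-url> ollama run <model>`, since the Ollama client honors that variable.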
Links:
Notebook/Repo: https://lnkd.in/gtVKNNWN
Let’s discuss if you’re interested!
Me: https://lnkd.in/gvJb5AeR
GitHub: https://lnkd.in/g-zYweC9
LinkedIn: https://lnkd.in/gM7F5Zrh
Discord: 0xaungkon