Three months ago, I started working on a small idea that sat quietly inside my browser:
a tool to help developers write better Merge Request reviews without leaving the GitLab/GitHub web UI.
Today, I'm excited to share two big milestones:
- ThinkReview is now fully open source
- ThinkReview now supports Ollama (local LLMs)
If you've been following my earlier Dev.to posts, you know ThinkReview originally launched as a simple Chrome extension powered by cloud LLMs. It's grown a lot since then, and the community consistently asked for two things:
- Open source
- Local model support (Ollama)
Both are finally here.
New Feature: Ollama Support (Local LLM)
Up to version 1.3.10, ThinkReview ran exclusively on cloud models.
Starting with v1.4.0, you can now connect ThinkReview to Ollama and run ANY local LLM you want:
- Qwen Coder
- Llama 3
- Codestral
- DeepSeek
- Gemma
- Anything else Ollama supports
Why this matters:
- Your code never leaves your machine
- Zero API keys needed
- 100% free
- Great for companies with strict security/compliance
- Perfect for self-hosted GitLab users
How to enable Ollama
One-time setup:
1. Stop any existing Ollama processes.
2. Restart Ollama with flags that allow Chrome extension access.
3. Go to ThinkReview settings → Test Connection, and you should see the models you've already downloaded.
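The first two steps can be sketched as shell commands. This is a minimal sketch for macOS/Linux: `OLLAMA_ORIGINS` is Ollama's environment variable for its CORS allow-list, and the wildcard `chrome-extension://*` origin is an assumption — you may prefer to pin it to the extension's exact ID.

```shell
# Stop any running Ollama instance (ignore the error if none is running)
pkill ollama || true

# Allow requests coming from Chrome extensions.
# Assumption: a wildcard chrome-extension origin; pin it to the
# extension's exact ID for a tighter allow-list.
export OLLAMA_ORIGINS="chrome-extension://*"

# Restart the server with the new origin allow-list
ollama serve
```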
And here's what it looks like in action (supporting GitLab and Azure DevOps):
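For the curious, the Test Connection check boils down to querying Ollama's local REST API for the models you've already pulled. Here's a minimal Python sketch against the `/api/tags` endpoint; the default port and the response shape in the comment reflect Ollama's API, but the helper names are hypothetical, not ThinkReview's actual code.

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address


def parse_model_names(payload: dict) -> list[str]:
    # /api/tags responds with {"models": [{"name": "llama3:latest", ...}, ...]}
    return [m["name"] for m in payload.get("models", [])]


def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of models already downloaded in the local Ollama."""
    with urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

If `list_local_models()` returns a non-empty list, the extension can populate its model picker from it; an empty list usually just means you haven't run `ollama pull` yet.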
You can install the (open-source) extension directly from the Chrome Web Store:
https://chromewebstore.google.com/detail/thinkreview-ai-code-revie/bpgkhgbchmlmpjjpmlaiejhnnbkdjdjn
It also works on Edge and any other Chromium-based browser.