Jay Elsheikh
πŸš€ ThinkReview Is Now Open Source β€” and Now Supports Ollama for Local AI Code Review

Three months ago, I started working on a small idea that sat quietly inside my browser:
a tool to help developers write better Merge Request reviews without leaving the GitLab/GitHub web UI.

Today, I’m excited to share two big milestones:

βœ… ThinkReview is now fully open source
βš™οΈ ThinkReview now supports Ollama (local LLMs)

If you’ve been following my earlier Dev.to posts, you know ThinkReview originally launched as a simple Chrome extension powered by cloud LLMs. It’s grown a lot since then β€” and the community asked for two things consistently:

  • Open Source
  • Local Model Support (Ollama)

Both are finally here.

⚑️ New Feature: Ollama Support (Local LLM)

(Screenshot: ThinkReview running a code review with Ollama)

Up to version 1.3.10, ThinkReview ran exclusively on cloud models.

Starting with v1.4.0, you can now connect ThinkReview to Ollama and run ANY local LLM you want:

  • Qwen Coder
  • Llama 3
  • Codestral
  • Deepseek
  • Gemma

…or anything else Ollama supports.
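Pulling one of these models is a one-liner per model. The tags below are examples from the Ollama library at the time of writing; run `ollama list` or check the library for the current names:

```shell
# Pull a couple of coding-capable models for ThinkReview to use
# (tag names come from the Ollama library and may change over time)
ollama pull qwen2.5-coder
ollama pull llama3

# Show every model already downloaded locally
ollama list
```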

Why this matters:

  • Your code never leaves your machine
  • Zero API keys needed
  • 100% free
  • Great for companies with strict security/compliance requirements
  • Perfect for self-hosted GitLab users

βš™οΈ How to enable Ollama

One-time setup:

(Screenshot: instructions for setting up Ollama with ThinkReview)

1. Stop any existing Ollama processes.
2. Restart Ollama with flags that allow Chrome extension access.
3. Go to ThinkReview settings → Test Connection, and you should see the models you've already downloaded.
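On Linux/macOS, the stop-and-restart steps above can be sketched roughly like this. The `OLLAMA_ORIGINS` value is my assumption about what grants Chrome-extension access; check ThinkReview's docs for the exact value it expects:

```shell
# 1. Stop any running Ollama instance
pkill ollama || true

# 2. Restart with browser-extension origins allowed.
#    OLLAMA_ORIGINS controls which origins may call Ollama's HTTP API;
#    allowing chrome-extension:// origins lets the extension connect
#    (assumed value -- verify against ThinkReview's setup screen).
OLLAMA_ORIGINS="chrome-extension://*" ollama serve &

# 3. Sanity-check from the terminal: this endpoint lists the models
#    already downloaded, which is what Test Connection queries.
curl http://localhost:11434/api/tags
```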

(Screenshot: choosing a model to work with ThinkReview)

And finally, here's what it looks like in action (supporting GitLab and Azure DevOps):

(Screenshot: a code review in the browser on a GitLab MR)

You can install the extension (open source) directly from the Chrome Web Store:
https://chromewebstore.google.com/detail/thinkreview-ai-code-revie/bpgkhgbchmlmpjjpmlaiejhnnbkdjdjn

It also works on Edge and any other Chromium-based browser.
