0xkoji

Use Local LLM with Cursor and Ollama

Requirements

  • Cursor is installed on your machine
  • Ollama is installed on your machine, and you have a model
  • ngrok is installed on your machine, and you have an ngrok account
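You can confirm the CLI prerequisites from a terminal before starting. This is a minimal sketch for macOS/Linux; Cursor itself is a GUI app, so only the two command-line tools are checked here.

```shell
# Check that each required CLI tool is on the PATH (macOS/Linux).
for tool in ollama ngrok; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found ($(command -v "$tool"))"
  else
    echo "$tool: NOT found -- install it before continuing"
  fi
done
```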

Step1. Install Cursor

Go to https://www.cursor.com/ and download Cursor, then install it on your machine.

Step2. Install Ollama

Go to https://ollama.com/ and download Ollama, then install it on your machine.

Step3. Create an ngrok account and install ngrok

Go to https://ngrok.com/ and download ngrok, then install it on your machine.
Then set up ngrok.
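"Set up ngrok" here means connecting the CLI to your account with your authtoken. A sketch, where `<YOUR_AUTHTOKEN>` is a placeholder for the token shown on your ngrok dashboard:

```shell
# Store your ngrok authtoken in the local ngrok config file.
# <YOUR_AUTHTOKEN> is a placeholder -- copy the real token from
# your ngrok dashboard (Getting Started > Your Authtoken).
ngrok config add-authtoken <YOUR_AUTHTOKEN>
```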

Step4. Download (pull) a model

In this article, we'll use the deepseek-r1 model:
https://ollama.com/library/deepseek-r1
Open the Terminal app and run:

# 7B-model
ollama pull deepseek-r1:latest
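Once the pull finishes, you can confirm the model is available locally. A sketch using the standard Ollama CLI commands:

```shell
# List locally available models; deepseek-r1:latest should appear.
ollama list

# Optional smoke test: send a single prompt from the command line.
ollama run deepseek-r1:latest "Say hello in one short sentence."
```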

Step5. Enable CORS and run ngrok

# macOS & Linux
export OLLAMA_ORIGINS="*"

# Windows (Command Prompt) -- no quotes, or they become part of the value
set OLLAMA_ORIGINS=*

# Windows (PowerShell)
$env:OLLAMA_ORIGINS="*"

ngrok http 11434 --host-header="localhost:11434"
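Before wiring Cursor up, it's worth checking that the Ollama server answers both locally and through the tunnel. A sketch, where `https://ngrok_something` is a placeholder for the forwarding URL printed by ngrok:

```shell
# Local check: Ollama's native API should list your models.
curl http://localhost:11434/api/tags

# Tunnel check: the same endpoint through ngrok. Replace the
# placeholder host with the forwarding URL from the ngrok output.
curl https://ngrok_something/api/tags
```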

Step6. Set the OpenAI API Key in Cursor

Open Cursor Settings and go to the Models section:

  1. Enter the model name you pulled (in this case, deepseek-r1:latest) and click Add model
  2. Enter Ollama in the OpenAI API Key field (Ollama ignores the key, but Cursor requires a value)
  3. In the Base URL field, enter the URL from the ngrok command plus /v1 — it looks like https://ngrok_something/v1
  4. Click Save

Step7. Verify Ollama config

We are almost there.
Before clicking the Verify button, unselect all non-local models, so that in this case deepseek-r1:latest is the only selected model.
Then click the Verify button.
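Cursor's Verify step essentially makes an OpenAI-style request against the base URL, and you can reproduce that by hand using Ollama's OpenAI-compatible /v1/chat/completions endpoint. A sketch, where `https://ngrok_something` is a placeholder for your ngrok URL:

```shell
# The same kind of request Cursor sends: an OpenAI-style chat completion.
# The Authorization value is arbitrary -- Ollama ignores it, but
# OpenAI-style clients expect the header to be present.
curl https://ngrok_something/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer Ollama" \
  -d '{
        "model": "deepseek-r1:latest",
        "messages": [{"role": "user", "content": "ping"}]
      }'
```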

Step8. Use a local model

This is the final step. Open Cursor's Chat (Ctrl/Cmd + L), make sure the model you added in Step6 is selected, and send a prompt.

Top comments (3)

Dave

Which Cursor subscription do you use?

On Pro, all I get is the error message "The model [WHATEVER_USED] does not work with your current plan or api key."

My model responds fine in the terminal for direct and ngrok/cloud-tunnel testing, but not in Cursor.

I think Cursor does not like Pro subscribers switching to a free model. It also complains a lot about exceeding limits if I add an Anthropic or OpenAI API key to use it directly. It seems they have found a money machine and have ensured users keep paying through them.

0xkoji

I'm on the Teams plan.

Zak Bacon

Any solution for this? I'm on the Cursor Pro Plus plan. Same error.