DEV Community


Use Cursor with Local LLM and LM Studio

0xkoji on January 18, 2026

Prerequisites • Cursor installed • LM Studio installed • ngrok installed • A local machine capable of running an LLM …
Vianney Mixtur

Hi @0xkoji

Thanks for this great blog post. I would be keen on having even more insights.

You show that you can reach the open-source model through Cursor but I'd be curious to have some insights on how it actually performs both in terms of latency and accuracy/actual coding performance.

I'm testing this, and it is slow as hell. Also, the model seems unable to edit my files; it can only answer questions.

Did you try experimenting with actual coding tasks?
What level of hardware do you have?
What are your recommendations?

Looking forward to your answers.

Gaurav Kumar

This no longer works. The free Cursor plan does not allow changing the model. The moment you try to select your own custom model (that is, change anything from Auto), it shows the following error:

Free plans can only use Auto. Switch to Auto or upgrade plans to continue.

lball

Hi, I tried many times, but Cursor shows a dialog:

And the server doesn't receive any request. Is an active Pro subscription required?

Johannes Jamroszczyk • Edited

Yup. Cursor saw the money sink and plugged it with a paywall. :) Unfortunately, custom models have been gated behind Pro subscriptions. Correct me if I am wrong or if things change.

Mark Macdonald

I am using LM Studio to ascribe metadata to a dataset that my client considers highly confidential. They don't want the data passing over an API or through public servers (and thus non-private LLMs) out of privacy concerns. However, I would like to embed my LLM ascription task into a data-manipulation pipeline that otherwise runs in Cursor.

@0xkoji — Can you comment on whether this use of ngrok preserves the level of privacy associated with using Cursor and LM Studio on the same machine but not connected?

0xkoji

Hi Mark, sorry for the slow response.

The reason I suggested using ngrok in my article is that Cursor does not currently allow users to communicate directly with local LLMs.

Given your client’s strict requirements, I believe using Cursor for this specific task will be very difficult. To answer your question: No, the combination of Cursor + ngrok + LM Studio does not preserve the same level of privacy as a purely offline environment.

While ngrok generates random URLs, making the chance of a third party guessing your endpoint extremely low, the risk is never zero. More importantly, because of how Cursor's architecture works, your data may still pass through external infrastructure.

For your specific use case, I would recommend looking into open-source alternatives like Zed or other agentic coding tools such as OpenCode that allow for true local-to-local communication.

Personally, I have already switched to Zed as my primary editor.

KamalMostafa

Hi, I'm wondering why you put ngrok in the middle? It seems much easier and faster to use Claude --> Local LLM directly.

Angel R. Molina

I'm afraid it is Cursor interfering: if you use your local host/IP for that, you will get this from Cursor when it calls the endpoint:

{"error":{"type":"client","reason":"ssrf_blocked","message":"connection to private IP is blocked","retryable":false}}
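Cursor's actual server-side check isn't public, but the `ssrf_blocked` error suggests it rejects private, loopback, and link-local addresses before making the request. A minimal Python sketch of that kind of SSRF guard (the function name and logic here are illustrative assumptions, not Cursor's real code):

```python
import ipaddress

def looks_ssrf_blocked(host: str) -> bool:
    """Rough illustration of a private-IP SSRF guard: reject loopback,
    private, and link-local addresses. NOT Cursor's actual implementation."""
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        # Hostnames (e.g. an ngrok domain) pass this naive check;
        # a real guard would resolve them first and re-check the IP.
        return False
    return addr.is_loopback or addr.is_private or addr.is_link_local

print(looks_ssrf_blocked("127.0.0.1"))               # True: localhost is blocked
print(looks_ssrf_blocked("192.168.1.10"))            # True: LAN IPs are blocked
print(looks_ssrf_blocked("example.ngrok-free.app"))  # False: public hostname passes
```

This is consistent with why pointing Cursor at `http://localhost:1234` fails while an ngrok URL works: the tunnel gives the local server a publicly routable hostname.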

marcosfreitas

That is why we need ngrok: to have a public domain.

nareshipme

Any reason you used glm-4.6v-flash? Is it good?

0xkoji

No specific reason; at that moment, it was good.