Prerequisites
• Cursor installed
• LM Studio installed
• ngrok installed
• A local machine capable of running an LLM
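Once LM Studio's local server is running (it exposes an OpenAI-compatible API, by default on `http://localhost:1234/v1`), it can help to sanity-check the endpoint before pointing ngrok and Cursor at it. A minimal sketch in Python; the URL and port are LM Studio's defaults and may differ on your machine:

```python
import json
import urllib.request
import urllib.error

def endpoint_reachable(url: str, timeout: float = 3.0) -> bool:
    """Return True if an OpenAI-compatible server answers at `url`."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.loads(resp.read().decode("utf-8"))
            # LM Studio lists loaded models under the "data" key,
            # mirroring the OpenAI /v1/models response shape.
            return "data" in data
    except (urllib.error.URLError, json.JSONDecodeError, OSError):
        return False

if __name__ == "__main__":
    # Default LM Studio local server address; adjust if you changed the port.
    print(endpoint_reachable("http://localhost:1234/v1/models"))
```

If this prints `False`, start the server from LM Studio's "Developer" tab (or via its CLI) before running ngrok.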
Hi @0xkoji,
Thanks for this great blog post.
I would be keen on having even more insights.
You show that you can reach the open-source model through Cursor but I'd be curious to have some insights on how it actually performs both in terms of latency and accuracy/actual coding performance.
I'm testing this, and it is slow as hell, plus the model seems unable to edit my files. It can only answer questions.
Did you try experimenting with actual coding tasks?
What level of hardware do you have?
What are your recommendations?
Looking forward to your answers.
This no longer works. The free Cursor plan does not allow changing the model. The moment you try to select your own custom model, that is, the moment you change anything from Auto, it gives the following error
Hi, I tried many times but Cursor shows a dialog:

And the server doesn't receive any requests. Is an active Pro subscription required?
Yup. Cursor saw the money sink and plugged it with a paywall. :) Unfortunately, custom models have been gated behind Pro subscriptions. Correct me if I am wrong or if things change.
I am using LM Studio to ascribe meta-data to a dataset that my client considers highly confidential. They don’t want the data passing over an API or using public servers (and thus non-private LLMs) out of privacy concerns. However, I would like to embed my LLM-code-ascription task into a data manipulation pipeline that occurs otherwise in Cursor.
@0xkoji — Can you comment on whether this use of ngrok preserves the level of privacy associated with using Cursor and LM Studio on the same machine but not connected?
Hi Mark, sorry for the slow response.
The reason I suggested using ngrok in my article is that Cursor does not currently allow users to communicate directly with local LLMs.
Given your client’s strict requirements, I believe using Cursor for this specific task will be very difficult. To answer your question: No, the combination of Cursor + ngrok + LM Studio does not preserve the same level of privacy as a purely offline environment.
While ngrok generates random URLs—making the chance of a third party guessing your endpoint extremely low—the risk is never zero. More importantly, because of how Cursor’s architecture works, your data may still pass through external infrastructure.
For your specific use case, I would recommend looking into open-source alternatives like Zed or other agentic coding tools such as OpenCode that allow for true local-to-local communication.
Personally, I have already switched to Zed as my primary editor.
Hi, wondering why you put ngrok in the middle? It is much easier and faster to use Claude --> Local LLM.
I'm afraid it is Cursor messing around: if you use your local host/IP for that, you will get this from Cursor when it calls the endpoint:

{"error":{"type":"client","reason":"ssrf_blocked","message":"connection to private IP is blocked","retryable":false}}

That is why we need ngrok, to have a public domain.
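The `ssrf_blocked` error above is Cursor's server-side filter refusing to call endpoints that resolve to private address space, which is exactly what `localhost` or a LAN IP is. A quick illustration of the distinction using Python's standard `ipaddress` module (the addresses are just examples):

```python
import ipaddress

# Addresses Cursor's SSRF filter would reject: loopback and RFC 1918
# space that Cursor's servers could not reach from outside anyway.
for addr in ["127.0.0.1", "192.168.1.10", "10.0.0.5"]:
    print(addr, ipaddress.ip_address(addr).is_private)   # all True

# A public address, like the one an ngrok hostname resolves to,
# passes the check.
print(ipaddress.ip_address("8.8.8.8").is_private)        # False
```

This is why tunneling through ngrok works: Cursor only ever sees a public hostname, and ngrok forwards the traffic down to your machine.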
Any reason you used glm-4.6v-flash? Is it good?
No specific reason; at that moment it performed well.