Securely Exposing LM Studio with Nginx Proxy + Auth + Manage loaded models

Most guides highlight Cloudflare tunnels or similar services for exposing local AI models. Tunnels are certainly convenient, but left open without authentication they hand your local server to anyone who finds the URL, which is why a VPN is often recommended on top. Alternatively, you can set up a direct, controlled proxy for LM Studio that adds its own authentication layer, giving you safe access without relying on a VPN.

This article shows how to set up an Nginx proxy with authentication so you can safely expose LM Studio to the internet and use it anywhere, including on your phone.


🚀 Project Setup

  1. Download LM Studio

    Grab the latest release from the LM Studio website and install it on your machine. Turn on the local server and change its port to 1250 (or use the lms CLI sketched after this list).

  2. Clone the repo and start the dev server:

git clone https://github.com/funktechno/lm_studio_manager.git
cd lm_studio_manager
npm install
cp .env.sample .env
npm run dev
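If you prefer the terminal, LM Studio also ships the lms CLI, which can start the server on the custom port (a minimal sketch; the --port flag assumes a recent LM Studio release):

# Start LM Studio's local server on port 1250 instead of the default 1234
lms server start --port 1250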

LM Studio Preview

Model Loading

  • use this tool to view, load, and unload models without opening the LM Studio desktop UI (a CLI alternative is sketched below)
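If you are working directly on the host machine, the lms CLI offers the same controls from a terminal (a sketch; the model key below is only an example and will differ per install):

# List models currently loaded into memory
lms ps

# Load a model by its key, and unload everything to free VRAM
lms load qwen/qwen2.5-coder-14b
lms unload --all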

🐳 Run Nginx Proxy in Docker

We’ll run Nginx in a container, mounting a template config and an .htpasswd file so the proxy can enforce Basic Auth on the management UI and a Bearer token on the API.

⚠️ Important: Remember to change my-secret-key and update your .htpasswd file with your own credentials before exposing this to the internet!
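For reference, here is a minimal sketch of what such a template could look like. It assumes LM Studio listens on the host at port 1250, the lm_studio_manager dev server listens on port 3000 (both reachable from the container as host.docker.internal), and that ${API_SECRET_KEY} is substituted by the nginx image's envsubst step at startup; the real nginx.conf.template ships with the repo.

server {
    listen 1234;

    # Basic Auth for the management UI (lm_studio_manager dev server; the port is an assumption)
    location /manage/ {
        auth_basic "LM Studio";
        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass http://host.docker.internal:3000/;
    }

    # Bearer token for the OpenAI-compatible LM Studio API
    location / {
        if ($http_authorization != "Bearer ${API_SECRET_KEY}") {
            return 401;
        }
        proxy_pass http://host.docker.internal:1250;
    }
}

On Linux, host.docker.internal only resolves if you add --add-host=host.docker.internal:host-gateway to the docker run command (or swap in your host's LAN IP).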

  • Linux
docker run -d \
  --name lmstudio-proxy \
  -p 1234:1234 \
  -e API_SECRET_KEY=my-secret-key \
  -v ./.htpasswd:/etc/nginx/.htpasswd:ro \
  -v ./nginx.conf.template:/etc/nginx/templates/nginx.conf.template:ro \
  nginx:latest
  • Windows (PowerShell)
docker run -d `
  --name lmstudio-proxy `
  -p 1234:1234 `
  -e API_SECRET_KEY=my-secret-key `
  -v ${PWD}\nginx.conf.template:/etc/nginx/templates/nginx.conf.template:ro `
  -v ${PWD}\.htpasswd:/etc/nginx/.htpasswd:ro `
  nginx:latest

🔑 Generating a .htpasswd File

You can generate the .htpasswd file with any online htpasswd generator.
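If you'd rather not paste credentials into a website, the htpasswd utility (from apache2-utils, or run from a throwaway container) produces the same file; a quick sketch assuming a user named lmstudio:

# Create .htpasswd with a bcrypt-hashed password (prompts for the password)
htpasswd -cB .htpasswd lmstudio

# Or without installing anything locally, via the httpd image
docker run --rm httpd:alpine htpasswd -nbB lmstudio 'choose-a-strong-password' > .htpasswd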

📱 Access Anywhere

Once the proxy is running, you can access LM Studio securely:

Web UI: https://yourdomain/manage/ (Basic Auth)

API: https://yourdomain/v1/... (Bearer token)

Because this setup uses HTTPS + Basic Auth/Bearer tokens, you can connect from any device — laptop, tablet, or phone — without needing a VPN.
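As a quick smoke test, assuming the proxy is reachable at https://yourdomain and API_SECRET_KEY is still my-secret-key, you can hit the OpenAI-compatible endpoints directly:

# List the models LM Studio exposes through the proxy (Bearer token enforced by nginx)
curl https://yourdomain/v1/models \
  -H "Authorization: Bearer my-secret-key"

# Chat completion against the currently loaded model
curl https://yourdomain/v1/chat/completions \
  -H "Authorization: Bearer my-secret-key" \
  -H "Content-Type: application/json" \
  -d '{"model": "ministral-8b-instruct-2410", "messages": [{"role": "user", "content": "Hello from my phone"}]}'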

⚙️ Sample Continue.dev VS Code Extension Config
Here’s a sample config.yaml for Continue.dev pointing to your proxy:

name: Local Config
version: 1.0.0
schema: v1
models:
  - name: Local LM Studio Server
    provider: lmstudio
    model: ministral-8b-instruct-2410
    # specify model loaded
    # codegemma-7b-it
    # qwen/qwen2.5-coder-14b - slow, but not super slow
    # deepseek/deepseek-r1-0528-qwen3-8b
    # deepseek-coder-33b-instruct
    # vertex-qwen-7b-ollama-coder-shadcn-10epoch-v1.3
    # ministral-8b-instruct-2410
    # nginx proxy auth
    apiKey: my-secret-key
    apiBase: "https://yourdomain/"
    roles:
      - chat
      - edit
      - apply
      - autocomplete
      - summarize

✅ Summary

Clone the repo → npm run dev

Run Docker → Nginx proxy with auth

Expose securely → Basic Auth for /manage/, Bearer token for API

Use anywhere → Works on any device, no VPN required

With this setup, you create a secure, self‑hosted gateway to LM Studio — no external dependencies and no VPN needed, just a proxy you fully control.
