Local Lock Down Lobe Chat Setup

Here is a guide on how to set up a more locked-down lobe-chat to use locally with Ollama.

Lobe-chat: https://github.com/lobehub/lobe-chat

Ollama: https://github.com/ollama/ollama

Podman command for running lobe-chat locally with Ollama

Note: This uses the podman CLI, version 4.9.3

read -sp "Lobe-chat Access Code >>" ACCESS_CODE && echo -n "$ACCESS_CODE" | podman secret create access_code - && podman run -d --name lobe-chat \
    --secret access_code,type=env,target=ACCESS_CODE \
    --network slirp4netns:allow_host_loopback=true,outbound_addr=127.0.0.1 \
    --add-host=host.containers.internal:host-gateway -p 127.0.0.1:3210:3210 \
    -e OLLAMA_PROXY_URL=http://host.containers.internal:11434 \
    -e FEATURE_FLAGS="-market,-plugins,-check_updates,-openai_api_key,-openai_proxy_url" \
    --restart=always \
    docker.io/lobehub/lobe-chat; podman secret rm access_code

Edit: the command now sets the lobe-chat access code from the terminal using a podman secret.
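
If you need to tweak any of the flags and run the command again, remove the old container first so the --name doesn't clash (a small sketch; the access code secret is recreated by the main command anyway):

# stop and remove the existing container before re-running the setup command
podman stop lobe-chat
podman rm lobe-chat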

What does this command do:

  • --secret: Injects the previously created access_code secret into the container as the ACCESS_CODE environment variable
  • -d: Detach; run the container in the background
  • --name: Sets the container name so Podman doesn't generate a random one
  • --network: Network mode the container will use. slirp4netns:allow_host_loopback=true lets the container loop back and talk directly to the host (if you are on Podman 5.0+ you should use pasta instead), and outbound_addr=127.0.0.1 forces the outgoing address to be 127.0.0.1. Doc: https://docs.podman.io/en/v5.0.1/markdown/podman-run.1.html#network-mode-net
  • --add-host: Adds an entry to the container's /etc/hosts file mapping host.containers.internal to Podman's host-gateway, which is what Podman uses to talk to your host
  • -p 127.0.0.1:3210:3210: Basic host-to-container port mapping, bound to 127.0.0.1 so that only the host can reach the container
  • --restart=always: Restarts the container whenever it exits
  • -e: Environment variables for the container: OLLAMA_PROXY_URL points lobe-chat at the Ollama instance running on the host, and FEATURE_FLAGS turns off the marketplace, plugins, update checks, and the OpenAI API key/proxy URL options in the UI
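
To double-check that these flags actually took effect on the running container, you can inspect it from the host. This is only a minimal sanity-check sketch; the --format templates may vary slightly between Podman versions:

# is the container up, and is the port bound to loopback only?
podman ps --filter name=lobe-chat

# confirm the environment variables and network mode that were passed in
podman inspect lobe-chat --format '{{.Config.Env}}'
podman inspect lobe-chat --format '{{.HostConfig.NetworkMode}}'

# the UI should answer on the host loopback
curl -I http://127.0.0.1:3210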

Testing container network access:

To confirm that we have set up the container in an isolated environment that only sends requests to our local services, such as Ollama and our host port 3210, we need to test whether the container can reach the outside internet.

The image that lobe-chat ships with is very minimal, so I'm using nc (netcat) to test network requests.

nc -v 8.8.8.8

This will try to open a connection to the IP address 8.8.8.8 and print verbose messages about the attempt.
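
Since the test has to run from inside the container, you can use podman exec from the host. A small sketch; depending on the nc variant shipped in the image you may need to give an explicit port (e.g. 53 or 443):

# run the connectivity check inside the lobe-chat container; this should fail or time out
podman exec -it lobe-chat nc -v 8.8.8.8 443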

nc -z -v 8.8.8.8 1-1000

This does pretty much the same thing, but scans ports 1 through 1000 instead of making a single connection attempt.
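
While outbound internet access should fail, the container should still be able to reach Ollama on the host through the host.containers.internal mapping. A quick positive check, assuming Ollama is listening on its default port 11434:

# this one should succeed, since host.containers.internal maps to the host gateway
podman exec -it lobe-chat nc -z -v host.containers.internal 11434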

Here is an example of me running the command using Podman Desktop:

(Screenshot: podman command testing in Podman Desktop)

Locking down the front-end

Now that we have finished locking down the container itself, we need to lock down the front-end rendering that happens when lobe-chat serves us the UI content.

This is a basic setup I made using the browser dev tools to block requests from the front-end:

(Screenshot: dev tools blocking requests)

Of course, this does not fully lock anything down; it only blocks a few known requests that I found while scanning through the network tab.

To fully lock down rendering on the front-end, meaning the only requests made by the page go to the container and nothing leaves the tab, we need to do this at the browser level: either use a browser extension that blocks requests for the specific tab or URL you run lobe-chat on, or edit the network settings of your browser.

You can edit the proxy settings in your browser so that every request defaults to a local port that simply fails, and whitelist lobe-chat's port. For example, point the HTTP proxy at an unused local port (so all proxied requests are refused) and add 127.0.0.1:3210 to the no-proxy list.
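
To get a feel for what the browser will see before changing any settings, you can emulate the same idea from the shell with curl. A rough sketch; 127.0.0.1:9 is just an example of an unused local port that refuses connections:

# everything forced through the dead proxy fails...
curl -x http://127.0.0.1:9 https://example.com

# ...while the whitelisted lobe-chat address bypasses the proxy and still works
curl --noproxy 127.0.0.1 -x http://127.0.0.1:9 http://127.0.0.1:3210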

Here is an example of my Firefox settings:

(Screenshot: Firefox proxy settings)

Result

(Screenshot: result with the proxy settings applied)

Reference:

  • What is Podman? — Podman documentation: https://docs.podman.io/en/latest/?ref=ppppp.dev
  • Get started with LobeChat · LobeChat Docs · LobeHub: https://lobehub.com/docs/usage/start?ref=ppppp.dev
