DEV Community

prashant rana

Posted on 17

Run DeepSeek locally with a web UI

DeepSeek is awesome. Here is a way to run DeepSeek models locally with a web UI, using Ollama to serve the models and Open WebUI as the frontend.

Here is the docker-compose.yaml file content:


services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_models:/root/.ollama
    networks:
      - ollama-net

  open-web-ui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
    networks:
      - ollama-net
    volumes:
      - open-webui:/app/backend/data

volumes:
  ollama_models:
  open-webui:

networks:
  ollama-net:

Add this docker-compose.yaml file to your project folder, then run docker-compose up -d to pull all required images and start both containers.
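To check that both services came up correctly, you can inspect the stack and hit Ollama's API directly. This is a quick sketch assuming the default ports from the compose file above; Ollama's root endpoint simply reports that the server is alive.

```shell
# Show the status of both containers defined in docker-compose.yaml
docker-compose ps

# Ollama's API listens on port 11434; a healthy server replies
# with the plain-text message "Ollama is running"
curl http://localhost:11434
```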

Then, to install a specific model locally, run:

docker-compose exec ollama ollama pull deepseek-coder:6.7b

Here I am installing the deepseek-coder:6.7b model. You can pick any model from https://ollama.com/library.
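Once the pull finishes, you can confirm the model is available and give it a quick smoke test from the CLI, without going through the web UI. The prompt below is just an illustrative example:

```shell
# List all models installed inside the Ollama container;
# deepseek-coder:6.7b should appear in the output
docker-compose exec ollama ollama list

# Run a one-off prompt against the model directly from the terminal
docker-compose exec ollama ollama run deepseek-coder:6.7b "Write hello world in Python"
```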

Wait 10-15 seconds for Open WebUI to pick up the new model, then go to http://localhost:8080 to start using it.
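Besides the web UI, the same model is reachable over Ollama's REST API on the host, which is handy for scripting or wiring it into other tools. A minimal sketch using the /api/generate endpoint (the prompt is just an example; "stream": false returns one complete JSON response instead of a token stream):

```shell
# Send a single non-streaming generation request to the local Ollama server
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder:6.7b",
  "prompt": "Write a Python function that reverses a string",
  "stream": false
}'
```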


