<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: prashant rana</title>
    <description>The latest articles on DEV Community by prashant rana (@sahasrara62).</description>
    <link>https://dev.to/sahasrara62</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F299050%2Fe9ed9079-a2f7-48c0-add5-d55fbaba562e.jpeg</url>
      <title>DEV Community: prashant rana</title>
      <link>https://dev.to/sahasrara62</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sahasrara62"/>
    <language>en</language>
    <item>
      <title>Run deepseek locally with webUI interface</title>
      <dc:creator>prashant rana</dc:creator>
      <pubDate>Tue, 28 Jan 2025 21:06:42 +0000</pubDate>
      <link>https://dev.to/sahasrara62/run-deepseek-locally-with-webui-interface-4n65</link>
      <guid>https://dev.to/sahasrara62/run-deepseek-locally-with-webui-interface-4n65</guid>
      <description>&lt;p&gt;deepseek is awesome, adding a way to run deepseek models locally with webUI&lt;/p&gt;

&lt;p&gt;here is &lt;code&gt;docker-compose.yaml&lt;/code&gt; file content&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434" # Ollama API port
    volumes:
      - ollama_models:/root/.ollama # persist downloaded models
    networks:
      - ollama-net

  open-web-ui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080" # web interface
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434 # reach ollama over the compose network
    depends_on:
      - ollama
    networks:
      - ollama-net
    volumes:
      - open-webui:/app/backend/data # persist Open WebUI data

volumes:
  ollama_models:
  open-webui:

networks:
  ollama-net:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In your project folder, add this &lt;code&gt;docker-compose.yaml&lt;/code&gt; file.&lt;br&gt;
Then run &lt;code&gt;docker-compose up -d&lt;/code&gt; to pull all required images and start the containers.&lt;/p&gt;
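
&lt;p&gt;Once the command finishes, a quick sanity check (optional) is to list the running services and tail the Ollama logs:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
# show the status of both services
docker-compose ps

# follow the ollama service logs to confirm it started cleanly
docker-compose logs -f ollama
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;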

&lt;p&gt;Then, to install a specific model locally, run:&lt;br&gt;
&lt;br&gt;
 &lt;code&gt;docker-compose exec ollama ollama pull deepseek-coder:6.7b&lt;/code&gt;&lt;br&gt;
&lt;br&gt;
 Here I am installing the deepseek-coder:6.7b model. You can pick any model from &lt;a href="https://ollama.com/library" rel="noopener noreferrer"&gt;https://ollama.com/library&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Wait 10-15 seconds for the model to load, then&lt;br&gt;
go to &lt;code&gt;http://localhost:8080&lt;/code&gt; to start using it.&lt;/p&gt;
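
&lt;p&gt;If the UI does not show your model, you can verify the Ollama API directly from the host; this just lists the models the server has pulled:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
# list locally available models via the Ollama API
curl http://localhost:11434/api/tags
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;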

</description>
      <category>docker</category>
      <category>deepseek</category>
      <category>openwebui</category>
    </item>
  </channel>
</rss>
