<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Majushree H</title>
    <description>The latest articles on DEV Community by Majushree H (@majushree_h_326e2c758aea1).</description>
    <link>https://dev.to/majushree_h_326e2c758aea1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2833464%2F373093fd-5c0d-498f-b908-70df9682d651.png</url>
      <title>DEV Community: Majushree H</title>
      <link>https://dev.to/majushree_h_326e2c758aea1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/majushree_h_326e2c758aea1"/>
    <language>en</language>
    <item>
      <title>How to Install Ollama with DeepSeek-r1 and Integrate it with Python on Windows</title>
      <dc:creator>Majushree H</dc:creator>
      <pubDate>Sat, 08 Feb 2025 09:25:19 +0000</pubDate>
      <link>https://dev.to/majushree_h_326e2c758aea1/how-to-install-ollama-with-deepseek-r1-and-integrate-it-with-python-on-windows-25i8</link>
      <guid>https://dev.to/majushree_h_326e2c758aea1/how-to-install-ollama-with-deepseek-r1-and-integrate-it-with-python-on-windows-25i8</guid>
      <description>&lt;h3&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;strong&gt;Sr. No.&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Topic&lt;/strong&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;Introduction&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;What is DeepSeek-r1?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;Prerequisites&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3.1&lt;/td&gt;
&lt;td&gt;Basic Setup&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3.2&lt;/td&gt;
&lt;td&gt;Command Line Setup&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3.3&lt;/td&gt;
&lt;td&gt;Verify Installation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;Test the Model&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;5&lt;/td&gt;
&lt;td&gt;Python Integration Setup&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;6&lt;/td&gt;
&lt;td&gt;Launching Python Editors from Command Prompt&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;7&lt;/td&gt;
&lt;td&gt;Python Implementation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;8&lt;/td&gt;
&lt;td&gt;Running the Code in a Virtual Environment&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;9&lt;/td&gt;
&lt;td&gt;Conclusion&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In the world of AI, open conversational models like &lt;strong&gt;DeepSeek-r1&lt;/strong&gt;, which you can run locally with Ollama, are transforming natural language processing. This guide walks you through installing &lt;strong&gt;Ollama with DeepSeek-r1&lt;/strong&gt; on a Windows machine and integrating it with Python. Whether you're building intelligent applications or exploring advanced AI, this tutorial will help you set up DeepSeek-r1 and add powerful conversational capabilities to your projects. Let's get started!&lt;/p&gt;

&lt;h2&gt;
  
  
  What is DeepSeek-r1?
&lt;/h2&gt;

&lt;p&gt;DeepSeek-r1 is an advanced reasoning model developed by DeepSeek that you can download and run locally through Ollama. With strong reasoning and problem-solving capabilities, it is well suited to applications such as content generation, chatbots, and AI-driven customer support systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Features of DeepSeek-r1&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Optimized for NLP&lt;/strong&gt;: DeepSeek-r1 is tailored for chat-based AI tasks, offering seamless natural language understanding.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Faster Inference&lt;/strong&gt;: Optimized for real-time responses, this model is perfect for chatbots and virtual assistants.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Higher Accuracy&lt;/strong&gt;: DeepSeek-r1 delivers refined performance in text generation, making it suitable for human-like conversational applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specialized AI Model&lt;/strong&gt;: Unlike other AI tools, DeepSeek-r1 is designed specifically for language-related tasks.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Use Cases&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Chatbots and Virtual Assistants&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Content Generation&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Question-Answer Systems&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Customer Support&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Follow these steps to install Ollama with DeepSeek-r1 on your Windows machine and get it running with Python.&lt;/p&gt;




&lt;h3&gt;
  
  
  Basic Setup
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Install Ollama&lt;/strong&gt;: Visit the official &lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Ollama website&lt;/a&gt;, then download and run the Windows installer.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Choose a DeepSeek-r1 variant&lt;/strong&gt;: Browse the DeepSeek-r1 model page on the Ollama website and pick a parameter size that fits your hardware; you'll pull it from the command line in the next step.&lt;/li&gt;
&lt;/ol&gt;




&lt;h3&gt;
  
  
  Command Line Setup
&lt;/h3&gt;

&lt;p&gt;Open your command line interface (Command Prompt or PowerShell) and run the following command to pull the DeepSeek-r1 model:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama pull deepseek-r1 &lt;span class="nt"&gt;--version&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;
  
  
  Verify Installation
&lt;/h3&gt;

&lt;p&gt;Run the following command to confirm that the installation was successful:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama list

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If everything is set up correctly, the list of installed models, including DeepSeek-r1, will be displayed.&lt;/p&gt;




&lt;h2&gt;
  
  
  Test the Model
&lt;/h2&gt;

&lt;p&gt;To test if DeepSeek-r1 is working as expected, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama run deepseek-r1

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can interact with the model by asking questions like "How are you?". To exit the session, type &lt;code&gt;/bye&lt;/code&gt; or press &lt;strong&gt;Ctrl+D&lt;/strong&gt;.&lt;/p&gt;
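&lt;p&gt;As a quick non-interactive check, &lt;code&gt;ollama run&lt;/code&gt; also accepts a one-shot prompt on the command line; the model answers once and the command exits (the prompt text here is just an example):&lt;/p&gt;

```shell
ollama run deepseek-r1 "Summarize what a large language model is in one sentence."
```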




&lt;h2&gt;
  
  
  Python Integration Setup
&lt;/h2&gt;

&lt;p&gt;Now let's set up Python to interact with Ollama.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Create a directory for your project:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd&lt;/span&gt;/
&lt;span class="nb"&gt;mkdir &lt;/span&gt;testDeep
&lt;span class="nb"&gt;cd &lt;/span&gt;testDeep

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Verify your Python version:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;--version&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Create a virtual environment:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;-m&lt;/span&gt; venv env1

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Activate the virtual environment (in PowerShell, run &lt;code&gt;env1\Scripts\Activate.ps1&lt;/code&gt; instead):&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;env1&lt;span class="se"&gt;\S&lt;/span&gt;cripts&lt;span class="se"&gt;\a&lt;/span&gt;ctivate.bat

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Install the Ollama package:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;ollama

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Optionally, open the project folder in your editor; for VS Code:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;code .//for VS code editor

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;
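&lt;p&gt;Before writing any code, you can confirm the package installed into the active virtual environment with a quick import check (this only verifies the install; it does not contact the Ollama server):&lt;/p&gt;

```shell
python -c "import ollama; print('ollama package installed')"
```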




&lt;h2&gt;
  
  
  Launching Python Editors from Command Prompt
&lt;/h2&gt;

&lt;p&gt;To open Python editors directly from the command line:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;For IDLE&lt;/strong&gt;: Type &lt;code&gt;idle&lt;/code&gt; or &lt;code&gt;python -m idlelib&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For PyCharm&lt;/strong&gt;: Type &lt;code&gt;pycharm&lt;/code&gt; (if it's on your system's PATH)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For Jupyter Notebook&lt;/strong&gt;: Type &lt;code&gt;jupyter notebook&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For Spyder&lt;/strong&gt;: Type &lt;code&gt;spyder&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Python Implementation
&lt;/h2&gt;

&lt;p&gt;Here's a simple Python script to interact with DeepSeek-r1:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ollama&lt;/span&gt;

&lt;span class="c1"&gt;# Initialize conversation with the model
&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ollama&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;deepseek-r1&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[{&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;role&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Hello, who are you?&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
    &lt;span class="p"&gt;}])&lt;/span&gt;

&lt;span class="c1"&gt;# Print the response
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;message&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Continue conversation
&lt;/span&gt;&lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;user_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;You: &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;user_input&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lower&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;exit&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;break&lt;/span&gt;

    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ollama&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;deepseek-r1&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;role&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;user_input&lt;/span&gt;
        &lt;span class="p"&gt;}])&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Assistant:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;message&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
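&lt;p&gt;Note that the loop above sends each prompt on its own, so the model has no memory of earlier turns. One way to keep context is to resend the accumulated message history on every call, as in this sketch (the helper names &lt;code&gt;with_message&lt;/code&gt; and &lt;code&gt;chat_with_memory&lt;/code&gt; are ours, not part of the ollama package):&lt;/p&gt;

```python
def with_message(history, role, content):
    # Return a new list with one more chat message appended,
    # in the {'role': ..., 'content': ...} shape that ollama.chat expects.
    return history + [{'role': role, 'content': content}]

def chat_with_memory(model='deepseek-r1'):
    import ollama  # imported here so the helper above works without the package
    history = []
    while True:
        user_input = input("You: ")
        if user_input.lower() == 'exit':
            break
        history = with_message(history, 'user', user_input)
        # Resend the whole history so the model can refer back to earlier turns.
        response = ollama.chat(model=model, messages=history)
        reply = response['message']['content']
        history = with_message(history, 'assistant', reply)
        print("Assistant:", reply)
```

&lt;p&gt;Call &lt;code&gt;chat_with_memory()&lt;/code&gt; to start the session; each turn now carries the full conversation, at the cost of a growing prompt.&lt;/p&gt;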






&lt;h2&gt;
  
  
  Running the Code in a Virtual Environment
&lt;/h2&gt;

&lt;p&gt;To execute your code in the virtual environment:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;strong&gt;Visual Studio Code&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Press &lt;strong&gt;Ctrl+Shift+P&lt;/strong&gt; and select &lt;strong&gt;Python: Select Interpreter&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Choose the &lt;strong&gt;env1&lt;/strong&gt; virtual environment to run your code.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Run&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can monitor your GPU performance using Task Manager.&lt;/p&gt;
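&lt;p&gt;Besides Task Manager, recent Ollama releases include an &lt;code&gt;ollama ps&lt;/code&gt; command that lists the currently loaded models and whether they are running on CPU or GPU (availability depends on your Ollama version):&lt;/p&gt;

```shell
ollama ps
```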




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;You've successfully installed Ollama with DeepSeek-r1 on your Windows machine and integrated it with Python. Whether you're working on an AI-powered project or exploring conversational AI, this setup gives you a solid foundation for building intelligent applications.&lt;/p&gt;




</description>
      <category>ai</category>
      <category>tutorial</category>
      <category>python</category>
      <category>deepseek</category>
    </item>
  </channel>
</rss>
