<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: re-ten</title>
    <description>The latest articles on DEV Community by re-ten (@basstimam).</description>
    <link>https://dev.to/basstimam</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1026654%2F603febdd-e96a-4e65-bb9e-94d52e97ed37.jpeg</url>
      <title>DEV Community: re-ten</title>
      <link>https://dev.to/basstimam</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/basstimam"/>
    <language>en</language>
    <item>
      <title>DeepSeek-R1 on Cursor with Ollama</title>
      <dc:creator>re-ten</dc:creator>
      <pubDate>Thu, 30 Jan 2025 03:53:29 +0000</pubDate>
      <link>https://dev.to/basstimam/deepseek-r1-on-cursor-with-ollama-1pph</link>
      <guid>https://dev.to/basstimam/deepseek-r1-on-cursor-with-ollama-1pph</guid>
      <description>&lt;p&gt;So guys, there are many options using local llm but the DeepSeek-R1 is drop a weeks ago. If you want use the ollama/local llm in cursor i got u.&lt;/p&gt;

&lt;p&gt;First you need &lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Ollama&lt;/a&gt;. After installing it, you need to enable CORS for Ollama; this is required, or Cursor will return a &lt;strong&gt;403 Forbidden&lt;/strong&gt;. As you can see, we need to define &lt;code&gt;OLLAMA_ORIGINS&lt;/code&gt; in the Windows environment variables.&lt;/p&gt;
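One way to set this variable is from the command line rather than the System Properties dialog. This is a minimal sketch; the `*` value allows all origins, which is the simplest option for local use (you can restrict it to a specific origin instead):

```shell
# Windows (cmd or PowerShell): persist OLLAMA_ORIGINS for new processes.
setx OLLAMA_ORIGINS "*"

# macOS/Linux equivalent for the current shell session:
export OLLAMA_ORIGINS="*"
```

Remember that `setx` only affects newly started processes, which is another reason Ollama has to be restarted afterwards.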

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6fsqokpdx7d195yfsbn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6fsqokpdx7d195yfsbn.png" alt="Image description" width="570" height="51"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Ok, next we need the deepseek-r1 model. I tried &lt;code&gt;deepseek-r1:8b&lt;/code&gt; because it benchmarks well, and it runs on my PC with an Nvidia RTX 3070 8GB (enough VRAM; I get 60-70 t/s). We can use:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ollama run deepseek-r1:8b&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The model then starts downloading. Once that's complete, quit Ollama via the Windows tray icon (or however you prefer); we need to restart Ollama so it picks up the CORS setting we defined.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6wt6nobb94bnrq6wnb8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6wt6nobb94bnrq6wnb8.png" alt="Image description" width="281" height="157"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then you can start Ollama again from the Start menu.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1f37f5br2dt8bkfsh3y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1f37f5br2dt8bkfsh3y.png" alt="Image description" width="759" height="724"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, Ollama serves its endpoint at &lt;code&gt;http://127.0.0.1:11434&lt;/code&gt;, but if you point Cursor directly at that endpoint, it can't be used. So we need &lt;a href="https://ngrok.com/" rel="noopener noreferrer"&gt;ngrok&lt;/a&gt;. Download it and sign in; ngrok will then instruct you to authenticate with your auth token.&lt;/p&gt;
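The authentication step ngrok walks you through looks roughly like this; `&lt;YOUR_AUTHTOKEN&gt;` is a placeholder for the token shown on your ngrok dashboard:

```shell
# One-time setup: store your ngrok auth token in the local ngrok config.
.\ngrok.exe config add-authtoken <YOUR_AUTHTOKEN>
```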

&lt;p&gt;Next, we use ngrok to expose a public URL for Ollama:&lt;br&gt;
&lt;code&gt;.\ngrok.exe http 11434 --host-header="localhost:11434"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Like this:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuibhlsfaauyyx0kciy7p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuibhlsfaauyyx0kciy7p.png" alt="Image description" width="601" height="101"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we get the public URL to use as the OpenAI endpoint.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F92fbfeeby9pkmmpprwne.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F92fbfeeby9pkmmpprwne.png" alt="Image description" width="800" height="220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can check whether your endpoint is active.&lt;/p&gt;
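A quick sanity check from the command line (substitute your own ngrok URL for the placeholder): Ollama's root endpoint replies with "Ollama is running" when the tunnel is working, and `/api/tags` lists your pulled models.

```shell
# Hit the tunnel root; expect the text "Ollama is running".
curl https://xxxxxx.ngrok-free.app/

# List the models available through the tunnel (JSON response).
curl https://xxxxxx.ngrok-free.app/api/tags
```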

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4x4x707ckd16lj60187v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4x4x707ckd16lj60187v.png" alt="Image description" width="527" height="86"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Ok, now we move to Cursor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faaij7od3uem7evumefik.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faaij7od3uem7evumefik.png" alt="Image description" width="800" height="485"&gt;&lt;/a&gt;&lt;br&gt;
We need to define which model to use in Cursor; you can check the models you have with &lt;code&gt;ollama list&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj4jcc4o30s0yxkals848.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj4jcc4o30s0yxkals848.png" alt="Image description" width="800" height="261"&gt;&lt;/a&gt;&lt;br&gt;
In the OpenAI API Key settings, use your public URL &lt;code&gt;https://xxxxxx.ngrok-free.app&lt;/code&gt; as the base URL, with &lt;code&gt;ollama&lt;/code&gt; as the API key, and you're done.&lt;/p&gt;

&lt;p&gt;Once these steps are done, we can try the model in Cursor chat.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3891uq4anz3a1hl7m7ze.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3891uq4anz3a1hl7m7ze.png" alt="Image description" width="800" height="426"&gt;&lt;/a&gt;&lt;br&gt;
As you can see, the local LLM works properly. In some cases it isn't supported, though: Compose only allows Anthropic and GPT models, because of how Cursor restricts it.&lt;/p&gt;

</description>
      <category>cursor</category>
      <category>ai</category>
      <category>deepseek</category>
      <category>llm</category>
    </item>
  </channel>
</rss>
