<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: exnerdev</title>
    <description>The latest articles on DEV Community by exnerdev (@exnerdev).</description>
    <link>https://dev.to/exnerdev</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2803023%2F9a90dbd6-7f9d-481b-9825-fc3179c16420.png</url>
      <title>DEV Community: exnerdev</title>
      <link>https://dev.to/exnerdev</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/exnerdev"/>
    <language>en</language>
    <item>
      <title>Build your next AI Tech Startup with DeepSeek</title>
      <dc:creator>exnerdev</dc:creator>
      <pubDate>Mon, 03 Feb 2025 18:00:09 +0000</pubDate>
      <link>https://dev.to/exnerdev/build-your-next-ai-tech-startup-with-deepseek-180p</link>
      <guid>https://dev.to/exnerdev/build-your-next-ai-tech-startup-with-deepseek-180p</guid>
      <description>&lt;p&gt;Over the past 2 weeks, a new contender in the AI Revolution has taken over seamingly from nowhere, DeepSeek, with it's V3 and R1 models, two LLMS that rival OpenAI. With R1 costing just $5.6 million and 6 weeks of work, was created by none other than China! It's not only just as good, if not better than GPT-4o and o1, but free and open source, which allows us developers for the first time ever to run &lt;em&gt;actually&lt;/em&gt; powerful LLM's locally/offline. This blog post is a deep dive on what exactly is DeepSeek, why you should even care to begin with, how they were able to pull it off, and most importantly, how &lt;em&gt;you&lt;/em&gt; can take advantage of it to build your next million dollar tech startup.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is DeepSeek?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.deepseek.com/" rel="noopener noreferrer"&gt;DeepSeek&lt;/a&gt; is a Chinese AI company specializing in building open-source large language models founded just 18 months ago in July of 2023. DeepSeek R1 isn't their first LLM, but it's their first reasoning model, comparable to &lt;a href="https://openai.com/o1/" rel="noopener noreferrer"&gt;OpenAI's o1&lt;/a&gt; model.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn0p6e6iao6j30moqhzhy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn0p6e6iao6j30moqhzhy.png" alt="DeepSeek R1 vs OepnAI o1 vs DeepSeek R1 32B vs OpenAI o1 mini vs DeepSeek V3. R1 performs best on AIME 2024, MATH, and SWE-Bench Verified. R1 is equal to o1 in Codeforces. o1 beats R1 in GPQA Diamond and MMLU." width="800" height="470"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Benchmarks of reasoning models. &lt;a href="https://github.com/deepseek-ai/DeepSeek-R1/blob/main/DeepSeek_R1.pdf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;
&lt;/center&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgbo1yhm7zvdufazivo34.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgbo1yhm7zvdufazivo34.png" alt="DeepSeek V3 vs DeepSeek V2.5 vs Qwen 2.5 vs LLama 3.1 vs GPT-4o vs Claude 3.5. V3 crushes the competition in all aspects" width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Benchmarks of general-purpose models. &lt;a href="https://github.com/deepseek-ai/DeepSeek-V3/blob/main/DeepSeek_V3.pdf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;
&lt;/center&gt;

&lt;h2&gt;
  
  
  Why should I care?
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;It's just as good as, if not better than, OpenAI's models&lt;/li&gt;
&lt;li&gt;It's completely free to use &lt;a href="https://chat.deepseek.com/" rel="noopener noreferrer"&gt;on their official website&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://api-docs.deepseek.com/" rel="noopener noreferrer"&gt;DeepSeek's API&lt;/a&gt; is over 96% cheaper than OpenAI's (Calculated using &lt;a href="https://api-docs.deepseek.com/quick_start/pricing" rel="noopener noreferrer"&gt;R1's pricing&lt;/a&gt; input token cache miss, and comparing it to &lt;a href="https://openai.com/api/pricing/" rel="noopener noreferrer"&gt;o1's pricing&lt;/a&gt; input tokens parameter)&lt;/li&gt;
&lt;li&gt;It's 100% free and open source, released under the MIT License (&lt;a href="https://github.com/deepseek-ai/DeepSeek-R1?tab=MIT-1-ov-file" rel="noopener noreferrer"&gt;Source&lt;/a&gt;), so you can run it locally on your own computer (we will learn how later in this post). You can &lt;a href="https://github.com/deepseek-ai/DeepSeek-R1/" rel="noopener noreferrer"&gt;check it out on GitHub&lt;/a&gt;.&lt;/li&gt;
&lt;/ol&gt;
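
That 96% figure is easy to reproduce from the two price sheets. A quick sketch, using the per-million-token input prices published at the time of writing ($15.00 for o1, $0.55 for R1 on a cache miss; check both pricing pages for current values):

```javascript
// Compare input-token prices (USD per 1M tokens) at the time of writing.
const o1InputPrice = 15.0;  // OpenAI o1, input tokens
const r1InputPrice = 0.55;  // DeepSeek R1, input tokens on a cache miss

// Fractional savings of R1 relative to o1.
const savings = 1 - r1InputPrice / o1InputPrice;

console.log(`${(savings * 100).toFixed(1)}% cheaper`); // 96.3% cheaper
```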

&lt;h2&gt;
  
  
  How did they pull it off?
&lt;/h2&gt;

&lt;p&gt;They achieved this through a number of techniques; I've highlighted some of the most important ones below. Note that these explanations are highly, &lt;em&gt;highly&lt;/em&gt; over-simplified. If you want a more rigorous deep dive into how they work, check out the sources.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Heavy Low-Level Optimization
&lt;/h3&gt;

&lt;p&gt;Due to &lt;a href="https://www.bis.gov/press-release/commerce-strengthens-export-controls-restrict-chinas-capability-produce-advanced" rel="noopener noreferrer"&gt;US Government restrictions on selling high-level chips to China&lt;/a&gt;, DeepSeek, a Chinese company, didn't have access to the most powerful NVIDIA cards (Ex. NVIDIA H100s) to train their models on, which meant they had to figure out how to optimize the chips that they already had (NVIDIA H800s). To summarize, they used Mixture-of-Experts (MoE) and Multi-head Latent Attention (MLA) technology in order to maximize GPU performance. (&lt;a href="https://arxiv.org/abs/2405.04434" rel="noopener noreferrer"&gt;Source&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F89nuo4hj2pe7dkhsx0fz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F89nuo4hj2pe7dkhsx0fz.png" alt="Basic Architecture of DeepSeek V3" width="800" height="641"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Basic Architecture of DeepSeek V3. &lt;a href="https://github.com/deepseek-ai/DeepSeek-V3/blob/main/DeepSeek_V3.pdf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;
&lt;/center&gt;
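
To give a feel for the MoE idea (each token only activates a few "expert" sub-networks instead of the whole model), here is a toy routing sketch. This is my own illustration, not DeepSeek's code; the gate scores and expert count are made up:

```javascript
// Toy Mixture-of-Experts routing: each token activates only the top-k experts.
function topKExperts(gateScores, k) {
  return gateScores
    .map((score, expertId) => ({ expertId, score }))
    .sort((a, b) => b.score - a.score) // highest gate score first
    .slice(0, k)                       // keep only k experts
    .map((e) => e.expertId);
}

// 8 experts exist, but only 2 are activated for this token.
const scores = [0.1, 0.7, 0.05, 0.3, 0.9, 0.2, 0.15, 0.4];
console.log(topKExperts(scores, 2)); // logs [ 4, 1 ], the two highest-scoring experts
```

Because only the selected experts run their forward pass, most of the network stays idle for any given token, which is where the compute savings come from.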

&lt;h3&gt;
  
  
  2. Only Train what's Necessary
&lt;/h3&gt;

&lt;p&gt;Typically, training an AI model means updating the whole thing, even the parts that contribute nothing, which leads to a massive waste of resources. To solve this, DeepSeek introduced Auxiliary-Loss-Free Load Balancing, which adds a &lt;strong&gt;bias factor&lt;/strong&gt; to the expert routing so that no chip is overloaded while another sits under-utilized (&lt;a href="https://github.com/deepseek-ai/DeepSeek-V3/blob/main/DeepSeek_V3.pdf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;). As a result, only &lt;strong&gt;5%&lt;/strong&gt; of the model's parameters are updated per token, making V3 around &lt;strong&gt;91% cheaper&lt;/strong&gt; to train than GPT-4 (GPT-4 cost $63 million to train (&lt;a href="https://team-gpt.com/blog/how-much-did-it-cost-to-train-gpt-4/" rel="noopener noreferrer"&gt;Source&lt;/a&gt;), while V3 cost $5.576 million (&lt;a href="https://github.com/deepseek-ai/DeepSeek-V3/blob/main/DeepSeek_V3.pdf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;)).&lt;/p&gt;
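
The bias-factor idea can be sketched as a tiny feedback loop: each expert's routing score carries a bias that is nudged down when the expert is overloaded and up when it is under-used. A toy illustration (not the paper's actual update rule; the step size here is arbitrary):

```javascript
// Toy auxiliary-loss-free balancing: nudge per-expert biases based on load.
function updateBiases(biases, loads, targetLoad, step = 0.01) {
  // Overloaded experts get their bias lowered; under-used ones get it raised,
  // so future tokens are steered toward the idle experts.
  return biases.map((b, i) => (loads[i] > targetLoad ? b - step : b + step));
}

let biases = [0, 0, 0, 0];
const loads = [120, 80, 100, 100]; // tokens routed to each expert this step
biases = updateBiases(biases, loads, 100);
console.log(biases); // logs [ -0.01, 0.01, 0.01, 0.01 ]
```

The key property is that balancing happens through these biases alone, with no extra "auxiliary loss" term fighting against the main training objective.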

&lt;h3&gt;
  
  
  3. Compression
&lt;/h3&gt;

&lt;p&gt;Under the hood, attention relies on huge numbers of key-value pairs, and storing all of them in full would consume enormous amounts of memory. To fix this, DeepSeek uses Low-Rank Key-Value (KV) Joint Compression: the pairs are compressed with a down-projection matrix, and only this compressed version is stored. When the data is needed, an up-projection approximately recovers the original values, shrinking the cache, speeding up processing, and reducing memory usage. (&lt;a href="https://arxiv.org/pdf/2405.04434" rel="noopener noreferrer"&gt;Source&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl5otji6fqwdwwasc8beh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl5otji6fqwdwwasc8beh.png" alt="Different compression methods, with the main one, MLA powered by LRKVJ Compression." width="780" height="209"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Different compression methods, with the main one, MLA powered by LRKVJ Compression. &lt;a href="https://arxiv.org/pdf/2405.04434" rel="noopener noreferrer"&gt;Source&lt;/a&gt;
&lt;/center&gt;
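
The down-projection idea can be shown with plain matrices: a long key/value vector is multiplied by a down-projection matrix, only the short result is cached, and an up-projection approximately restores it on read. A toy sketch (real MLA learns these matrices during training; the ones here are hand-picked for illustration):

```javascript
// Toy low-rank compression: cache a down-projected vector, up-project on use.
function matVec(matrix, vec) {
  return matrix.map((row) => row.reduce((sum, w, j) => sum + w * vec[j], 0));
}

// Down-project from 4 dims to 2 (this is what gets cached),
// then up-project back to 4 dims when the value is needed.
const down = [[1, 0, 0, 0], [0, 1, 0, 0]];     // 2x4 down-projection
const up   = [[1, 0], [0, 1], [0, 0], [0, 0]]; // 4x2 up-projection

const keyVec = [0.5, -1.2, 0.0, 0.0];
const cached = matVec(down, keyVec);  // half the storage of the original
const restored = matVec(up, cached);  // approximate reconstruction
console.log(cached, restored);        // logs [ 0.5, -1.2 ] [ 0.5, -1.2, 0, 0 ]
```

The reconstruction is only exact here because the toy vector lives entirely in the kept dimensions; in the real model the projections are trained so the approximation loses as little as possible.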

&lt;h3&gt;
  
  
  4. Reinforcement Learning
&lt;/h3&gt;

&lt;p&gt;Part of the model's training works a lot like training a dog:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The model was given complex, yet easy-to-validate questions to answer.&lt;/li&gt;
&lt;li&gt;If it answers correctly, it's "rewarded", reinforcing those patterns&lt;/li&gt;
&lt;li&gt;If it answers incorrectly, it adjusts itself to improve on future iterations&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://arxiv.org/pdf/2405.04434" rel="noopener noreferrer"&gt;Source&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The Result:
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Training Costs&lt;/th&gt;
&lt;th&gt;Pre-Training&lt;/th&gt;
&lt;th&gt;Context Extension&lt;/th&gt;
&lt;th&gt;Post-Training&lt;/th&gt;
&lt;th&gt;Total&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;In H800 GPU Hours&lt;/td&gt;
&lt;td&gt;2664K&lt;/td&gt;
&lt;td&gt;119K&lt;/td&gt;
&lt;td&gt;5K&lt;/td&gt;
&lt;td&gt;2788K&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;In USD&lt;/td&gt;
&lt;td&gt;$5.328M&lt;/td&gt;
&lt;td&gt;$0.23M&lt;/td&gt;
&lt;td&gt;$0.01M&lt;/td&gt;
&lt;td&gt;$5.576M&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;(&lt;a href="https://github.com/deepseek-ai/DeepSeek-V3/blob/main/DeepSeek_V3.pdf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;)&lt;/p&gt;

&lt;h2&gt;
  
  
  How can I take advantage of it?
&lt;/h2&gt;

&lt;p&gt;DeepSeek's models are pretty easy to take advantage of. Here's how you can use them:&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Online
&lt;/h2&gt;

&lt;p&gt;You can use DeepSeek V3 and R1 &lt;strong&gt;for free&lt;/strong&gt; &lt;a href="https://chat.deepseek.com/" rel="noopener noreferrer"&gt;on their official website&lt;/a&gt;. &lt;/p&gt;

&lt;h3&gt;
  
  
  Warning: Anything you send to or receive from the online service is stored and can be accessed by DeepSeek or the Chinese government
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4vfhnbvogrrtogftt40w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4vfhnbvogrrtogftt40w.png" alt="DeepSeek Chat Page" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;DeepSeek Chat Page&lt;/center&gt;

&lt;h2&gt;
  
  
  2. API
&lt;/h2&gt;

&lt;p&gt;DeepSeek have &lt;a href="https://api-docs.deepseek.com/" rel="noopener noreferrer"&gt;an offical API&lt;/a&gt; in case you don't want to self-host the models yourself, which is &lt;strong&gt;over 96% cheaper than OpenAI&lt;/strong&gt; (Calculated using &lt;a href="https://api-docs.deepseek.com/quick_start/pricing" rel="noopener noreferrer"&gt;R1's pricing&lt;/a&gt; input token cache miss, and comparing it to &lt;a href="https://openai.com/api/pricing/" rel="noopener noreferrer"&gt;o1's pricing&lt;/a&gt; input tokens parameter)&lt;/p&gt;

&lt;h3&gt;
  
  
  How to Use
&lt;/h3&gt;

&lt;p&gt;The API itself is pretty straightforward. You can use it with the OpenAI package on &lt;a href="https://npmjs.org/package/openai" rel="noopener noreferrer"&gt;NPM&lt;/a&gt; or &lt;a href="https://pypi.org/project/openai/" rel="noopener noreferrer"&gt;PyPI&lt;/a&gt;, or by making an HTTP request. For this demo I will be using Node.js, working in an empty folder with an index.js file and a package.json file.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;WARNING: &lt;u&gt;NEVER STORE API KEYS ON THE CLIENT-SIDE&lt;/u&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://platform.deepseek.com/api_keys" rel="noopener noreferrer"&gt;Apply for an API Key&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Download package
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;openai
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3. Make a request wherever you need one&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;OpenAI&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;openai&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;openai&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;OpenAI&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;baseURL&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://api.deepseek.com&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;&amp;lt;DeepSeek API Key&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;completion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;openai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;completions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt; 
            &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;system&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
            &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;You are a helpful assistant.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; 
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;What is 5 + 7?&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;deepseek-chat&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;completion&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;choices&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
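
In keeping with the warning above, don't paste the key into your source at all; read it from an environment variable on the server instead. A minimal sketch (the variable name DEEPSEEK_API_KEY is my own choice, not an official convention):

```javascript
// Read the API key from the environment instead of hardcoding it.
function getApiKey() {
  const key = process.env.DEEPSEEK_API_KEY; // hypothetical variable name
  if (!key) throw new Error("DEEPSEEK_API_KEY is not set");
  return key;
}

// Then construct the client with: apiKey: getApiKey()
```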



&lt;ol start="4"&gt;
&lt;li&gt;Run it
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;node index.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;To find the sum of 5 and 7, follow these steps:

Start with the first number: 
5

Add the second number to it:
5 + 7

Perform the addition:
5 + 7 = 12

Final Answer: 12
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Pretty easy, isn't it?&lt;/p&gt;
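
If you'd rather not pull in the SDK, the same call is just an HTTP POST, since the API is OpenAI-compatible. A sketch of building the request for Node 18+'s built-in fetch (the endpoint path follows DeepSeek's API docs; treat the exact shape as something to verify there):

```javascript
// Build the same chat request as a plain HTTP POST (no SDK needed).
function buildChatRequest(apiKey, userMessage) {
  return {
    url: "https://api.deepseek.com/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // never ship this key client-side
      },
      body: JSON.stringify({
        model: "deepseek-chat",
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

// Usage (Node 18+ has fetch built in):
// const { url, options } = buildChatRequest(process.env.DEEPSEEK_API_KEY, "What is 5 + 7?");
// const data = await (await fetch(url, options)).json();
// console.log(data.choices[0].message.content);
```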

&lt;h2&gt;
  
  
  3. Locally
&lt;/h2&gt;

&lt;p&gt;Moving on to the fun stuff now: ✨ self-hosting ✨. Unfortunately, the full model is around 400 GB. Most people don't have that much storage to dedicate to one model, and hosting it for your startup would be &lt;em&gt;extremely&lt;/em&gt; expensive. Luckily, there are distilled models: fine-tuned versions that are significantly smaller. The bigger the model, the smarter but slower it is. Let's first try running DeepSeek on our own machine.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Download Ollama&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftjdx1ty7av7gud4sl123.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftjdx1ty7av7gud4sl123.png" alt="Ollama Homepage" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Ollama Homepage&lt;/center&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose the size you want to run locally.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h5&gt;
  
  
  Note: V3 only has one size. Sizes are accurate as of Feb 2, 2025.
&lt;/h5&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Parameters&lt;/th&gt;
&lt;th&gt;Size&lt;/th&gt;
&lt;th&gt;Model&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1.5B&lt;/td&gt;
&lt;td&gt;1.1GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:1.5b&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;7B&lt;/td&gt;
&lt;td&gt;4.7GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:7b&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;8B&lt;/td&gt;
&lt;td&gt;4.9GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:8b&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;14B&lt;/td&gt;
&lt;td&gt;9GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:14b&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;32B&lt;/td&gt;
&lt;td&gt;20GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:32b&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;70B&lt;/td&gt;
&lt;td&gt;43GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:70b&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;671B&lt;/td&gt;
&lt;td&gt;404GB&lt;/td&gt;
&lt;td&gt;deepseek-r1:671b or deepseek-v3&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  On your machine
&lt;/h2&gt;

&lt;p&gt;Open your terminal and run &lt;code&gt;ollama run&lt;/code&gt; followed by the model name you chose above (for example, &lt;code&gt;ollama run deepseek-r1:8b&lt;/code&gt;). Once it has finished downloading, you should see this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Famb4jtxsr023i459w0am.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Famb4jtxsr023i459w0am.png" alt="Prompt to enter message in Terminal" width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now you can enter any prompt and get an answer. It even works offline!&lt;/p&gt;

&lt;h2&gt;
  
  
  In a project
&lt;/h2&gt;

&lt;p&gt;Ollama also has a package for &lt;a href="https://www.npmjs.com/package/ollama" rel="noopener noreferrer"&gt;NPM&lt;/a&gt; and &lt;a href="https://pypi.org/project/ollama/" rel="noopener noreferrer"&gt;PIP&lt;/a&gt;. Note that for this demo, I will be using an empty folder with a package.json and index.js file.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Download package
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;ollama
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;Invoke
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;ollama&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ollama&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ollama&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;&amp;lt;MODEL&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;List 5 foods from Italy. Explain their origins&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="3"&gt;
&lt;li&gt;Run it
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;node index.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;think&amp;gt;
Okay, so I need to list five Italian foods and explain their origins. Hmm, let's see... I'm not an expert on Italian cuisine, but I know a few basics. Maybe I can start by thinking about the most famous dishes and then research where they come from.

First, pasta comes to mind. I think pasta has been around for a long time, maybe even before Christ. I've heard that Marco Polo brought it back to Europe, but I'm not 
sure if he actually introduced it to Italy or if it was already there. I should probably look into that more. Also, pizza is a big one—everyone knows it's from Italy. 
But again, when exactly? I think the word "pizza" comes from "pie," which refers to a flatbread. Maybe bakers in Italy started making something like modern pizza centuries ago.

Lasagna is another dish I can think of. It's a layered pasta with fillings and béchamel sauce. I believe it has roots in Emilia-Romagna, but I'm not sure about the exact origins or when it was first made. Then there's risotto, which is a rice dish cooked with broth and other ingredients. I think it originated in Lombardy, maybe around the 17th century.

Gelato is Italian ice cream, right? I've heard that it has been around since the Middle Ages. Maybe Arab traders introduced sugar to Italy, and then gelato became popular as a sweet treat. So, sugar was key here, and the techniques for making gelato might have come from the Arab world.

Wait, am I missing any other major dishes? Oh, like tiramisu. That's a classic, but I'm not sure about its exact origin either. It seems to be a dessert from the Emilia-Romagna region as well. Maybe it was created in the 1800s or something. But since the user only asked for five foods, maybe that's enough.

Let me try to organize this:

1. Pasta: Originating before Christ, possibly brought to Italy by Marco Polo but already existing there.
2. Pizza: From ancient Rome, a flatbread that evolved into modern pizza in Campania.
3. Lasagna: From Emilia-Romagna, medieval dish with layers of pasta and fillings.
4. Risotto: Originating in Lombardy around the 17th century, using rice and broth.
5. Gelato: Middle Ages, influenced by Arab traders introducing sugar.

I think that's a good list. Each has its own history, but I should double-check some details to ensure accuracy. For example, pasta's history is a bit tricky because it was known before Marco Polo, but his travels popularized it in Europe. Also, gelato's origins might be tied to the invention of sugar, which came through trade with 
Arab countries.

I wonder if there are other foods I could have included instead. Maybe something like risotto is more specific, or perhaps arancini, which are stuffed pasta balls, but they're a variant of pasta too. Or maybe minestrone soup, but that's more of a category than a specific dish.

No, the five listed seem to cover a good range from different regions and time periods in Italy. I think that's sufficient for this task.
&amp;lt;/think&amp;gt;

Here is a list of five Italian foods along with their origins:

1. **Pasta**: Originating well before Christ, pasta has ancient roots and was likely already present in Italy when Marco Polo returned from his travels, where he popularized its use in Europe.

2. **Pizza**: The word "pizza" refers to a flatbread that traces its origins back to ancient Rome. Modern pizza as we know it evolved in Campania, particularly in areas like Napoli.

3. **Lasagna**: Hailing from the Emilia-Romagna region, lasagna is a medieval dish characterized by its layered structure of pasta interleaved with fillings and covered in béchamel sauce.

4. **Risotto**: Originating in Lombardy during the 17th century, risotto is a rice dish cooked with broth and various ingredients, known for its creamy texture.       

5. **Gelato**: This Italian ice cream has medieval roots, influenced by Arab traders who introduced sugar to Italy. Gelato's techniques have been passed down through generations, becoming a beloved treat.

This selection highlights the diverse culinary history of Italy, spanning regions and centuries.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Woah. That looks really weird, doesn't it? The reason is pretty simple: the response is in a format called Markdown. We have three options for dealing with it.&lt;/p&gt;
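
Separately from the Markdown, notice the &lt;think&gt; block at the top of the output: R1-style reasoning models emit their chain of thought before the final answer. If you only want the answer, you can strip that section first. A simple regex sketch:

```javascript
// Remove the reasoning section that R1-style models prepend to their answer.
function stripThinking(text) {
  return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}

const raw = "<think>Okay, 5 + 7... that's 12.</think>\n\nThe answer is 12.";
console.log(stripThinking(raw)); // "The answer is 12."
```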

&lt;h3&gt;
  
  
  1. Embrace it
&lt;/h3&gt;

&lt;p&gt;Markdown is like a richer version of plain text. In fact, this blog post is written in Markdown, which is what lets me &lt;strong&gt;bold&lt;/strong&gt;, &lt;em&gt;italicize&lt;/em&gt;, or &lt;del&gt;strikethrough&lt;/del&gt; text. If Markdown is what you want, then you're done!&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Convert it to plain-text
&lt;/h3&gt;

&lt;p&gt;We can use a library called &lt;a href="https://www.npmjs.com/package/remove-markdown" rel="noopener noreferrer"&gt;remove-markdown&lt;/a&gt; to strip the Markdown formatting from the text.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Download package
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;remove-markdown
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;Update Code
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;ollama&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ollama&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;removeMd&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;remove-markdown&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ollama&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;&amp;lt;MODEL&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;List 5 foods from Italy. Explain their origins.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;removeMd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
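&lt;p&gt;To see what the stripping step actually does, here is a minimal hand-rolled sketch (not the remove-markdown library itself, just a few illustrative regexes) covering common constructs like headings, bold, italics, and links:&lt;/p&gt;

```javascript
// Minimal illustration of stripping markdown formatting.
// NOT the remove-markdown library -- just a regex sketch of the idea.
function stripMd(md) {
    return md
        .replace(/^#{1,6}\s+/gm, "")         // headings: "## Title" -> "Title"
        .replace(/\*\*(.+?)\*\*/g, "$1")     // bold: "**text**" -> "text"
        .replace(/\*(.+?)\*/g, "$1")         // italics: "*text*" -> "text"
        .replace(/\[(.+?)\]\(.+?\)/g, "$1"); // links: "[text](url)" -> "text"
}

console.log(stripMd("## Pizza\n**Pizza** originated in *Naples*."));
```

&lt;p&gt;The remove-markdown package handles many more cases (lists, code blocks, images), which is why the code above uses it instead of rolling your own.&lt;/p&gt;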



&lt;h3&gt;
  
  
  3. Convert to HTML
&lt;/h3&gt;

&lt;p&gt;If you want to render the response in the browser, you can use the &lt;a href="https://www.npmjs.com/package/marked" rel="noopener noreferrer"&gt;marked&lt;/a&gt; library to convert the markdown into HTML.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Download Package
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install marked
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;Update Code
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;ollama&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ollama&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;writeFileSync&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;parse&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;marked&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ollama&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;&amp;lt;MODEL&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;List 5 foods from Italy. Explain their origins.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="c1"&gt;// Optionally I am saving the response to an HTML file so I can view it in my browser&lt;/span&gt;
&lt;span class="nf"&gt;writeFileSync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;response.html&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;`
    &amp;lt;body&amp;gt;
    &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;)}&lt;/span&gt;&lt;span class="s2"&gt;
    &amp;lt;/body&amp;gt;
`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd20am6amrih8mea80pwb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd20am6amrih8mea80pwb.png" alt="response.html in Browser" width="800" height="256"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;response.html in Browser&lt;/center&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;DeepSeek is a powerful new contender in the AI industry. By matching the performance of proprietary models at a fraction of the cost and releasing its models as open source, it gives us developers the opportunity to use AI in ways that were previously impossible. What will you build with it?&lt;/p&gt;

</description>
      <category>deepseek</category>
      <category>ai</category>
      <category>programming</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
