<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Tin Tin</title>
    <description>The latest articles on DEV Community by Tin Tin (@timluong).</description>
    <link>https://dev.to/timluong</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2792273%2Fa819f082-3066-44d9-815c-fd8b6efce732.jpeg</url>
      <title>DEV Community: Tin Tin</title>
      <link>https://dev.to/timluong</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/timluong"/>
    <language>en</language>
    <item>
      <title>Installing Ollama Step by Step to run DeepSeek-R1 Offline</title>
      <dc:creator>Tin Tin</dc:creator>
      <pubDate>Fri, 31 Jan 2025 18:01:29 +0000</pubDate>
      <link>https://dev.to/timluong/installing-ollama-step-by-step-to-run-deepseek-r1-offline-5c6c</link>
      <guid>https://dev.to/timluong/installing-ollama-step-by-step-to-run-deepseek-r1-offline-5c6c</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;What is Ollama?&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Ollama&lt;/strong&gt; is a tool that lets you run &lt;strong&gt;large language models (LLMs)&lt;/strong&gt; like Llama 2, Mistral, or DeepSeek-R1 directly on your computer. Think of it as a "local ChatGPT" that doesn’t require an internet connection or cloud services. You can chat with AI, test ideas, or build apps without sharing data with third parties.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;Why Run Ollama in WSL (Windows Subsystem for Linux)?&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Most AI/ML tools (including Ollama) are optimized for &lt;strong&gt;Linux&lt;/strong&gt;. Here’s why WSL is ideal for Ollama on Windows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Native Linux Compatibility&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Ollama works best with Linux libraries and dependencies. WSL lets you run Linux commands and tools seamlessly on Windows, avoiding compatibility headaches.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Better Performance&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;WSL 2 (the latest version) integrates tightly with Windows while offering near-native Linux speeds. This is crucial for running resource-heavy LLMs smoothly.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GPU Support&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Modern LLMs work faster with GPUs. WSL 2 supports GPU passthrough (e.g., NVIDIA CUDA), letting Ollama leverage your computer’s graphics card for faster responses.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Easy Setup&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Installing Ollama in WSL is as simple as running a Linux command. No complex workarounds for Windows-specific issues.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Keep Your Windows Environment Clean&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Isolate Ollama’s dependencies (like Python packages) within WSL. No clutter in your main Windows system!&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Installing Ollama (The Engine)&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;💡&lt;strong&gt;Ollama&lt;/strong&gt; is the foundational component for running AI models.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Windows Users (WSL)&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  We'll leverage WSL to create a Linux environment within Windows.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Open a "&lt;code&gt;Terminal&lt;/code&gt;" or "&lt;code&gt;Command Prompt&lt;/code&gt;" and run it as &lt;strong&gt;administrator&lt;/strong&gt;. This is important for certain commands to function correctly.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwamqcc9hrffqkfxx3zqh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwamqcc9hrffqkfxx3zqh.png" alt="Run Windows Terminal" width="779" height="502"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Type &lt;code&gt;wsl --install&lt;/code&gt; and press Enter. This command will install the latest version of WSL with the Ubuntu distribution of Linux.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6knb2uqvjqtx4zz09aul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6knb2uqvjqtx4zz09aul.png" alt="2" width="575" height="129"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;After WSL is installed, you will need to restart your computer.&lt;/li&gt;
&lt;li&gt;When you log back in, a new terminal window will open and continue the installation. You will be prompted to &lt;strong&gt;create a Linux username and password&lt;/strong&gt;. These are different from your Windows username and password.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faf11uvcj4kba8ylchnp9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faf11uvcj4kba8ylchnp9.png" alt="Image" width="800" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Your terminal is now running inside the Linux system.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbgipaahfilwmjodwkceq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbgipaahfilwmjodwkceq.png" alt="Image" width="773" height="391"&gt;&lt;/a&gt;&lt;/p&gt;
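&lt;p&gt;Before going further, you can confirm which WSL version you are running from a regular Windows terminal (WSL 2 is what you want for GPU support; &lt;code&gt;wsl --version&lt;/code&gt; is available on recent WSL builds):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;wsl -l -v        # lists installed distributions and their WSL version
wsl --version    # shows the WSL, kernel, and Windows versions
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;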

&lt;h3&gt;
  
  
  Upgrade your WSL environment to the newest package versions
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Type the following commands (one at a time, pressing Enter after each):

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;sudo apt update&lt;/code&gt; (This command refreshes the package lists, also known as the "package cache", from the software repositories your system is configured to use. Think of it as updating your grocery store's price list, not buying the items themselves.) You will be asked for your password because &lt;code&gt;sudo&lt;/code&gt; runs the command with root privileges.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0htznonzrpydal9adehe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0htznonzrpydal9adehe.png" alt="Image" width="800" height="507"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;sudo apt upgrade -y&lt;/code&gt; (This command upgrades all upgradable packages on your system to their newest versions, based on the package lists refreshed by &lt;code&gt;apt update&lt;/code&gt;. The &lt;code&gt;-y&lt;/code&gt; flag automatically confirms the upgrades.)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm2jme8phkw85muajm4na.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm2jme8phkw85muajm4na.png" alt="Image" width="800" height="277"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Install Ollama from WSL terminal
&lt;/h3&gt;

&lt;p&gt;Run &lt;code&gt;curl -fsSL https://ollama.com/install.sh | sh&lt;/code&gt; to install Ollama.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvm1zg0qk52y53oiy2bn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvm1zg0qk52y53oiy2bn.png" alt="Image" width="800" height="230"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;After Installation (All Users):&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;The Ollama API service now runs in the background and listens on port 11434, the port through which other programs can access it. You can verify this by opening your browser and navigating to &lt;code&gt;http://localhost:11434&lt;/code&gt;.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj00ijb8th26s06igwgry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj00ijb8th26s06igwgry.png" alt="Image" width="746" height="163"&gt;&lt;/a&gt;&lt;/p&gt;
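&lt;p&gt;The same check works from the WSL terminal. The endpoints below are part of Ollama's standard REST API:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl http://localhost:11434          # should respond with "Ollama is running"
curl http://localhost:11434/api/tags # lists locally downloaded models as JSON
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;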

&lt;ul&gt;
&lt;li&gt;You can download different AI models using the command line. For example, to get the &lt;code&gt;Deepseek-R1&lt;/code&gt; model, you would type &lt;code&gt;ollama pull deepseek-r1:8b&lt;/code&gt;. The downloaded model will be stored locally on your computer.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdgsp52003aftq655ahaj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdgsp52003aftq655ahaj.png" alt="Image" width="800" height="129"&gt;&lt;/a&gt;&lt;/p&gt;
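&lt;p&gt;A few companion commands from the Ollama CLI are handy for managing what you have downloaded:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ollama list               # show models stored locally
ollama rm deepseek-r1:8b  # remove a model to free disk space
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;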

&lt;ul&gt;
&lt;li&gt;To run the &lt;code&gt;Deepseek-R1&lt;/code&gt; model from the command line, type &lt;code&gt;ollama run deepseek-r1:8b&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fet2w6pvkd9v2gpkw62fi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fet2w6pvkd9v2gpkw62fi.png" alt="Image" width="800" height="328"&gt;&lt;/a&gt;&lt;/p&gt;
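&lt;p&gt;Beyond the interactive chat, you can pass a one-off prompt directly, or call the REST API from another program. Both forms below use Ollama's documented interface (the prompt text is just an example):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# single prompt: prints the answer and exits
ollama run deepseek-r1:8b "Explain WSL in one sentence"

# same request via the HTTP API ("stream": false returns one JSON object)
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:8b",
  "prompt": "Explain WSL in one sentence",
  "stream": false
}'
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;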

&lt;h2&gt;
  
  
  &lt;strong&gt;Mac and Linux Users&lt;/strong&gt;
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Download the correct version of Ollama for your operating system from ollama.com and follow the installation instructions provided on the website.&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
    <item>
      <title>Kusto Query Language (KQL)</title>
      <dc:creator>Tin Tin</dc:creator>
      <pubDate>Thu, 30 Jan 2025 15:20:30 +0000</pubDate>
      <link>https://dev.to/timluong/kusto-query-language-kql-3f5c</link>
      <guid>https://dev.to/timluong/kusto-query-language-kql-3f5c</guid>
      <description>&lt;p&gt;&lt;strong&gt;Why this?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This post marks my first major attempt at daily learning, summarizing, and posting to a public space what I've consumed, so I can track my progress day by day.&lt;/p&gt;

&lt;p&gt;KQL came into my life in a rather jarring way. I saw a colleague named Kome at work doing fantastic things with something he called ADX, and I had absolutely zero knowledge of it. I had to overcome my embarrassment by starting to learn KQL and ADX in a very serious manner.&lt;/p&gt;

&lt;p&gt;It's been one year since then, but what I've consumed has been so fragmented that it has really annoyed me. So, I've decided to re-learn it from the beginning with a clear and comprehensive structure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is Kusto Query Language?&lt;/strong&gt;&lt;br&gt;
Kusto Query Language (KQL) is a powerful query language used to explore, analyze, and visualize data stored in Azure Data Explorer, Application Insights, Log Analytics, and other services that support the Kusto engine. It’s optimized for querying large datasets with minimal latency.&lt;/p&gt;

&lt;p&gt;KQL allows you to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Extract insights from structured, semi-structured, and unstructured data.&lt;/li&gt;
&lt;li&gt;Perform complex aggregations, filtering, and transformations.&lt;/li&gt;
&lt;li&gt;Visualize data using charts, tables, and time-series graphs.&lt;/li&gt;
&lt;/ul&gt;
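&lt;p&gt;As a small taste of the pipeline syntax, the query below uses the &lt;code&gt;StormEvents&lt;/code&gt; sample table from Microsoft's public help cluster (the table and column names come from that sample dataset):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;StormEvents
| where State == "TEXAS"                       // filter rows
| summarize EventCount = count() by EventType  // aggregate per event type
| top 5 by EventCount                          // keep the five most common
| render barchart                              // visualize the result
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;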

&lt;p&gt;&lt;strong&gt;Why Learn KQL?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Efficiency&lt;/strong&gt;: KQL is designed for high-performance querying on large datasets.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration&lt;/strong&gt;: It integrates seamlessly with Azure services like Azure Monitor, Azure Security Center, and more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ease of Use&lt;/strong&gt;: Its syntax is simple and SQL-like, making it accessible for those familiar with SQL or other query languages.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;2. Comparing KQL with Other Query Languages&lt;/strong&gt;
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;strong&gt;Feature&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;KQL&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;SQL&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Power Query (M)&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Python (Pandas)&lt;/strong&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Primary Use Case&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Analyzing logs, telemetry, and big data&lt;/td&gt;
&lt;td&gt;Relational databases&lt;/td&gt;
&lt;td&gt;Data transformation and ETL&lt;/td&gt;
&lt;td&gt;General-purpose programming and analysis&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Syntax&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Declarative, pipeline-based&lt;/td&gt;
&lt;td&gt;Declarative, set-based&lt;/td&gt;
&lt;td&gt;Functional, step-by-step&lt;/td&gt;
&lt;td&gt;Procedural, code-based&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Performance&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Optimized for large-scale analytics&lt;/td&gt;
&lt;td&gt;Optimized for transactional queries&lt;/td&gt;
&lt;td&gt;Moderate performance for small-to-medium datasets&lt;/td&gt;
&lt;td&gt;Flexible but slower for very large datasets&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Learning Curve&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Easy for SQL users&lt;/td&gt;
&lt;td&gt;Moderate&lt;/td&gt;
&lt;td&gt;Steeper due to functional paradigm&lt;/td&gt;
&lt;td&gt;Steepest due to general-purpose nature&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Visualization&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Built-in visualization (e.g., the &lt;code&gt;render&lt;/code&gt; operator, Azure dashboards)&lt;/td&gt;
&lt;td&gt;Requires external tools (e.g., Power BI)&lt;/td&gt;
&lt;td&gt;Integrated with Power BI&lt;/td&gt;
&lt;td&gt;Requires libraries like Matplotlib/Seaborn&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;3. Setting Up Your Environment&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To practice KQL, you’ll need access to a service that supports it. Here are some options:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Azure Data Explorer (ADX)&lt;/strong&gt;: A fully managed service for real-time analytics.

&lt;ul&gt;
&lt;li&gt;Create an ADX cluster and database in Azure Portal.&lt;/li&gt;
&lt;li&gt;Use the &lt;strong&gt;Kusto Explorer&lt;/strong&gt; desktop app or the &lt;strong&gt;Azure Portal&lt;/strong&gt; web interface.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Log Analytics&lt;/strong&gt;: Part of Azure Monitor.

&lt;ul&gt;
&lt;li&gt;Access via the Azure Portal under "Logs."&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Application Insights&lt;/strong&gt;: For monitoring application performance.

&lt;ul&gt;
&lt;li&gt;Access via the Azure Portal under "Logs."&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Free Sandbox&lt;/strong&gt;: Microsoft provides a free &lt;a href="https://aka.ms/LADemo" rel="noopener noreferrer"&gt;&lt;strong&gt;KQL sandbox&lt;/strong&gt;&lt;/a&gt; for learning purposes.&lt;/li&gt;
&lt;/ol&gt;
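&lt;p&gt;Once you have one of these environments, a typical first query is a time-series aggregation. The sketch below again assumes the &lt;code&gt;StormEvents&lt;/code&gt; sample table:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;StormEvents
| summarize Events = count() by bin(StartTime, 30d)  // bucket events into 30-day windows
| render timechart                                   // plot counts over time
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;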

</description>
    </item>
  </channel>
</rss>
