<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Atikur Rabbi</title>
    <description>The latest articles on DEV Community by Atikur Rabbi (@atik).</description>
    <link>https://dev.to/atik</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F477346%2F41ba3588-a46a-4713-8442-800992e6af1f.jpeg</url>
      <title>DEV Community: Atikur Rabbi</title>
      <link>https://dev.to/atik</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/atik"/>
    <language>en</language>
    <item>
      <title>Run LLMs Completely Offline on Your Phone: A Practical Guide</title>
      <dc:creator>Atikur Rabbi</dc:creator>
      <pubDate>Thu, 04 Dec 2025 05:22:51 +0000</pubDate>
      <link>https://dev.to/atik/run-llms-completely-offline-on-your-phone-a-practical-guide-d9i</link>
      <guid>https://dev.to/atik/run-llms-completely-offline-on-your-phone-a-practical-guide-d9i</guid>
      <description>&lt;p&gt;Local LLMs on mobile are now a reality — thanks to powerful apps like &lt;strong&gt;Anything LLM&lt;/strong&gt;, you can run AI models offline directly on your smartphone. No cloud. No data sharing. Fully private.&lt;/p&gt;

&lt;p&gt;In this post, you'll learn:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What local mobile LLMs are&lt;/li&gt;
&lt;li&gt;Why offline AI is becoming popular&lt;/li&gt;
&lt;li&gt;How to install and use Anything LLM on your phone&lt;/li&gt;
&lt;li&gt;A short step-by-step setup tutorial&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;🔗 GitHub Link&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Anything LLM Repository:&lt;/strong&gt; &lt;a href="https://github.com/Mintplex-Labs/anything-llm" rel="noopener noreferrer"&gt;https://github.com/Mintplex-Labs/anything-llm&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;📱 What is a Local LLM on Mobile?&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;Local LLM&lt;/strong&gt; is an AI model that runs directly on your device — not on a server. This means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;🚫 No internet required&lt;/li&gt;
&lt;li&gt;🔒 100% private&lt;/li&gt;
&lt;li&gt;⚡ Faster response time&lt;/li&gt;
&lt;li&gt;🆓 No API key or cost per request&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Mobile hardware is now powerful enough to run small-to-medium LLMs using on-device inference.&lt;/p&gt;




&lt;h2&gt;🎯 Why Use Offline Mobile AI?&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Privacy:&lt;/strong&gt; Your chats never leave your device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;No API Limits:&lt;/strong&gt; Use the model as much as you want.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Speed:&lt;/strong&gt; Local inference avoids network delays.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Portability:&lt;/strong&gt; Use AI anywhere — even without network.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;📥 How to Install Anything LLM Mobile&lt;/h2&gt;

&lt;p&gt;Anything LLM provides mobile app builds that allow you to run local models offline.&lt;/p&gt;

&lt;h3&gt;1. Download the Mobile App&lt;/h3&gt;

&lt;p&gt;Visit the GitHub releases page:&lt;br&gt;
👉 &lt;a href="https://github.com/Mintplex-Labs/anything-llm/releases" rel="noopener noreferrer"&gt;https://github.com/Mintplex-Labs/anything-llm/releases&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Look for the Android build (.apk). iOS support may require TestFlight or sideloading, depending on the release.&lt;/p&gt;

&lt;h3&gt;2. Install a Local Model&lt;/h3&gt;

&lt;p&gt;You can load GGUF builds of popular model families, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Qwen / Qwen2.5&lt;/li&gt;
&lt;li&gt;Llama 3&lt;/li&gt;
&lt;li&gt;Mistral&lt;/li&gt;
&lt;li&gt;Phi&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Choose a small model (1–4B) for best performance.&lt;/p&gt;
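&lt;p&gt;How much RAM a model needs scales with its parameter count and quantization. As a rough rule of thumb (my own approximation, not an official figure from Anything LLM), a 4-bit (q4) GGUF takes about half a byte per parameter, plus some fixed overhead for the KV cache and runtime:&lt;/p&gt;

```shell
# Back-of-the-envelope RAM estimate for a q4-quantized GGUF model.
# Assumption: ~0.5 bytes per parameter at 4-bit, plus ~512 MB overhead
# for the KV cache and runtime. Real usage varies by app and context size.
estimate_ram_mb() {
  # $1 = parameter count in billions
  awk -v b="$1" 'BEGIN { printf "%d", b * 1024 * 0.5 + 512 }'
}

estimate_ram_mb 2   # a 2B model: roughly 1.5 GB
estimate_ram_mb 4   # a 4B model: roughly 2.5 GB
```

&lt;p&gt;By this estimate, a phone with 6 GB of RAM handles 1–4B models comfortably, which matches the advice above.&lt;/p&gt;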

&lt;h3&gt;3. Load the Model in the App&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;strong&gt;Anything LLM Mobile&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Go to &lt;strong&gt;Local Model&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Import or download a GGUF file&lt;/li&gt;
&lt;li&gt;Start chatting offline&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;🚀 Example: Installing a Small GGUF Model&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Download a model from HuggingFace (e.g., Qwen 1.8B GGUF)&lt;/li&gt;
&lt;li&gt;Place the model file on your phone&lt;/li&gt;
&lt;li&gt;Open Anything LLM → &lt;strong&gt;Add Model&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Select the downloaded file&lt;/li&gt;
&lt;li&gt;Begin chatting locally&lt;/li&gt;
&lt;/ol&gt;
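&lt;p&gt;The download step above can also be done from a computer. This is a hypothetical sketch (the model URL is a placeholder; pick any GGUF build you trust): a valid GGUF file begins with the ASCII magic bytes &lt;code&gt;GGUF&lt;/code&gt;, so a quick header check catches truncated downloads or HTML error pages before you copy the file to your phone.&lt;/p&gt;

```shell
# Hypothetical example: fetch a GGUF model on a computer, sanity-check
# it, then push it to an Android device. The URL is a placeholder.
MODEL_URL="https://huggingface.co/your-org/your-model-GGUF/resolve/main/model-q4_0.gguf"
MODEL_FILE="model.gguf"

# Valid GGUF files begin with the ASCII magic bytes "GGUF"; a failed or
# truncated download will not.
check_gguf() {
  head -c 4 "$1" | grep -q '^GGUF$'
}

# curl -L -o "$MODEL_FILE" "$MODEL_URL"     # uncomment to download
# check_gguf "$MODEL_FILE" || echo "not a valid GGUF file"
# adb push "$MODEL_FILE" /sdcard/Download/  # copy to the phone over USB
```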




&lt;h2&gt;🎉 Final Thoughts&lt;/h2&gt;

&lt;p&gt;Running LLMs offline on your mobile unlocks &lt;strong&gt;true AI privacy and freedom&lt;/strong&gt;. Apps like Anything LLM are making local AI easier than ever.&lt;/p&gt;

&lt;p&gt;If you value privacy, control, and unlimited usage — on-device LLMs are the future.&lt;/p&gt;




</description>
      <category>ai</category>
      <category>privacy</category>
      <category>mobile</category>
      <category>llm</category>
    </item>
    <item>
      <title>Cursor Free Alternative: Open Source AI Editor using VScode</title>
      <dc:creator>Atikur Rabbi</dc:creator>
      <pubDate>Tue, 15 Oct 2024 07:32:46 +0000</pubDate>
      <link>https://dev.to/atik/cursor-free-alternative-open-source-ai-editor-1fak</link>
      <guid>https://dev.to/atik/cursor-free-alternative-open-source-ai-editor-1fak</guid>
      <description>&lt;h2&gt;Introduction&lt;/h2&gt;

&lt;p&gt;Cursor is a popular AI-powered IDE built on VS Code.  It offers impressive code generation and file management capabilities through prompts, allowing for updates to entire classes or functions with ease. However, it's a closed-source application with a subscription model, raising concerns about cost, enterprise usage limitations, and data security due to its reliance on proprietary servers.  This post explores a free, open-source alternative that leverages trusted APIs.&lt;/p&gt;

&lt;h2&gt;Finding an Open-Source Alternative&lt;/h2&gt;

&lt;p&gt;Initially, I explored the VS Code extension &lt;code&gt;Claude Dev&lt;/code&gt;, which offered similar functionality but only worked reliably with Anthropic's "claude-3-5-sonnet" model. Other models failed to perform file operations.&lt;/p&gt;

&lt;h2&gt;Cline: A Powerful Alternative&lt;/h2&gt;

&lt;p&gt;Recently, &lt;code&gt;Claude Dev&lt;/code&gt; underwent a significant update, resulting in version 2.0, renamed "Cline".  This updated version supports various models and provides functionality comparable to Cursor. Let's explore how to set it up.&lt;/p&gt;

&lt;h2&gt;Setting up Cline: A Step-by-Step Guide&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Download and Install Ollama&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to &lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;https://ollama.com/&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Click the "Download" button.&lt;/li&gt;
&lt;li&gt;Download the installer appropriate for your operating system (Windows, macOS, or Linux).&lt;/li&gt;
&lt;li&gt;Run the installer and follow the on-screen instructions to install Ollama.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Download a Language Model&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the Ollama application.&lt;/li&gt;
&lt;li&gt;In the Ollama terminal, use the command &lt;code&gt;ollama pull llama3.2&lt;/code&gt; to download the &lt;code&gt;llama3.2&lt;/code&gt; model.  (You can choose other models as well, depending on your preference and needs).&lt;/li&gt;
&lt;li&gt;Wait for the download and installation to complete. This may take some time depending on your internet connection and the model size.&lt;/li&gt;
&lt;/ol&gt;
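&lt;p&gt;The pull step can also be scripted. A small sketch (the &lt;code&gt;has_model&lt;/code&gt; helper is my own; it reads &lt;code&gt;ollama list&lt;/code&gt;-style output from stdin so it can be tested without Ollama installed):&lt;/p&gt;

```shell
# Sketch: pull a model and confirm Ollama registered it.
# has_model scans `ollama list`-style output (model names appear in the
# first column) for the given name; reading stdin keeps it easy to test.
has_model() {
  grep -q "^$1" -
}

# ollama pull llama3.2                 # download the model (about 2 GB)
# ollama list | has_model llama3.2     # exits 0 if the model is present
```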

&lt;p&gt;&lt;strong&gt;Step 3: Install the Cline VS Code Extension&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open VS Code.&lt;/li&gt;
&lt;li&gt;Go to the Extensions tab (usually found on the Activity Bar on the side).&lt;/li&gt;
&lt;li&gt;Search for "Cline".&lt;/li&gt;
&lt;li&gt;Install Cline 
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frprbnbbxs3ea1vc3pgud.png" alt="Image description" width="800" height="518"&gt;
&lt;/li&gt;
&lt;li&gt;Open the Cline settings.&lt;/li&gt;
&lt;li&gt;From the "API Provider" dropdown, select &lt;strong&gt;Ollama&lt;/strong&gt;, choose the &lt;code&gt;llama3.2&lt;/code&gt; model, and save the settings by clicking &lt;strong&gt;Done&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvq6p6rqng6gbcef6rhxz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvq6p6rqng6gbcef6rhxz.png" alt="Image description" width="415" height="558"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The setup is now ready; type your prompt into the input box.&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;Cline provides a compelling open-source alternative to Cursor, offering similar AI-powered coding assistance without the cost and security concerns of a closed-source solution.  By leveraging Ollama and a suitable language model, you can enjoy the benefits of AI-assisted coding in a more transparent and secure environment.&lt;/p&gt;

&lt;p&gt;Useful links:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Ollama&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/cline/cline" rel="noopener noreferrer"&gt;Cline&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>cursor</category>
      <category>llama</category>
      <category>ai</category>
      <category>vscode</category>
    </item>
    <item>
      <title>Deploying Spring For Free Using Glitch</title>
      <dc:creator>Atikur Rabbi</dc:creator>
      <pubDate>Sun, 05 Sep 2021 16:33:32 +0000</pubDate>
      <link>https://dev.to/atik/deploying-spring-for-free-using-glitch-472m</link>
      <guid>https://dev.to/atik/deploying-spring-for-free-using-glitch-472m</guid>
      <description>&lt;p&gt;Spring is a widely used Java application framework: reliable, lightweight, and a good fit not only for large applications but also for weekend projects. Sometimes we need to take our apps live, and unlike Node, very few free hosting options exist for Spring; among them, Heroku is the most popular.&lt;/p&gt;

&lt;p&gt;I'd like to share one more free hosting option for your Spring project: Glitch. Many Node and frontend developers already use its free hosting, and since Glitch supports Maven, I searched for an existing Spring example. I couldn't find one, so I configured it myself.&lt;/p&gt;

&lt;h2&gt;Method 1:&lt;/h2&gt;

&lt;h3&gt;Step 1:&lt;/h3&gt;

&lt;p&gt;Create your Spring project with &lt;a href="https://start.spring.io" rel="noopener noreferrer"&gt;Spring Initializr&lt;/a&gt; or any other tool you prefer.&lt;/p&gt;

&lt;h3&gt;Step 2:&lt;/h3&gt;

&lt;p&gt;Create a file called &lt;code&gt;glitch.json&lt;/code&gt; in your project root:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "start": "java -jar target/*.jar",
  "install": "echo 'compiling' &amp;amp;&amp;amp; mvn package",
  "watch": {
    "throttle": 500,
    "install": {
      "include": [
        "^pom\\.xml$",
        "\\.java$"
      ]
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
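&lt;p&gt;Glitch runs the &lt;code&gt;"install"&lt;/code&gt; command to build the project and &lt;code&gt;"start"&lt;/code&gt; to launch it, so both keys must be present. A small sketch (the &lt;code&gt;check_glitch_json&lt;/code&gt; helper is my own) to sanity-check the file before you push:&lt;/p&gt;

```shell
# Sketch: verify glitch.json defines the two commands Glitch runs.
# "install" builds the fat jar (mvn package); "start" launches it.
check_glitch_json() {
  grep -q '"install"' "$1" || return 1
  grep -q '"start"' "$1"
}

# check_glitch_json glitch.json   # exits 0 when both keys are present
```

&lt;p&gt;You can also reproduce what Glitch does locally by running &lt;code&gt;mvn package&lt;/code&gt; followed by &lt;code&gt;java -jar target/*.jar&lt;/code&gt;.&lt;/p&gt;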



&lt;h3&gt;Step 3:&lt;/h3&gt;

&lt;p&gt;Upload the project to GitHub and import it into Glitch using the Git URL. Uploading the project directly through Glitch's file upload also works.&lt;/p&gt;

&lt;h2&gt;Method 2:&lt;/h2&gt;

&lt;p&gt;For a quick start, use the remix or fork button below and customize as needed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://glitch.com/edit/#!/remix/https://glitch.com/edit/#!/springboot" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.glitch.com%2F2703baf2-b643-4da7-ab91-7ee2a2d00b5b%252Fremix-button-v2.svg" alt="Remix on Glitch"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/atikur-rabbi/spring/fork" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzqrd9zx2gf8rj1rpkz94.png" alt="Fork"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you liked this article, give it a Heart. I'd love to hear your feedback and reviews; please tell me what you liked in the comment section below. Happy learning! ✨&lt;/p&gt;

</description>
      <category>spring</category>
      <category>cloud</category>
      <category>glitch</category>
      <category>java</category>
    </item>
  </channel>
</rss>
