<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: NovaNest</title>
    <description>The latest articles on DEV Community by NovaNest (@novanest_82b6db17c07b068f).</description>
    <link>https://dev.to/novanest_82b6db17c07b068f</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3304768%2F4883c62f-6a1d-4c61-94c3-2f48c776a76f.png</url>
      <title>DEV Community: NovaNest</title>
      <link>https://dev.to/novanest_82b6db17c07b068f</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/novanest_82b6db17c07b068f"/>
    <language>en</language>
    <item>
      <title>Forget Cloud AI — Build Your Own Private Chat App with Web Context Using Ollama</title>
      <dc:creator>NovaNest</dc:creator>
      <pubDate>Sun, 29 Jun 2025 06:34:18 +0000</pubDate>
      <link>https://dev.to/novanest_82b6db17c07b068f/forget-cloud-ai-build-your-own-private-chat-app-with-web-context-using-ollama-36hj</link>
      <guid>https://dev.to/novanest_82b6db17c07b068f/forget-cloud-ai-build-your-own-private-chat-app-with-web-context-using-ollama-36hj</guid>
      <description>&lt;h3&gt;
  
  
  &lt;strong&gt;Why You Should Have Your Own Local LLM&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The AI revolution is hard to ignore — chatbots, coding assistants, and AI writing tools are everywhere. But behind that convenience often lies a hidden cost: &lt;strong&gt;your data&lt;/strong&gt;, your privacy, and your control.&lt;/p&gt;

&lt;p&gt;Most popular AI tools rely on cloud-based large language models (LLMs) running on servers you don’t own, with prompts and data sent over the internet. Even when companies promise security, the reality is simple: &lt;strong&gt;you don't fully control what happens to your inputs or the models generating your outputs.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That’s where &lt;strong&gt;local LLMs&lt;/strong&gt; change the game.&lt;/p&gt;

&lt;p&gt;Thanks to projects like &lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Ollama&lt;/a&gt; and the rise of efficient, open-source models like &lt;strong&gt;LLaMA&lt;/strong&gt;, &lt;strong&gt;Mistral&lt;/strong&gt;, and others, it’s now possible to run capable AI models &lt;strong&gt;entirely on your own machine&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Why Should You Care About Local AI?&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Full Privacy:&lt;/strong&gt; Your conversations, data, and context never leave your device.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;No API Costs:&lt;/strong&gt; Forget subscriptions or API limits.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Offline Friendly:&lt;/strong&gt; Your AI assistant works even without internet.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tinker &amp;amp; Learn:&lt;/strong&gt; Dive into how prompts, context, and LLMs really work under the hood.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Total Control:&lt;/strong&gt; You decide what models to use, how they're updated, and how your data flows.&lt;/p&gt;

&lt;p&gt;In this post, I’ll show you how I built &lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; — a simple, private chat app that combines a local LLM with web context you control — and how you can do the same.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;What is ContextChat? — Project Overview&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Imagine having your own AI assistant — one that not only chats with you, but also understands the web pages you care about — all without sending a single byte of data to the cloud.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;That’s exactly what &lt;strong&gt;ContextChat&lt;/strong&gt; does.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;In Simple Terms:&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; is a fully local, open-source AI chat application that enhances your conversations with context from web pages you choose. It runs entirely on your machine, powered by open-source LLMs like LLaMA or Mistral through &lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;&lt;strong&gt;Ollama&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;But unlike a basic chat app, it doesn’t just rely on the model’s static knowledge. You can add URLs, and &lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; automatically extracts information from those pages, injecting that into your chat session as live, dynamic context.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Who is it For?&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Developers exploring AI tooling&lt;/li&gt;
&lt;li&gt;Privacy-conscious users avoiding cloud AI&lt;/li&gt;
&lt;li&gt;Researchers experimenting with context-aware LLMs&lt;/li&gt;
&lt;li&gt;Anyone curious about building practical, local AI apps&lt;/li&gt;
&lt;li&gt;Tinkerers who want to fork, extend, or modify their own private AI assistant&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Key Features at a Glance:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Desktop chat app with clean, simple UI&lt;/li&gt;
&lt;li&gt;Runs a local server to manage chat, context, and LLM interaction&lt;/li&gt;
&lt;li&gt;Lets you add web pages as context sources&lt;/li&gt;
&lt;li&gt;Sends combined prompts to your locally running LLM&lt;/li&gt;
&lt;li&gt;Fully private — no internet required after initial setup&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Open source — anyone can fork, modify, and build on top of it&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In short, &lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; lets you experience a more useful, privacy-first AI assistant — without giving up control of your data or relying on external APIs — and gives you the source code to make it your own.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Under the Hood: Architecture Breakdown&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Building a private AI chat app with web context sounds complex, but the core system is designed to be simple, modular, and entirely local.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Here’s how &lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; works behind the scenes.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;The Core Components&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The project is divided into three main parts:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;GUI Chat App (Tkinter, Python)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A minimal desktop interface for sending messages and viewing responses. Built with Tkinter for quick prototyping and cross-platform compatibility.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;MCP Server (FastAPI, Python)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Message/Context/Prompt (MCP) server handles the logic:&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- Manages conversation history
- Stores added URLs
- Crawls web pages for content
- Assembles the final prompt with context
- Sends it to the local LLM via Ollama
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol start="3"&gt;
&lt;li&gt;
&lt;strong&gt;LLM Inference (Ollama + GGUF Models)&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Ollama runs a chosen open-source LLM entirely on your machine&lt;/li&gt;
&lt;li&gt;Supports models like LLaMA, Mistral, and others in efficient GGUF format&lt;/li&gt;
&lt;li&gt;Processes the prompt and returns the AI-generated response&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
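
&lt;p&gt;To make these pieces concrete, here is a minimal sketch of assembling a prompt and sending it to Ollama's local HTTP API, which listens on port 11434 by default. The function names are illustrative, not the exact ContextChat code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;import json
import urllib.request

def build_prompt(context, turns, user_text):
    # Combine page context, prior chat turns, and the new message
    parts = context + turns + ["User: " + user_text]
    return "\n".join(parts)

def ask_ollama(prompt, model="mistral"):
    # Ollama's local generate endpoint; stream=False returns one JSON object
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the real server this logic sits behind a FastAPI route; everything here talks only to localhost.&lt;/p&gt;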

&lt;h3&gt;
  
  
  &lt;strong&gt;The Data Flow&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;A typical interaction looks like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You type a message in the chat GUI&lt;/li&gt;
&lt;li&gt;The GUI sends your message to the MCP server&lt;/li&gt;
&lt;li&gt;MCP gathers relevant context:

&lt;ul&gt;
&lt;li&gt;Recent chat history&lt;/li&gt;
&lt;li&gt;Any added web page content&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;MCP combines everything into a single prompt&lt;/li&gt;
&lt;li&gt;The prompt is sent to your local Ollama LLM&lt;/li&gt;
&lt;li&gt;The LLM generates a response, returned via MCP to the GUI&lt;/li&gt;
&lt;/ol&gt;
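
&lt;p&gt;The GUI side of that flow boils down to a small HTTP client. Here is a sketch; the &lt;code&gt;/chat&lt;/code&gt; route and JSON field names are assumptions for illustration, so check the repo for the actual endpoints:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;import json
import urllib.request

def build_request(base_url, text):
    # POST the user's message as JSON to the MCP server
    data = json.dumps({"text": text}).encode()
    return urllib.request.Request(
        base_url + "/chat",
        data=data,
        headers={"Content-Type": "application/json"},
    )

def send_message(base_url, text):
    with urllib.request.urlopen(build_request(base_url, text)) as resp:
        return json.loads(resp.read())["response"]

# e.g. send_message("http://localhost:8000", "Summarize the added pages")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;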

&lt;p&gt;All of this happens &lt;strong&gt;locally&lt;/strong&gt;. No external APIs, no cloud dependencies — giving you full control over your data and AI interaction.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step-by-Step: Setting Up Your Own ContextChat&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;One of the biggest advantages of ContextChat is how easy it is to get started. In just a few steps, you can have a private, local AI chat app running on your own machine.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Prerequisites&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A computer with enough RAM to run a local LLM (8 GB or more is recommended for 7B-parameter models)&lt;/li&gt;
&lt;li&gt;Python 3.9 or newer installed&lt;/li&gt;
&lt;li&gt;Basic familiarity with terminal or command prompt&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; The project is primarily tested on Mac and Linux. Windows works, but some steps may vary slightly depending on your setup.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 1: Install Ollama&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Ollama handles running the local LLM efficiently. To install it, follow the official instructions: &lt;a href="https://ollama.com/download" rel="noopener noreferrer"&gt;https://ollama.com/download&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For Linux users, here’s a quick example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://ollama.com/install.sh | sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once installed, start the Ollama service and pull your preferred model (e.g., Mistral or LLaMA):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama serve
ollama pull mistral
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;ollama serve&lt;/code&gt; command starts the Ollama service, making the LLM available for local apps like ContextChat.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 2: Set Up the MCP Server&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The MCP Server handles context gathering, prompt construction, and communication with Ollama.&lt;/p&gt;

&lt;p&gt;Navigate to the &lt;code&gt;mcp_server&lt;/code&gt; directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;contextchat/mcp_server
pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
uvicorn main:app &lt;span class="nt"&gt;--reload&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;For Windows users:&lt;/strong&gt; You can use Command Prompt or PowerShell for the same commands. If &lt;code&gt;uvicorn&lt;/code&gt; isn't recognized, ensure your Python Scripts folder is added to your PATH.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 3: Run the GUI Chat App&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The chat interface lives in the &lt;code&gt;gui_app&lt;/code&gt; folder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;contextchat/gui_app
pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
python app.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;For Windows users:&lt;/strong&gt; Same commands apply in Command Prompt or PowerShell.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Troubleshooting Tips&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Make sure &lt;code&gt;ollama serve&lt;/code&gt; is running before starting ContextChat.&lt;/li&gt;
&lt;li&gt;If you run into missing package errors, double-check that all &lt;code&gt;requirements.txt&lt;/code&gt; dependencies are installed.&lt;/li&gt;
&lt;li&gt;Windows users: You may need to adjust environment variables or use &lt;code&gt;python&lt;/code&gt; vs. &lt;code&gt;python3&lt;/code&gt; depending on your setup.&lt;/li&gt;
&lt;li&gt;The project is still evolving, so Windows-specific bugs may occur.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;The ContextChat Approach: Local, Dynamic Knowledge&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; enhances your AI assistant by injecting real-time context from web pages you choose — entirely on your device.&lt;/p&gt;

&lt;p&gt;Here’s how it works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You add URLs via the chat interface&lt;/li&gt;
&lt;li&gt;The MCP server fetches and extracts text content from those pages&lt;/li&gt;
&lt;li&gt;When you send a message, the extracted content is combined with your prompt&lt;/li&gt;
&lt;li&gt;This richer, context-aware prompt goes to your local LLM via Ollama&lt;/li&gt;
&lt;li&gt;The response reflects both your query and the added knowledge&lt;/li&gt;
&lt;/ol&gt;
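
&lt;p&gt;A bare-bones version of step 2 can be built with the Python standard library alone. This is a sketch, not the project's actual extractor, and it keeps script and style text that a real crawler would filter out:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text chunks from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

def fetch_page_text(url):
    # Download the page and strip its markup down to plain text
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The extracted text is what gets prepended to your prompt before it reaches the model.&lt;/p&gt;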

&lt;p&gt;All of this happens &lt;strong&gt;without your data or browsing activity leaving your machine&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Why It Matters&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Get AI responses tailored to your chosen information sources&lt;/li&gt;
&lt;li&gt;Explore new ways of making LLMs truly useful in your workflow&lt;/li&gt;
&lt;li&gt;Maintain complete privacy and control — no third-party servers involved&lt;/li&gt;
&lt;li&gt;Pave the way for more advanced local AI use cases, like document summarization or research assistants&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Future Possibilities&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This system lays the groundwork for even more powerful features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ingesting PDF or text documents as context&lt;/li&gt;
&lt;li&gt;More advanced web crawlers that handle JavaScript-heavy pages&lt;/li&gt;
&lt;li&gt;Real-time context updates during conversations&lt;/li&gt;
&lt;li&gt;Smarter context filtering to avoid overwhelming the LLM&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Future Development &amp;amp; How You Can Contribute&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;&lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;ContextChat&lt;/a&gt; is just getting started. While the current version offers a fully local, privacy-respecting AI chat experience with web context, there’s plenty of room to grow.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Here’s what’s planned — and how you can be part of it.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Planned Features&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The following improvements are on the roadmap:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Show Added URLs in the GUI:&lt;/strong&gt; So you can see, manage, and review your context sources at a glance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reset Context with One Click:&lt;/strong&gt; A simple button to clear conversation history and added URLs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Save and Load Chat History:&lt;/strong&gt; Preserve your conversations across sessions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Visual Theme Improvements:&lt;/strong&gt; A more polished, user-friendly interface with better layouts and fonts.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Streaming AI Responses:&lt;/strong&gt; See responses appear in real-time for a more natural chat feel.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Modern GUI Options:&lt;/strong&gt; Exploring frameworks like Flet or PyQt to enhance the desktop app experience.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Document Ingestion:&lt;/strong&gt; Add PDFs or text files as context, not just web pages.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Advanced Web Crawler:&lt;/strong&gt; Better handling of complex websites, including JavaScript-rendered content.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Standalone Desktop Builds:&lt;/strong&gt; Easy installers for Mac and Linux without requiring manual setup.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;How You Can Contribute&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; is open source — designed to evolve with community feedback and contributions.&lt;/p&gt;

&lt;p&gt;Ways to get involved:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fork the project and experiment with your own improvements&lt;/li&gt;
&lt;li&gt;Submit pull requests for new features or bug fixes&lt;/li&gt;
&lt;li&gt;Report issues or suggest ideas on the GitHub repository&lt;/li&gt;
&lt;li&gt;Share feedback if you’re using it in your workflow&lt;/li&gt;
&lt;li&gt;Help test on Windows or other environments&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The vision is to build a practical, privacy-first AI toolkit that anyone can use and extend — without compromising control or security.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Conclusion&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Cloud AI tools offer convenience — but at the price of privacy and control. With open-source projects like ContextChat, you no longer have to make that trade-off.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;By running your own AI chat app locally, enhanced with web context you choose, you gain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Complete privacy — your prompts, data, and browsing never leave your device&lt;/li&gt;
&lt;li&gt;Total control over what models you use and how your assistant behaves&lt;/li&gt;
&lt;li&gt;The ability to experiment, extend, and build on an open-source foundation&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://github.com/NovaNestApps/ContextChat" rel="noopener noreferrer"&gt;&lt;strong&gt;ContextChat&lt;/strong&gt;&lt;/a&gt; is just the beginning. Whether you're a developer, researcher, or simply curious about private AI, this project shows what’s possible — and how easy it is to get started.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>python</category>
      <category>mcp</category>
    </item>
  </channel>
</rss>
