<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Gustavo M.</title>
    <description>The latest articles on DEV Community by Gustavo M. (@hola_gus).</description>
    <link>https://dev.to/hola_gus</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3912427%2F9261daac-d187-4fc0-9292-7b00611a95b7.jpg</url>
      <title>DEV Community: Gustavo M.</title>
      <link>https://dev.to/hola_gus</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/hola_gus"/>
    <language>en</language>
    <item>
      <title>How to Use MCP Servers With Ollama and Local LLMs</title>
      <dc:creator>Gustavo M.</dc:creator>
      <pubDate>Mon, 04 May 2026 22:16:00 +0000</pubDate>
      <link>https://dev.to/hola_gus/how-to-use-mcp-servers-with-ollama-and-local-llms-f3</link>
      <guid>https://dev.to/hola_gus/how-to-use-mcp-servers-with-ollama-and-local-llms-f3</guid>
      <description>&lt;p&gt;Ollama makes it easy to run open-weight models locally, but it does not ship an MCP client. The MCP protocol is handled at the client layer, not inside the LLM itself. To use MCP servers with a local Ollama model, you need a bridge that speaks MCP on one side and the Ollama API on the other. MCPFind indexes 832 servers in the &lt;a href="https://dev.to/categories/ai-ml"&gt;ai-ml category&lt;/a&gt;, averaging 114.27 stars per server, the highest average across all categories. Most of those servers are designed for cloud-hosted clients, but a meaningful subset runs well offline. This guide covers the bridge setup, model selection, and which server categories give the best results on local hardware.&lt;/p&gt;

&lt;h2&gt;Does Ollama Support MCP Natively?&lt;/h2&gt;

&lt;p&gt;Ollama does not implement the MCP protocol. It exposes a native REST API with an &lt;code&gt;/api/chat&lt;/code&gt; endpoint, plus an OpenAI-compatible &lt;code&gt;/v1/chat/completions&lt;/code&gt; endpoint; both accept a &lt;code&gt;tools&lt;/code&gt; parameter that mirrors OpenAI's function-calling format. When you pass tools to Ollama, it generates JSON tool calls in the response that your client code must parse and dispatch. That dispatch step is what an MCP client handles. MCP adds session management, capability negotiation, and a richer tool schema on top of the basic function-calling concept. Because Ollama's tool-calling format is compatible with the OpenAI spec, any MCP bridge that already supports OpenAI backends can usually target Ollama by pointing the base URL at &lt;code&gt;http://localhost:11434/v1&lt;/code&gt;. The gap is session lifecycle and streaming, not the tool schema format itself.&lt;/p&gt;
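
&lt;p&gt;You can verify this compatibility directly before adding a bridge. A minimal sketch, assuming Ollama is running on the default port and you have pulled a tool-capable model such as &lt;code&gt;qwen2.5:14b&lt;/code&gt; (the &lt;code&gt;get_weather&lt;/code&gt; tool here is a made-up example, not a real MCP server):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Send a tool definition to Ollama's OpenAI-compatible endpoint.
# A tool-capable model replies with a tool_calls array instead of text.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2.5:14b",
    "messages": [{"role": "user", "content": "What is the weather in Lisbon?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Parsing that &lt;code&gt;tool_calls&lt;/code&gt; array and dispatching it to the right MCP server is exactly the work a bridge takes over.&lt;/p&gt;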

&lt;h2&gt;How to Bridge Ollama With MCP Using MCPHost&lt;/h2&gt;

&lt;p&gt;MCPHost is a Go CLI that acts as an MCP client with a configurable LLM backend. Install it with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;go &lt;span class="nb"&gt;install &lt;/span&gt;github.com/mark3labs/mcphost@latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Configure it with a JSON file that lists your MCP servers and sets Ollama as the backend:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"filesystem"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"npx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"-y"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"@modelcontextprotocol/server-filesystem"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/Users/you/projects"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run it pointing at your local model:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;mcphost &lt;span class="nt"&gt;--config&lt;/span&gt; mcp-config.json &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--model&lt;/span&gt; ollama:qwen2.5:14b &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--ollama-url&lt;/span&gt; http://localhost:11434
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;MCPHost starts each server listed in the config, negotiates capabilities, and passes available tools to the model on each turn. The model's tool call responses are routed back to the correct MCP server. Use a model that supports tool calling: Qwen2.5, Llama 3.3, Mistral Nemo, and Gemma 3 are reliable options. Smaller quantizations (Q4_K_M and below) sometimes produce malformed JSON tool calls, so test before deploying.&lt;/p&gt;
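
&lt;p&gt;Before relying on a model in MCPHost, it is worth confirming that the exact tag you plan to run advertises tool support. A quick check, assuming &lt;code&gt;qwen2.5:14b&lt;/code&gt; (recent Ollama builds list a Capabilities section in &lt;code&gt;ollama show&lt;/code&gt;; on older builds, check the model card on the Ollama library instead):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Pull the model, then inspect its metadata. Tool-capable models
# should list "tools" under Capabilities.
ollama pull qwen2.5:14b
ollama show qwen2.5:14b
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;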

&lt;h2&gt;Which MCP Servers Work Best With Local LLMs?&lt;/h2&gt;

&lt;p&gt;Not all servers are equally useful offline. We grouped servers from the &lt;a href="https://dev.to/categories/devtools"&gt;devtools category&lt;/a&gt; and the ai-ml category by how much benefit they provide in a local context. Filesystem and code tools perform well because they do not need an external API; the server reads files locally and returns results straight to the model. The official filesystem, Git, and SQLite reference servers from Anthropic all fit this pattern. Search and web browsing servers still make external HTTP requests, but the LLM component runs locally, so they work as long as you have internet access. The highest-friction category is cloud services: servers for AWS, GitHub, and similar platforms require valid API credentials regardless of whether the LLM is local or remote. For pure offline use, stick to filesystem, database, and local code tools. The &lt;a href="https://dev.to/categories/ai-ml"&gt;ai-ml category on MCPFind&lt;/a&gt; includes several model management servers worth exploring if you are running a multi-model setup.&lt;/p&gt;
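
&lt;p&gt;As a sketch of that offline-first advice, here is a config that pairs the official filesystem server with the SQLite reference server (the paths are placeholders; the SQLite server is the Python reference implementation, launched here with &lt;code&gt;uvx&lt;/code&gt;):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/projects"]
    },
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/Users/you/data/local.db"]
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Neither server makes a network request once the packages are installed locally, so this pair keeps working with networking disabled entirely.&lt;/p&gt;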

&lt;h2&gt;Limitations and When to Use a Cloud Client Instead&lt;/h2&gt;

&lt;p&gt;Three constraints matter when running MCP with Ollama. First, tool-calling reliability: open-weight models produce malformed tool calls more often than frontier models. Complex tool schemas with nested objects or many optional fields are particularly prone to errors, so simplify your tool definitions when building for local use. Second, context length: long MCP tool responses eat into the model's context window. A server that returns a 10,000-token file listing may cause later turns to lose earlier context, so truncate large responses at the server level before returning them. Third, latency: each tool call adds an extra inference pass (one to emit the call, another to process the result), so multi-step agentic tasks accumulate latency quickly. On consumer hardware, expect 2-10 seconds per tool call depending on model size and quantization. If your workflow chains many tool calls or long responses, a Claude-backed client with a subset of the same MCP servers will be faster. Start with &lt;a href="https://dev.to/blog/what-is-mcp"&gt;what MCP is&lt;/a&gt; if you want the protocol background before optimizing your setup.&lt;/p&gt;

&lt;h2&gt;Frequently Asked Questions&lt;/h2&gt;

&lt;h3&gt;Does Ollama support MCP natively?&lt;/h3&gt;

&lt;p&gt;No. Ollama exposes an OpenAI-compatible API with tool-calling support, but it does not implement the MCP protocol. You need a bridge like MCPHost to connect MCP servers to Ollama.&lt;/p&gt;

&lt;h3&gt;Which Ollama models support tool calling?&lt;/h3&gt;

&lt;p&gt;Qwen2.5, Llama 3.1/3.2/3.3, Mistral Nemo, Command-R, and Gemma 3 all support tool calling through Ollama's API. Not all quantizations behave identically, so test with the exact model tag you plan to deploy.&lt;/p&gt;

&lt;h3&gt;Can I run MCP servers that require internet access with Ollama?&lt;/h3&gt;

&lt;p&gt;Yes. The local model and the MCP server are separate processes. A server that fetches web data or calls an API will still make those external requests; only the LLM component runs locally.&lt;/p&gt;

&lt;h3&gt;Is MCPHost the only way to use MCP with Ollama?&lt;/h3&gt;

&lt;p&gt;No. Any MCP client that supports configurable backends works, including Continue.dev and custom wrappers built with the official MCP SDK. MCPHost is the simplest standalone option for CLI workflows.&lt;/p&gt;




&lt;p&gt;For background on how MCP compares to other tool-calling approaches, see &lt;a href="https://dev.to/blog/mcp-vs-function-calling"&gt;MCP vs function calling&lt;/a&gt;. If you want to build your own MCP server to pair with a local model, the &lt;a href="https://dev.to/blog/build-mcp-server-typescript"&gt;TypeScript MCP server guide&lt;/a&gt; covers the full setup.&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>ai</category>
      <category>mcp</category>
    </item>
    <item>
      <title>Set Up Your First MCP Server in Claude Desktop (Step-by-Step 2026 Guide)</title>
      <dc:creator>Gustavo M.</dc:creator>
      <pubDate>Mon, 04 May 2026 16:17:16 +0000</pubDate>
      <link>https://dev.to/hola_gus/set-up-your-first-mcp-server-in-claude-desktop-step-by-step-2026-guide-28nk</link>
      <guid>https://dev.to/hola_gus/set-up-your-first-mcp-server-in-claude-desktop-step-by-step-2026-guide-28nk</guid>
      <description>&lt;p&gt;If you have Claude Desktop installed and want to make it dramatically more useful, adding MCP servers is the fastest path. You do not need to write code or understand how the protocol works under the hood.&lt;/p&gt;

&lt;p&gt;This guide covers picking your first server, adding it to Claude Desktop, and confirming it works. The whole process takes about 15 minutes. If you want to understand what MCP servers are before diving in, read &lt;a href="https://dev.to/blog/what-are-mcp-servers"&gt;MCP Servers Explained&lt;/a&gt; first. For background on the protocol itself, see &lt;a href="https://dev.to/blog/what-is-mcp"&gt;What Is the Model Context Protocol&lt;/a&gt;. If you use Cursor instead of Claude Desktop, &lt;a href="https://dev.to/blog/how-to-use-mcp-with-cursor"&gt;how to configure MCP in Cursor&lt;/a&gt; covers the same process with Cursor-specific config details.&lt;/p&gt;

&lt;h2&gt;What You Need Before You Start&lt;/h2&gt;

&lt;p&gt;You need Claude Desktop installed and a terminal available on your machine. That is it.&lt;/p&gt;

&lt;p&gt;Claude Desktop is the free desktop app from Anthropic available for macOS and Windows. If you do not have it yet, download it from claude.ai/download. The terminal is built into every Mac (search "Terminal" in Spotlight) and available on Windows via PowerShell or Command Prompt. You will use the terminal to install server packages, but the commands are simple copy-paste steps - nothing to memorize.&lt;/p&gt;
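
&lt;p&gt;Most beginner-friendly servers ship as npm packages launched with &lt;code&gt;npx&lt;/code&gt;, so one optional check before you start: confirm Node.js is installed. In the terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Both commands should print a version number. If either fails,
# install Node.js from nodejs.org before continuing.
node --version
npx --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;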

&lt;h2&gt;Step 1 - Pick Your First MCP Server&lt;/h2&gt;

&lt;p&gt;Pick one server to start. Trying to add five at once makes it harder to debug if something goes wrong. The goal is to learn how to use MCP with Claude on a single working example, then expand from there.&lt;/p&gt;

&lt;p&gt;A good first choice for non-developers is a search server, which lets Claude pull live web results into your conversation. MCPFind indexes 416 servers in the &lt;a href="https://dev.to/categories/search"&gt;search category&lt;/a&gt;. Another strong beginner pick is a filesystem server, which gives Claude read access to a folder on your computer - useful for asking questions about documents you have saved locally. For something more powerful, the &lt;a href="https://dev.to/servers/com-supabase-mcp"&gt;Supabase MCP server&lt;/a&gt; has 2,556 GitHub stars and solid documentation. Browse &lt;a href="https://dev.to/categories/devtools"&gt;/categories/devtools&lt;/a&gt; if you want to see the full range of developer-focused options, or &lt;a href="https://dev.to/categories/databases"&gt;/categories/databases&lt;/a&gt; for data-focused servers.&lt;/p&gt;

&lt;h2&gt;Step 2 - Find the Install Config on MCPFind&lt;/h2&gt;

&lt;p&gt;Every server page on MCPFind shows a ready-to-copy configuration block for Claude Desktop.&lt;/p&gt;

&lt;p&gt;Navigate to the server you chose and look for the "Claude Desktop" tab in the install section. You will see a JSON snippet that looks something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"server-name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"npx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"-y"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"@scope/server-package"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy that entire block. You will paste it into your config file in the next step.&lt;/p&gt;

&lt;h2&gt;Step 3 - Edit the Claude Desktop Config File&lt;/h2&gt;

&lt;p&gt;Open your Claude Desktop config file in any text editor. On macOS, the path is &lt;code&gt;~/Library/Application Support/Claude/claude_desktop_config.json&lt;/code&gt;. On Windows, it is &lt;code&gt;%APPDATA%\Claude\claude_desktop_config.json&lt;/code&gt;.&lt;/p&gt;
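
&lt;p&gt;If you prefer, you can open the file straight from the terminal (macOS first, then Windows PowerShell):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# macOS: open the config in TextEdit
open -e "$HOME/Library/Application Support/Claude/claude_desktop_config.json"

# Windows PowerShell: open the config in Notepad
notepad "$env:APPDATA\Claude\claude_desktop_config.json"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;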

&lt;p&gt;If the file does not exist yet, create it. Paste the JSON snippet you copied from MCPFind. If you already have other servers in your config, add the new server as another entry inside the existing &lt;code&gt;mcpServers&lt;/code&gt; object - do not replace the whole file. Save the file when you are done.&lt;/p&gt;
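
&lt;p&gt;For example, a config that already had a filesystem server and gains a search server ends up looking like this (both the folder path and the search package name below are placeholders for whatever MCPFind gives you):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Documents"]
    },
    "search": {
      "command": "npx",
      "args": ["-y", "@scope/search-server-package"]
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;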

&lt;h2&gt;Step 4 - Restart Claude Desktop and Test&lt;/h2&gt;

&lt;p&gt;Close Claude Desktop completely and reopen it. The app reads the config file on startup, so a restart is required.&lt;/p&gt;

&lt;p&gt;Once it reopens, start a new conversation and ask Claude to use the server you just added. For a search server, try: "Search the web for the latest MCP server releases." For a filesystem server, try: "List the files in my Documents folder." If Claude uses the tool and returns a result, the connection is working. If you see an error, double-check that your JSON is valid - a missing comma or bracket is the most common issue.&lt;/p&gt;
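
&lt;p&gt;If the JSON is valid but the server still will not connect, Claude Desktop's MCP logs usually name the real problem. On macOS they typically live under &lt;code&gt;~/Library/Logs/Claude&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Follow the MCP logs while restarting the app; each server writes
# its own mcp-server-&lt;name&gt;.log next to the shared mcp.log.
tail -f ~/Library/Logs/Claude/mcp*.log
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;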

&lt;h2&gt;Frequently Asked Questions&lt;/h2&gt;

&lt;h3&gt;Does Claude Desktop support MCP out of the box?&lt;/h3&gt;

&lt;p&gt;Yes. Claude Desktop has built-in MCP support. You just need to edit its configuration file to add the servers you want. No extensions or plugins are required.&lt;/p&gt;

&lt;h3&gt;Can I run multiple MCP servers at the same time in Claude Desktop?&lt;/h3&gt;

&lt;p&gt;Yes. Claude Desktop supports multiple simultaneous MCP server connections. You add each server as a separate entry in the &lt;code&gt;mcpServers&lt;/code&gt; section of your config file, and Claude can use all of them in the same conversation.&lt;/p&gt;

&lt;h3&gt;What happens if an MCP server stops working?&lt;/h3&gt;

&lt;p&gt;Claude Desktop will show an error if it cannot connect to a configured server. Your other servers and Claude itself continue working normally. You can remove or fix the broken entry in your config file without affecting anything else.&lt;/p&gt;

&lt;h3&gt;Do MCP servers have access to my entire computer?&lt;/h3&gt;

&lt;p&gt;Only what the server is designed to access. A filesystem MCP server, for example, typically limits access to a specific directory you configure. Always read the server documentation to understand exactly what it can and cannot reach.&lt;/p&gt;

&lt;h3&gt;Where is the Claude Desktop config file located?&lt;/h3&gt;

&lt;p&gt;On macOS it is at &lt;code&gt;~/Library/Application Support/Claude/claude_desktop_config.json&lt;/code&gt;. On Windows it is at &lt;code&gt;%APPDATA%\Claude\claude_desktop_config.json&lt;/code&gt;. Create the file if it does not exist.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>mcp</category>
      <category>claude</category>
    </item>
  </channel>
</rss>
