Tobiloba Adedeji
Run a decentralized MCP application with Gaia

Decentralized AI is becoming a reality, allowing developers to run powerful AI agents on their own hardware and extend them with live data.

In this guide, we’ll show you how to run a decentralized application using a Gaia node with MCP (Model Context Protocol) – specifically, we’ll wire up a Gaia AI agent to a weather service. We’ll explain the basics of Gaia and MCP, walk through setting up a Gaia node with MCP support, configure a weather MCP server (using OpenWeather data), and then launch everything to get real-time weather responses from your AI.

By the end, you’ll have a local AI agent that can fetch live weather info on command – all using decentralized, open-source tech.


What Are Gaia and MCP? (And Why Decentralization Matters)

Gaia is a decentralized computing infrastructure that lets anyone deploy and run their own AI agent node.

Think of a Gaia node as your personal AI assistant running on your machine (or in the cloud) – it comes with a fine-tuned language model, optional private knowledge base, and a web/chat API. Gaia nodes can even serve as drop-in replacements for OpenAI’s API. Because each node is self-hosted and networked, Gaia enables an open, decentralized AI network where individuals control their AI’s data, behavior, and how it serves others, rather than relying on a centralized provider.

MCP (Model Context Protocol) is an emerging open standard for connecting AI models to external tools and data. Introduced by Anthropic in late 2024, MCP acts like a “USB-C for AI” – one universal interface for many integrations. It’s a bi-directional, stateful protocol that allows an AI (or its host application) to call out to external services in a standardized way. Each external capability is provided by an MCP server advertising certain “tools” or functions it can perform (for example: fetch weather data, run a database query, etc.). The AI’s host (in our case, the Gaia node) includes an MCP client that knows how to communicate with these servers. Instead of writing custom code for every API an AI might use, MCP lets us plug in new tools as easily as plugging in a device – greatly simplifying and standardizing AI integrations.
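To make this concrete, MCP messages are JSON-RPC 2.0 under the hood. Here is a rough sketch of what a tool call from an MCP client might look like on the wire – the tool name and arguments are hypothetical placeholders for a weather tool, not Gaia's actual internals:

```python
import json

# Sketch of an MCP "tools/call" request, as an MCP client (the host side)
# would serialize it. MCP uses JSON-RPC 2.0; "get_current_weather" and the
# "city" argument are illustrative names, not a real server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_current_weather",          # hypothetical tool name
        "arguments": {"city": "New York City"},  # hypothetical argument
    },
}

wire = json.dumps(request)
print(wire)
```

The server replies with a JSON-RPC result containing the tool's output, which the host then feeds back to the model as context.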

Why is decentralization important here? In a traditional setup, if your AI needs real-time info (weather, stock prices, etc.), you might rely on a specific platform’s plugins or API calls, often tied to a single company’s ecosystem. With Gaia and MCP, everything is open and configurable. You run the AI node, you choose or run the MCP servers for tools, and you aren’t locked into one vendor’s infrastructure. This means more control, privacy, and innovation – anyone can create new MCP-compliant tools and any AI client can use them.

Gaia’s vision is often described as “YouTube for knowledge and skills” where individuals can deploy and monetize AI agents with their own expertise. By leveraging MCP, those agents can safely access live data or perform actions without hardcoding each integration. Decentralized applications built on Gaia + MCP empower developers to create rich AI services that they fully own and control, while tapping into a growing ecosystem of open tools.

Setting Up Gaia with MCP Support

First, let’s get a Gaia node up and running on your machine. The experimental Gaia node version we’ll use includes built-in MCP client support, which allows it to communicate with MCP servers. (Make sure you have a GPU or Apple Silicon machine that meets the requirements – Gaia runs a local large language model. See Gaia’s docs for system requirements.)

1. Install (or update) the Gaia node software: If you haven’t installed Gaia before, the easiest way is via the official install script. Run this in your terminal (macOS/Linux or WSL on Windows):

curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash  

The above command downloads and runs Gaia’s installer script. It will fetch the latest GaiaNet node release and set up the gaianet CLI tool. If you already have Gaia installed, you can run the same command to update to the latest version. After installation, make sure to add Gaia to your PATH as instructed (the installer prints a source ... command; run that or add it to your shell profile).

2. Initialize your Gaia node: Once the CLI is installed, initialize the node with default settings:

gaianet init

On first run, this will download a default LLM model (which can be several GB, so be patient) and set up a configuration file in ~/gaianet/config.json. By default, Gaia uses a fine-tuned Llama 3.2 model and a small knowledge base (e.g. about Paris) to start. You can later customize the model or knowledge, but for now the default is fine.

3. Verify MCP client support: The latest Gaia versions (v0.4.27+ as of 2025) include MCP client functionality. There isn’t a separate installation for MCP – it’s built into the node, though it stays inactive until configured. The installation above also creates an mcp_config.toml file, which is where you register MCP servers.

Configuring the Weather MCP Server (mcp_config.toml)

We want our Gaia node to use a weather service via MCP. To do that, we must tell Gaia’s MCP client how to connect to a weather MCP server. Gaia reads its MCP configuration from mcp_config.toml, which lists the available MCP servers and how to reach them.

Go to your Gaia config directory (likely ~/gaianet/ in your home folder), where you’ll find mcp_config.toml.

You can use the command below to open the file:

nano mcp_config.toml

Add the following server entry to the file:

[[mcp.server]]
name   = "gaia-weather"
enable = true
host   = "127.0.0.1"
port   = 8002

Let’s break down what this does:

  • [[mcp.server]] defines one server entry (the double brackets denote an array of tables in TOML), so you can list several MCP servers in the same file.
  • name = "gaia-weather" is the identifier the node uses to refer to the weather service.
  • enable = true turns the entry on; you can set it to false later to disable the tool without deleting the config.
  • host and port tell Gaia’s MCP client where to reach the server – here, a process running locally on 127.0.0.1:8002. We’ll start that server shortly.

How to get your OpenWeather API key: If you don’t have one, you can get a free API key from OpenWeather.

Just sign up on their site – once you register, they’ll send you an API key (APPID) via email, and it also appears on your account page. The free tier allows a generous number of weather queries for personal use. Copy that key; you’ll pass it to the MCP server through an environment variable in the next step.
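If you want to sanity-check your key before wiring it into Gaia, you can hit OpenWeather’s current-weather endpoint directly. This small sketch just builds the request URL the weather server would use behind the scenes (the placeholder key is yours to substitute):

```python
from urllib.parse import urlencode

# Placeholder – substitute your real OpenWeather API key.
API_KEY = "<YOUR_OPENWEATHER_API_KEY>"

def current_weather_url(city: str, api_key: str) -> str:
    """Build the request URL for OpenWeather's current-weather endpoint."""
    base = "https://api.openweathermap.org/data/2.5/weather"
    query = urlencode({"q": city, "appid": api_key, "units": "metric"})
    return f"{base}?{query}"

print(current_weather_url("New York City", API_KEY))
```

Opening the printed URL in a browser (with a real key) returns a JSON payload with temperature, conditions, and more.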

Now that Gaia is configured to know about the "gaia-weather" tool and where to reach its server, we need to actually run that server.

Next, set the environment variable the MCP server needs:

export OPENWEATHERMAP_API_KEY=Your_KEY

Then start the weather MCP server:

gaia-weather-mcp-server-sse

Launching Gaia and Getting Weather Responses

Start the Gaia node and see it interact with the weather MCP service.

1. Start (or restart) your Gaia node: If this is a fresh setup, just run:

gaianet start

If your node was already running from earlier, stop it (Ctrl+C or gaianet stop) and start it again to load the new MCP config. When Gaia starts, it will initialize the AI model and related services. A successful start will print a message with a local or public URL where you can access the Gaia node’s web interface. It typically looks like Node is live at: https://<random-subdomain>.gaia.domains along with a local URL (http://localhost: etc.).

2. Open the Gaia chat UI: Visit the URL printed by the start command in your browser. You should see a web chat interface for your Gaia node (a simple chatbot UI). This UI lets you chat with the AI agent directly.
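Since a Gaia node can serve as a drop-in replacement for OpenAI’s API, you can also talk to it programmatically instead of through the web UI. This is a hedged sketch of the request you would POST to the node’s chat-completions endpoint – the base URL and model name here are assumptions; use the URL your node printed on start and the model from your config.json:

```python
import json

# Hypothetical local endpoint – replace with the URL your node prints on start.
GAIA_BASE_URL = "http://localhost:8080/v1"

# OpenAI-style chat-completions request body.
payload = {
    "model": "default",  # model name depends on your node's config.json
    "messages": [
        {
            "role": "user",
            "content": "What's the weather like in New York City right now?",
        }
    ],
}

# You would POST this JSON to f"{GAIA_BASE_URL}/chat/completions".
print(json.dumps(payload, indent=2))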

3. Ask a weather question: In the chat, try typing something like:

“What’s the weather like in New York City right now?”

When you send this, the Gaia node’s LLM will process your request. Because we’ve enabled the weather tool, the AI has the option to use it. Under the hood, one of two things will happen:

  • Ideally, the model recognizes that this query asks for current weather, and (thanks to Gaia’s function-calling setup) it produces a structured response indicating it wants to use the weather tool. For example, the model might generate JSON like {"tool": "weather", "params": {"city": "New York City"}} (the exact format can vary, but the idea is that it requests the weather tool for New York City).
  • Gaia’s MCP client sees this request, calls the gaia-weather server over MCP, and the server fetches the current conditions from OpenWeather. The client then hands the returned data to the LLM as context, and the LLM generates the final answer for you.
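The dispatch step above can be sketched as a tiny router: the host parses the model’s structured output and forwards it to the matching tool. The format and the stubbed weather lookup below are purely illustrative, not Gaia’s actual internals:

```python
import json

def fetch_weather(city: str) -> dict:
    """Stand-in for the MCP round trip to the gaia-weather server."""
    # A real implementation would call the MCP server, which queries OpenWeather.
    return {"city": city, "temp_c": 27, "conditions": "clear skies"}

# Registry mapping tool names to handlers (illustrative).
TOOLS = {"weather": lambda params: fetch_weather(params["city"])}

def dispatch(model_output: str) -> dict:
    """Parse the model's tool request and call the matching tool."""
    req = json.loads(model_output)
    return TOOLS[req["tool"]](req["params"])

result = dispatch('{"tool": "weather", "params": {"city": "New York City"}}')
print(result)
```

The tool’s result is what gets injected back into the model’s context so it can phrase the final answer.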

After a short moment, you should see the AI respond with the live weather info. For example, it might say:

“Currently, in New York City it’s 27°C with clear skies.”

Your AI agent just pulled real-time data from an external API in a decentralized manner! The key point is that no centralized service (like ChatGPT plugins or a third-party bot) was involved – your Gaia node ran the query through a tool you set up yourself. The entire flow was enabled by the open MCP standard.

[Screenshot: the Gaia chat UI showing the live weather answer for New York City]

Wrapping Up: The Potential of Gaia + MCP

In this tutorial, we successfully connected a Gaia decentralized AI node with a live weather service using MCP – creating a proof-of-concept decentralized application. This simple weather bot demo showcases a much larger idea: with protocols like MCP, AI agents can be modular and extensible, pulling in real-time data or performing actions, all while you retain control. Gaia provided the backbone – a self-hosted AI agent – and MCP provided the bridge to external knowledge.

The potential here is huge. You could similarly plug in an MCP server for stock market data, home IoT device control, database queries, or any custom tool. As the MCP ecosystem grows (with many open-source servers available for everything from GitHub issues to Spotify control), your Gaia node can become a hub of decentralized intelligence, combining your personal or community knowledge base with worldwide real-time information. Because it’s decentralized, you are not dependent on any single cloud service or vendor approvals – you mix and match tools freely. It’s the spirit of the open web, brought to AI: anyone can build and share a tool, and any AI agent can use it.

As more developers experiment with Gaia and MCP, we may see autonomous agent networks that coordinate via standardized protocols, marketplaces of MCP services, and personal AI agents that truly become “plug-and-play” with new skills.
