DEV Community

xbill for Google Developer Experts

Posted on • Originally published at xbill999.Medium on

MCP Development with Python and Azure Fabric

Leveraging the Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Python, developed in a local environment and deployed to Azure Fabric.

What is Gemini CLI?

The Gemini CLI is an open-source, terminal-based AI agent from Google that allows developers to interact directly with Gemini models, such as Gemini 2.5 Pro, for coding, content creation, and workflow automation. It supports file operations, shell commands, and connects to external tools via the Model Context Protocol (MCP).

The full details on Gemini CLI are available here:

Build, debug & deploy with AI

Azure Fabric

Microsoft Fabric is a comprehensive, AI-powered cloud platform that unifies data engineering, data warehousing, data science, and business intelligence (using Power BI) into a single SaaS solution, built on the OneLake storage system. It simplifies data management by reducing reliance on disparate, siloed tools.

More details are here:

What is Microsoft Fabric - Microsoft Fabric

Why would I want Gemini CLI with Azure? Isn’t that a Google Thing?

Yes: Gemini CLI leverages the Google Cloud console and Gemini models, but it is also open source and platform agnostic. Many applications are already cross-cloud, so this enables familiar tools to run natively on Microsoft Azure.

Node Version Management

Gemini CLI needs a consistent, up-to-date version of Node.js. The nvm command can be used to set up a standard Node environment:

GitHub - nvm-sh/nvm: Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions

Gemini CLI Installation

You can then install the Gemini CLI:

npm install -g @google/gemini-cli

You will see log messages similar to:

azureuser@azure-new:~/gemini-cli-azure$ npm install -g @google/gemini-cli
npm warn deprecated prebuild-install@7.1.3: No longer maintained. Please contact the author of the relevant native addon; alternatives are available.
npm warn deprecated node-domexception@1.0.0: Use your platform's native DOMException instead
npm warn deprecated glob@10.5.0: Old versions of glob are not supported, and contain widely publicized security vulnerabilities, which have been fixed in the current version. Please update. Support for old versions may be purchased (at exorbitant rates) by contacting i@izs.me

Testing the Gemini CLI Environment

Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with an API key or your Google Account:

gemini

Authentication

Several authentication options are available. To use an existing Code Assist license, authenticate with a Google Account:

> /auth                                                                                                                                                        
▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄▄
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ │
│ ? Get started │
│ │
│ How would you like to authenticate for this project? │
│ │
│ ● 1. Login with Google │
│ 2. Use Gemini API Key │
│ 3. Vertex AI │
│ │
│ (Use Enter to select) │
│ │
│ Terms of Services and Privacy Notice for Gemini CLI │
│ │
│ https://geminicli.com/docs/resources/tos-privacy/ │
│ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

Then set GOOGLE_CLOUD_PROJECT to a valid project configured in the Google Cloud console:

~ $ export GOOGLE_CLOUD_PROJECT=comglitn
~ $

Other options include a Gemini API key, which can be generated directly from the Google Cloud console.

Installing Google Cloud Tools

To simplify working with Google Cloud, install the Google Cloud SDK tools:

https://docs.cloud.google.com/sdk/docs/install-sdk

Once the installation is complete, you can verify the setup:

william@Azure:~$ gcloud auth list
  Credentialed Accounts
ACTIVE ACCOUNT
* xbill@glitnir.com

Installing an Azure-Customized GEMINI.md

A sample GitHub repo contains tools for working with Gemini CLI on Azure. This repo is available here:

git clone https://github.com/xbill9/gemini-cli-azure

A sample GEMINI.md customized for the Azure environment is provided in the repo:

This is a multi linux git repo hosted at:

github.com/xbill9/gemini-cli-azure

You are a cross platform developer working with 
Microsoft Azure and Google Cloud

You can use the Azure CLI :
https://learn.microsoft.com/en-us/cli/azure/install-azure-cli
https://learn.microsoft.com/en-us/cli/azure/
https://learn.microsoft.com/en-us/cli/azure/reference

https://learn.microsoft.com/en-us/cli/azure/install-azure-cli-linux?view=azure-cli-latest&pivots=apt

## Azure CLI Tools

You can use the Azure CLI to manage resources across Azure Storage, Virtual Machines, and other services.

- **List Resource Groups** : `az group list -o table`
- **List Storage Accounts** : `az storage account list -o table`
- **List Virtual Machines** : `az vm list -d -o table`

### Azure Update Script

- `azure-update`: This script is specifically for Azure Linux environments. It updates all packages and ensures necessary libraries are installed.

## Automation Scripts

This repository contains scripts for updating various Linux environments and tools:

- `linux-update`: Detects OS (Debian/Ubuntu/Azure Linux) and runs the corresponding update scripts.
- `azure-update`: Updates Azure Linux packages and installs necessary dependencies.
- `debian-update`: Updates Debian/Ubuntu packages and installs `git`.
- `gemini-update`: Updates the `@google/gemini-cli` via npm and checks versions of Node.js and Gemini.
- `nvm-update`: Installs NVM (Node Version Manager) and Node.js version 25.
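The linux-update dispatch logic described above can be sketched as a small shell function. This is a hypothetical illustration of the approach, not the repo's actual script; the distro IDs and mapping are assumptions:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: map an os-release ID to the matching update script,
# in the spirit of the repo's linux-update dispatcher.
detect_update_script() {
  case "$1" in
    debian|ubuntu) echo "debian-update" ;;
    azurelinux|mariner) echo "azure-update" ;;
    *) echo "unsupported" ;;
  esac
}

# Typical usage: read ID from /etc/os-release, then run the result:
#   . /etc/os-release && "./$(detect_update_script "$ID")"
```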

Python MCP Documentation

The official GitHub Repo provides samples and documentation for getting started:

GitHub - modelcontextprotocol/python-sdk: The official Python SDK for Model Context Protocol servers and clients

The most common MCP Python deployment path uses the FastMCP library:

Welcome to FastMCP - FastMCP

Where do I start?

The strategy for starting MCP development is an incremental, step-by-step approach.

First, the basic development environment is set up with the required system variables and a working Gemini CLI configuration.

Then, a minimal Hello World style Python MCP server is built with stdio transport. This server is validated with Gemini CLI in the local environment.

This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the Python MCP server both run in the same local environment.

Next, the basic MCP server is extended with Gemini CLI to add several new tools in standard Python code.

Setup the Basic Environment

At this point you should have a working Python interpreter and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:

cd ~
git clone https://github.com/xbill9/gemini-cli-azure

Then run init.sh from the cloned directory.

The script will attempt to determine your shell environment and set the correct variables:

cd gemini-cli-azure
source init.sh

If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:

cd gemini-cli-azure
source set_env.sh

Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env.sh script can be used to reset the environment if you time out.
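A small helper along these lines can sanity-check the environment before running the build scripts. This is an illustrative sketch, not part of the repo; the variable names PROJECT_ID and GOOGLE_CLOUD_PROJECT are taken from this article:

```python
import os

# Variables the build scripts rely on (names taken from this article).
REQUIRED_VARS = ["PROJECT_ID", "GOOGLE_CLOUD_PROJECT"]

def missing_vars(env=None):
    """Return the required variable names that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    for name in missing_vars():
        print(f"Missing: {name} (re-run 'source set_env.sh')")
```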

Hello World with HTTP Transport

One of the key features of the standard MCP libraries is that they abstract the various transport methods.

The high-level MCP tool implementation is the same regardless of the low-level transport channel the MCP client uses to connect to an MCP server.

The simplest transport the SDK supports is stdio (stdin/stdout), which connects to a locally running process. Both the MCP client and the MCP server must run in the same environment.
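For illustration, over stdio the client and server exchange newline-delimited JSON-RPC 2.0 messages. A tools/call request as the client would write it to the server's stdin might look like this sketch (message shape per the MCP specification; the tool name and arguments are examples from later in this article):

```python
import json

# An MCP "tools/call" request: one JSON-RPC 2.0 message per line
# written to the stdio server's stdin. Values are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "greet", "arguments": {"param": "Azure Fabric!"}},
}
wire_line = json.dumps(request)
print(wire_line)
```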

The HTTP transport allows the MCP Client and Server to be in the same environment or distributed over the Internet.

The connection over HTTP will look similar to this:

mcp.run(
    transport="http",
    host="0.0.0.0",
    port=port,
)

Running the Python Code

First, switch to the directory with the Python MCP sample code and run make:

xbill@penguin:~/gemini-cli-azure/mcp-fabric-python-azure$ make
pip install -r requirements.txt

You can validate the final result by checking the log messages:

Successfully installed azure-core-1.38.3 azure-identity-1.25.3 azure-mgmt-core-1.6.0 azure-mgmt-resource-25.0.0 flake8-7.3.0 isodate-0.7.2 mccabe-0.7.0 msal-1.35.1 msal-extensions-1.3.1 pycodestyle-2.14.0 pyflakes-3.4.0
python main.py
{"message": "No environment configuration found."}
{"message": "ManagedIdentityCredential will use IMDS"}
{"message": "Azure Credentials initialized for Fabric integration"}
{"message": "\ud83d\ude80 MCP server started on port 8080"}
{"message": "HTTP Request: GET https://pypi.org/pypi/fastmcp/json \"HTTP/1.1 200 OK\""}
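The {"message": ...} lines are structured JSON logs, which are easier for cloud log pipelines to parse than free-form text. A stdlib-only sketch of how such a formatter works (the deployed server uses the python-json-logger package; this minimal formatter is only an illustration of the idea):

```python
import io
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object per line."""
    def format(self, record):
        return json.dumps({"message": record.getMessage()})

# Wire the formatter to a handler and emit a sample record.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("mcp-demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("MCP server started on port 8080")
print(stream.getvalue().strip())
```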

Once you have validated that the server runs locally, exit with Ctrl+C.

Then run a deployment to the Azure container service (the Makefile targets Azure Container Apps):

azureuser@azure-new:~/gemini-cli-azure/mcp-fabric-python-azure$ make deploy
Ensuring Resource Group mcp-rg-fabric exists...
{
  "id": "/subscriptions/3db3ce66-50b6-4d11-91ef-5950cf4039ed/resourceGroups/mcp-rg-fabric",
  "location": "westus2",
  "managedBy": null,
  "name": "mcp-rg-fabric",
  "properties": {
    "provisioningState": "Succeeded"
  },
  "tags": null,
  "type": "Microsoft.Resources/resourceGroups"
}
Checking if ACR mcpacrazurenewv2 exists...
Building the Docker image in Azure Container Registry...
...
  "resourceGroup": "mcp-rg-fabric",
  "systemData": {
    "createdAt": "2026-03-17T16:37:35.3858326",
    "createdBy": "xbill@glitnir.com",
    "createdByType": "User",
    "lastModifiedAt": "2026-03-17T18:09:58.8498981",
    "lastModifiedBy": "xbill@glitnir.com",
    "lastModifiedByType": "User"
  },
  "type": "Microsoft.App/containerApps"
}
Deployment complete. Visit: https://mcp-fabric-server.mangorock-a40878f0.westus2.azurecontainerapps.io


Gemini CLI settings.json

The Gemini CLI settings.json has an entry pointing at the deployed Python MCP server:

{
  "mcpServers": {
    "azure-fabric-python": {
      "httpUrl": "https://mcp-fabric-server.mangorock-a40878f0.westus2.azurecontainerapps.io/mcp"
    }
  }
}

Validation with Gemini CLI

Leave the MCP server window running and start a new shell. Restart Gemini CLI to validate the MCP connection over HTTP to the Python code. The full Gemini CLI session will start:

azureuser@azure-new:~/gemini-cli-azure/mcp-fabric-python-azure$ gemini

  ▝▜▄ Gemini CLI v0.33.2
    ▝▜▄
   ▗▟▀ Logged in with Google /auth
  ▝▀ Gemini Code Assist Standard /upgrade

                                                                                                                              ? for shortcuts 
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 shift+tab to accept edits 3 GEMINI.md files | 1 MCP server
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 > 

And you can then connect to the MCP Server over HTTP:

> greet Azure Fabric!
✦ I will call the mcp_azure-fabric-python_greet tool with the parameter "Azure Fabric!".

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Action Required │
│ │
│ ? greet (azure-fabric-python MCP Server) {"param":"Azure Fabric!"} │
│ │
│ MCP Server: azure-fabric-python │
│ Tool: greet │
│ │
│ MCP Tool Details: │
│ (press Ctrl+O to expand MCP tool details) │
│ Allow execution of MCP tool "greet" from server "azure-fabric-python"? │
│ │
│ 1. Allow once │
│ 2. Allow tool for this session │
│ 3. Allow all server tools for this session │
│ ● 4. Allow tool for all future sessions │
│ 5. No, suggest changes (esc) │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

✦ I will call the mcp_azure-fabric-python_greet tool with the parameter "Azure Fabric!".

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (azure-fabric-python MCP Server) {"param":"Azure Fabric!"} │
│ │
│ Azure Fabric! │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ Azure Fabric!


Project Review

Finally, you can use Gemini CLI to review the project:

✦ This project is a Python-based Model Context Protocol (MCP) server specifically optimized for Microsoft Fabric and Azure Container Apps
  (ACA). It serves as a backend for Fabric custom workloads, allowing LLMs to interact with Azure and Fabric resources securely.

  Key Components & Functionality

   * FastMCP Server: Implemented in main.py using the FastMCP framework with HTTP transport (running on port 8080 by default).
   * Azure Integration: Uses azure-identity (DefaultAzureCredential) to provide seamless authentication for Fabric and Azure services (e.g.,
     OneLake, Fabric APIs).
   * Cloud Native Deployment:
       * Dockerfile: Provides a multi-stage build using Python 3.13.
       * Makefile: Contains automated workflows for local installation (make install), Azure login (make az-login), and full deployment to ACA
         (make deploy).
   * Built-in Capabilities:
       * Health Check: A /health endpoint for monitoring service status.
       * Greeting Tool: A sample greet tool demonstrating how to extend the server with custom functions.
   * Logging: Features structured JSON logging via python-json-logger for better observability in cloud environments.

  In short, it provides a "ready-to-deploy" template for building custom AI-driven extensions for the Microsoft Fabric ecosystem.


Summary

The strategy of using Python for MCP development with Gemini CLI was validated with an incremental, step-by-step approach.

A minimal HTTP-transport MCP server was started from Python source code and validated with Gemini CLI running as an MCP client in the same local environment. Then the solution was deployed remotely to the Azure Fabric system and validated with the local installation of Gemini CLI.

This approach can be extended to more complex deployments using other MCP transports and Cloud based options.
