DEV Community

xbill for Google Developer Experts

Posted on • Originally published at xbill999.Medium on

Building ADK Agents with Azure Cloud Shell

This article covers installing and running Gemini CLI cross-cloud. Azure Cloud Shell provides a basic Linux environment that can support running Gemini CLI.

Azure Cloud Shell

Azure Cloud Shell is a browser-based, authenticated terminal for managing Azure resources, accessible directly through the Azure portal, shell.azure.com, or the mobile app. It provides a preconfigured environment with Bash or PowerShell, eliminating local installation of tools. It runs on a temporary, secure, Linux-based host.

The full background information on the Cloud Shell is here:

https://azure.microsoft.com/en-us/get-started/azure-portal/cloud-shell

More information is available here:

Get started with Azure Cloud Shell

Why would I want Gemini CLI on Azure? Isn't that a Google thing?

Yes, Gemini CLI is built around the Google Cloud console and Gemini models, but it is also open source and platform agnostic. Many applications are already cross-cloud, so this lets familiar tools run natively on Microsoft Azure.

Activating Azure Cloud Shell

Azure Cloud Shell comes preinstalled in every Azure account.

To access the shell, click the terminal icon in the upper right corner of the console:

Then activate the full shell:

Note: disk space in the environment is limited to 5 GB, but this is sufficient for Gemini CLI, Google Cloud tools, and some customization.

What is this project trying to do?

This project aims to configure an ADK agent to use an Azure Foundry model. The ADK provides agent support in standard Python code.

LiteLLM

LiteLLM is an open-source AI gateway and Python SDK that provides a unified OpenAI-compatible interface to over 100 LLMs (Anthropic, Gemini, Azure, Bedrock, Ollama). It simplifies API management by allowing users to call any model using the same format, providing features like spend tracking, model fallbacks, key management, and load balancing.
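As an illustration of that unified interface, here is a minimal sketch; the model string and the `AZURE_AI_*` variable names are assumptions chosen to match the setup used later in this article:

```python
# Sketch of LiteLLM's OpenAI-compatible call format (illustrative).

# LiteLLM routes on the provider prefix: "azure_ai/<deployment-name>".
MODEL = "azure_ai/phi-4-mini"

# The same OpenAI-style message list works for any supported provider.
messages = [{"role": "user", "content": "what model are you"}]

# The live call needs `pip install litellm` plus credentials, so it is
# shown commented out here:
#
#   import os, litellm
#   response = litellm.completion(
#       model=MODEL,
#       messages=messages,
#       api_key=os.environ["AZURE_AI_API_KEY"],
#       api_base=os.environ["AZURE_AI_API_BASE"],
#   )
#   print(response.choices[0].message.content)
```

Swapping providers is then a one-line change to the model string, with the message format left untouched.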

More info is here:

GitHub - BerriAI/litellm: Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, loadbalancing and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

Azure Foundry

Azure AI Foundry (formerly Azure AI Studio) is a unified platform-as-a-service for developing, customizing, and managing AI agents and applications. It serves as an AI “factory” combining model catalogs (including OpenAI and Hugging Face), low-code tools, and enterprise-grade security for building generative AI apps.

More details are available here:

https://azure.microsoft.com/en-us/products/ai-foundry


Azure CLI

The Azure Command-Line Interface (CLI) is a cross-platform tool used to connect to Azure and execute administrative commands on Azure resources. It allows you to manage services like virtual machines, databases, and networking through a terminal using interactive prompts or scripts.

More details are available here:

What is the Azure Developer CLI?

Gemini CLI

If not pre-installed you can download the Gemini CLI to interact with the source files and provide real-time assistance:

npm install -g @google/gemini-cli

Testing the Gemini CLI Environment

Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with an API key or your Google account:

xbill@penguin:~/aisprintapr2026/tpu-vllm-devops-agent$ gemini

▝▜▄ Gemini CLI v0.39.1
   ▝▜▄
  ▗▟▀ Signed in with Google /auth
 ▝▀ Plan: Gemini Code Assist Standard /upgrade

Python MCP Documentation

The official GitHub Repo provides samples and documentation for getting started:

GitHub - modelcontextprotocol/python-sdk: The official Python SDK for Model Context Protocol servers and clients

Agent Development Kit

The Google Agent Development Kit (ADK) is an open-source, Python-based framework designed to streamline the creation, deployment, and orchestration of sophisticated, multi-agent AI systems. It treats agent development like software engineering, offering modularity, state management, and built-in tools (like Google Search) to build autonomous agents.

The ADK can be installed from here:

Agent Development Kit (ADK)
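As a rough sketch (assuming `google-adk` and `litellm` are installed; the file layout, names, and defaults here are inferred from this article, not copied from the repository), an ADK agent backed by an Azure-hosted model could look like this:

```python
# Hypothetical sketch of agents/azure/agent.py.
import os

def azure_model_name() -> str:
    # LiteLLM model string, overridable via the AZURE_MODEL variable.
    return os.environ.get("AZURE_MODEL", "azure_ai/phi-4-mini")

def make_agent():
    # Imports live inside the function so the sketch can be read (and
    # this file imported) without the ADK installed.
    from google.adk.agents import Agent
    from google.adk.models.lite_llm import LiteLlm

    return Agent(
        name="azure_agent",
        model=LiteLlm(model=azure_model_name()),
        instruction="You are a helpful assistant.",
    )

# ADK discovery expects a module-level `root_agent`; real code would
# construct it at import time: root_agent = make_agent()
```

Running `adk run agents/azure` then loads `root_agent` and routes prompts through LiteLLM to the Azure endpoint.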

Where do I start?

The strategy for starting multi-agent development is an incremental, step-by-step approach.

First, the basic development environment is setup with the required system variables, and a working Gemini CLI configuration.

Then, an ADK agent is built, debugged, and tested locally.

Setup the Basic Environment

At this point you should have a working Python environment and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:

cd ~
git clone https://github.com/xbill9/gemini-cli-azure

Then run init.sh from the cloned directory.

The script will attempt to determine your shell environment and set the correct variables:

cd adk-litellm
source init.sh

If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:

source set_env.sh

Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env.sh script can be used to reset the environment after a time-out.
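The repository's set_env.sh is project-specific, but the shape of what it restores can be sketched like this; every name and value below is a placeholder, not the script's actual contents:

```shell
# Placeholder values only -- substitute your own project and endpoint.
export PROJECT_ID="my-project"
export AZURE_AI_API_KEY="replace-with-your-key"
export AZURE_AI_API_BASE="https://example.invalid"

echo "PROJECT_ID=${PROJECT_ID}"
```

Sourcing the script (`source set_env.sh`) rather than executing it is what makes the exports persist in the current shell.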

Azure Setup

The Azure CLI provides the basic tools for working with cloud-based services:

Check the installation:

william [~/gemini-cli-azure/adk-litellm-azure]$ az --version
azure-cli 2.85.0 *

core 2.85.0 *
telemetry 1.1.0

Extensions:
ml 2.42.0
ssh 2.0.6

Dependencies:
msal 1.35.1
azure-mgmt-resource 24.0.0

Python location '/usr/bin/python3.12'
Config directory '/home/william/.azure'
Extensions directory '/home/william/.azure/cliextensions'
Extensions system directory '/usr/lib/python3.12/site-packages/azure-cli-extensions'

Python (Linux) 3.12.9 (main, Mar 26 2026, 23:21:55) [GCC 13.2.0]

Legal docs and information: aka.ms/AzureCliLegal

You have 2 update(s) available. They will be updated with the next build of Cloud Shell.
william [~/gemini-cli-azure/adk-litellm-azure]$ 

Then run the Azure setup scripts:

./setup-azure-phi.sh

./get-azure-phi-creds.sh

The Azure console should look something like this:

With the resource group looking like this:

Running the Python Code

First, switch to the directory with the Python version of the sample code and install it on the local system:

cd adk-litellm
make install
Processing ./.

Verify the ADK Installation

First, verify the ADK is installed:

william [~/gemini-cli-azure/adk-litellm-azure]$ adk --version
adk, version 1.32.0
william [~/gemini-cli-azure/adk-litellm-azure]$ 

To check the ADK setup, run the ADK CLI locally with the hello agent, a basic Hello World style agent. No external tools or MCP calls are used in the agent code, which allows the ADK installation itself to be verified:

william [~/gemini-cli-azure/adk-litellm-azure]$ adk run agents/azure
Log setup complete: /tmp/agents_log/agent.20260505_192533.log
To access latest log: tail -F /tmp/agents_log/agent.latest.log
/home/william/.local/lib/python3.12/site-packages/google/adk/cli/cli.py:204: UserWarning: [EXPERIMENTAL] InMemoryCredentialService: This feature is experimental and may change or be removed in future versions without notice. It may introduce breaking changes at any time.
  credential_service = InMemoryCredentialService()
/home/william/.local/lib/python3.12/site-packages/google/adk/auth/credential_service/in_memory_credential_service.py:33: UserWarning: [EXPERIMENTAL] BaseCredentialService: This feature is experimental and may change or be removed in future versions without notice. It may introduce breaking changes at any time.
  super().__init__()
Running agent azure_agent, type exit to exit.
[user]: what model are you
19:25:39 - LiteLLM:INFO: utils.py:4011 - 
LiteLLM completion() model= phi-4-mini; provider = azure_ai
[azure_agent]: I am Phi, an AI developed by Microsoft, created to help assist you with providing information, answering questions, and helping out with related tasks. How can I assist you today?


Phi Models on Azure

Microsoft Phi models are a family of highly capable, cost-effective Small Language Models (SLMs) designed for low-latency generative AI, available on Azure AI Foundry. They support text, image, and audio inputs (Phi-4-multimodal) and are ideal for memory-constrained environments. Key models include Phi-4, Phi-3.5-MoE, and Phi-3-mini/medium, optimized for high-performance reasoning tasks.

More details are here:

Deep Dive with a Phi Model

Once the connection has been tested, you can drill into the model:

xbill@penguin:~/gemini-cli-azure/adk-litellm-azure$ adk run agents/azure
Log setup complete: /tmp/agents_log/agent.20260505_134754.log
To access latest log: tail -F /tmp/agents_log/agent.latest.log
/home/xbill/.local/lib/python3.13/site-packages/google/adk/cli/cli.py:204: UserWarning: [EXPERIMENTAL] InMemoryCredentialService: This feature is experimental and may change or be removed in future versions without notice. It may introduce breaking changes at any time.
  credential_service = InMemoryCredentialService()
/home/xbill/.local/lib/python3.13/site-packages/google/adk/auth/credential_service/in_memory_credential_service.py:33: UserWarning: [EXPERIMENTAL] BaseCredentialService: This feature is experimental and may change or be removed in future versions without notice. It may introduce breaking changes at any time.
  super().__init__()
Running agent azure_agent, type exit to exit.
[user]: what model are you
13:48:00 - LiteLLM:INFO: utils.py:4011 - 
LiteLLM completion() model= phi-4-mini; provider = azure_ai
[azure_agent]: I am Phi, an AI developed by Microsoft, designed to help you with a wide range of queries and tasks. How can I assist you today?


One Last Code Review with Gemini CLI

Gemini CLI was used to summarize the project:

> code review adk agent for phi model                                                                                                            

✦ I've reviewed the Azure agent implementation in agents/azure/agent.py. Overall, the implementation correctly utilizes the ADK (Agent Development
  Kit) and integrates with Azure AI Foundry via LiteLLM.

  🔍 Code Review Findings

  1. ADK Best Practices
   * Missing description: While the name is correctly defined, the description parameter is missing. In ADK, the description is critical for
     multi-agent systems, as it helps routers and other agents understand when to delegate tasks to this specific agent.
   * Minimal instruction: The current instruction is very brief. ADK best practices suggest using Markdown for clarity and being more specific
     about the agent's persona and constraints.
   * Short request_timeout: The timeout is set to 10 seconds. For LLM inferences, especially via serverless endpoints which may have occasional
     cold starts or latency, 30 seconds is generally more reliable.

  2. Azure AI Foundry Integration
   * Model String: The model string azure_ai/phi-4-mini is correctly formatted for LiteLLM to recognize the Azure AI Foundry provider.
   * Environment Variables: The code correctly leverages AZURE_MODEL with a fallback. It assumes AZURE_AI_API_KEY and AZURE_AI_API_BASE are set in
     the environment, which matches the setup-azure-phi.sh automation script.

  3. Code Quality
   * Imports: Clean and follows standard ADK patterns.
   * Encapsulation: Using root_agent as the export in agent.py (and re-exporting in __init__.py) is the standard pattern for ADK's discovery
     mechanism.
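The review's suggestions (add a description, expand the instruction, raise the timeout) could be applied along these lines. This is a hedged sketch: the names and text are illustrative, and forwarding `request_timeout` through `LiteLlm` to the underlying completion call is an assumption, not verified repository behavior.

```python
# Illustrative sketch applying the review findings; requires
# `pip install google-adk litellm` to actually construct the agent.
import os

# Timeout raised from 10 to 30 seconds, per the review.
REQUEST_TIMEOUT = 30

def improved_agent():
    # Imports inside the function so the sketch loads without the ADK.
    from google.adk.agents import Agent
    from google.adk.models.lite_llm import LiteLlm

    return Agent(
        name="azure_agent",
        # Added per the review: a description helps multi-agent routers
        # decide when to delegate to this agent.
        description="Answers questions using a Phi model on Azure AI Foundry.",
        # Expanded Markdown instruction with persona and constraints.
        instruction=(
            "## Role\n"
            "You are a concise assistant backed by an Azure-hosted Phi model.\n"
            "## Constraints\n"
            "Answer briefly, and say so when you are unsure."
        ),
        model=LiteLlm(
            model=os.environ.get("AZURE_MODEL", "azure_ai/phi-4-mini"),
            # Assumption: LiteLlm forwards extra keyword arguments to
            # the underlying litellm completion call.
            request_timeout=REQUEST_TIMEOUT,
        ),
    )
```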

Summary

The strategy for using the ADK with Azure LLMs was validated with an incremental, step-by-step approach.

First, the basic environment tools were configured. Then the required Python libraries were installed. LiteLLM was used to proxy the Google ADK to a model hosted on Azure. Finally, some tests were run from the CLI and the web interface. Gemini CLI was used to validate the approach.

