TL;DR — Spin up a Snowflake Cortex Agent that can answer admin questions like "Which warehouses burned the most credits last month?" or "Who are my top 5 spenders?" — all from a natural-language prompt inside GitHub Copilot Chat.
The Problem
Snowflake ACCOUNT_USAGE contains everything you need to understand spend, performance, and security. The catch? You have to know the right table, the right join, and write it correctly every time.
What if you could just ask?
What You'll Build
GitHub Copilot Chat
│ (natural language)
▼
MCP Server (Python, local)
│ ask_admin tool
▼
Snowflake Cortex Agent
│ routes to the right semantic view
▼
SNOWFLAKE.ACCOUNT_USAGE (+ optional ORGANIZATION_USAGE)
By the end you'll be able to type questions like these into Copilot Chat and get back answers with real numbers:
- What are total compute credits by warehouse in the last 30 days?
- Which users consumed the most credits this month?
- How many failed logins occurred in the last 7 days?
- What are serverless credits by service type?
Prerequisites
| Requirement | Notes |
|---|---|
| Snowflake account (any paid edition) | ACCOUNTADMIN or a role with ACCOUNT_USAGE access |
| Python 3.9+ | For the MCP bridge |
| VS Code + GitHub Copilot Chat | Agent mode enabled |
| Git | To clone the template |
Step 1 — Clone the Template
git clone https://github.com/LALITHASWAROOPK/agent_snowflake_admin_assistant.git
cd agent_snowflake_admin_assistant/agent_snowflake_admin
Open the folder in VS Code:
code .
Step 2 — Deploy the SQL Objects (5 scripts, ~5 minutes)
All SQL is in the sql/ folder. Replace the five placeholders before running:
| Placeholder | What to set |
|---|---|
| `<APP_DB>` | A database to hold your views/agent |
| `<APP_SCHEMA>` | Schema inside that database |
| `<ADMIN_ROLE>` | Role that owns the objects |
| `<DEVELOPER_ROLE>` | Role that will query the agent |
| `<EXEC_WAREHOUSE>` | Warehouse the agent will use at runtime |
Run them in order in a Snowflake worksheet or SnowSQL:
@sql/01_create_views.sql -- Base views over ACCOUNT_USAGE
@sql/02_create_semantic_views.sql -- Semantic layer (natural-language metadata)
@sql/03_create_agent.sql -- Cortex Agent wired to those views
@sql/04_create_budget.sql -- Optional: spending guardrails
@sql/05_grants.sql -- RBAC grants
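If you'd rather script the deployment than paste each file into a worksheet, here's a minimal sketch using `snowflake-connector-python` (the connector's `execute_stream` runs every statement in a file; account/user values are placeholders, and SSO auth is assumed):

```python
from pathlib import Path

def setup_scripts(sql_dir: str = "sql") -> list[str]:
    """Return the numbered setup scripts (01_... through 05_...) in run order."""
    return sorted(str(p) for p in Path(sql_dir).glob("0[1-5]_*.sql"))

if __name__ == "__main__":
    # Requires `pip install snowflake-connector-python` and valid credentials.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<your_account>",
        user="<your_user>",
        authenticator="externalbrowser",  # SSO; swap for password/OAuth as needed
    )
    for script in setup_scripts():
        with open(script) as f:
            # execute_stream yields one cursor per statement in the file
            for cur in conn.execute_stream(f):
                print(script, "->", cur.sfqid)
```

The glob keeps the scripts in their numbered order, which matters: the semantic views depend on the base views, and the grants come last.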
What the Views Cover
01_create_views.sql builds seven base views, all sourced exclusively from SNOWFLAKE.ACCOUNT_USAGE — no custom tables, no proprietary schemas:
| View | Source |
|---|---|
| `V_WAREHOUSE_COST_BY_TAG` | `WAREHOUSE_METERING_HISTORY` + `TAG_REFERENCES` |
| `V_QUERY_PERFORMANCE` | `QUERY_HISTORY` |
| `V_USER_SPEND` | `QUERY_ATTRIBUTION_HISTORY` |
| `V_SERVERLESS_COSTS` | Tasks, pipes, clustering, search, MV refresh |
| `V_STORAGE_USAGE` | `STORAGE_USAGE` + `DATABASE_STORAGE_USAGE_HISTORY` |
| `V_LOGIN_HISTORY` | `LOGIN_HISTORY` |
| `V_DATA_TRANSFER` | `DATA_TRANSFER_HISTORY` + `REPLICATION_USAGE_HISTORY` |
02_create_semantic_views.sql wraps each base view with a COMMENT block that tells the Cortex Agent what the view is for — this is how the agent knows which view to query for any given question.
Why Semantic Views?
Snowflake Cortex Agents use semantic views as tool definitions. A semantic view is just a standard view with a COMMENT that describes its purpose in natural language. The agent reads those comments at inference time to decide which tool (view) answers the user's question.
No prompt engineering required on your end.
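For a feel of what that looks like, here's a hypothetical semantic view — the names are illustrative, and the real definitions live in 02_create_semantic_views.sql:

```sql
-- Hypothetical sketch: the COMMENT is what the Cortex Agent reads
-- when deciding whether this view can answer a question.
CREATE OR REPLACE VIEW <APP_DB>.<APP_SCHEMA>.SV_WAREHOUSE_COST
  COMMENT = 'Warehouse compute credit consumption by warehouse and day. Use for questions about warehouse spend, credit usage, or cost by tag.'
AS
SELECT * FROM <APP_DB>.<APP_SCHEMA>.V_WAREHOUSE_COST_BY_TAG;
```

The more concrete the comment (metrics, grain, typical questions), the more reliably the agent routes to the right view.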
Step 3 — Start the MCP Bridge
The MCP (Model Context Protocol) server is a lightweight Python HTTP server that exposes one tool — ask_admin — which Copilot Chat can call.
Install dependencies
pip install -r mcp/requirements.txt
Configure your environment
Create a .env file in the repo root (never commit this):
SNOWFLAKE_ACCOUNT=<your_account>
SNOWFLAKE_USER=<your_user>
SNOWFLAKE_AUTHENTICATOR=externalbrowser
SNOWFLAKE_ROLE=<DEVELOPER_ROLE>
SNOWFLAKE_WAREHOUSE=<EXEC_WAREHOUSE>
SNOWFLAKE_AGENT_FQN=<APP_DB>.<APP_SCHEMA>.<AGENT_NAME>
MCP_PORT=3000
Supports `externalbrowser` (SSO), `snowflake` (password), and OAuth — set `SNOWFLAKE_AUTHENTICATOR` and the matching credential variable.
Launch the server
./scripts/start-mcp.ps1
Or directly:
python mcp/server.py
Verify it's up:
Invoke-RestMethod -Uri http://127.0.0.1:3000/health -Method Get
# → {"status":"ok"}
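You can also exercise the `ask_admin` tool directly from Python. The endpoint path comes from the bridge (`/mcp/tools/call`), but the request body shape here is an assumption — confirm the exact schema in mcp/server.py:

```python
import json
import urllib.request

MCP_URL = "http://127.0.0.1:3000"

def ask_admin_payload(question: str) -> dict:
    # Assumed request shape: tool name plus an arguments object.
    return {"tool": "ask_admin", "arguments": {"question": question}}

def call_tool(question: str) -> dict:
    req = urllib.request.Request(
        f"{MCP_URL}/mcp/tools/call",
        data=json.dumps(ask_admin_payload(question)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires the bridge from this step to be running locally.
    print(call_tool("Top 5 warehouses by credits in the last 30 days?"))
```

Handy for smoke-testing the bridge before involving Copilot Chat at all.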
Step 4 — Connect Copilot Chat
The repo ships a pre-configured .vscode/mcp.json that wires the local MCP server into VS Code:
{
"mcpServers": {
"snowflake-admin": {
"url": "http://127.0.0.1:3000"
}
}
}
VS Code picks this up automatically when you open the workspace. You'll see the ask_admin tool listed inside Copilot Chat's tool picker.
There's also a GitHub Copilot instruction file at .github/instructions/admin.instructions.md which gives Copilot context about the agent's scope so it routes questions there automatically.
Step 5 — Ask It Questions
Switch Copilot Chat to Agent mode and try:
What are my top 5 warehouses by compute credits in the last 30 days?
Which users have spent the most credits this month?
Show me failed logins by user in the last 7 days.
What serverless services are costing me the most right now?
What's my current storage usage broken down by database?
The agent selects the right semantic view, runs the query, and returns a plain-English answer with the numbers inline — no SQL, no context switching.
Optional: Budget Guardrails
04_create_budget.sql sets up a Snowflake Budget object so you can enforce credit limits at the agent level. Handy for keeping a shared admin agent from running runaway queries:
CALL <APP_DB>.<APP_SCHEMA>.<BUDGET_NAME>!SET_SPENDING_LIMIT(100);
CALL <APP_DB>.<APP_SCHEMA>.<BUDGET_NAME>!GET_SPENDING_HISTORY();
Architecture Recap
┌─────────────────────────────────────┐
│ GitHub Copilot Chat │
│ (Agent mode + ask_admin tool) │
└────────────────┬────────────────────┘
│ HTTP / MCP
┌────────────────▼────────────────────┐
│ MCP Bridge (mcp/server.py) │
│ Exposes: /mcp/tools/list │
│ /mcp/tools/call │
│ /health │
└────────────────┬────────────────────┘
│ snowflake-connector-python
┌────────────────▼────────────────────┐
│ SNOWFLAKE.CORTEX.DATA_AGENT_RUN │
│ ← Cortex Agent (semantic router) │
└────────────────┬────────────────────┘
│ picks the right tool
┌────────────────▼────────────────────┐
│ Semantic Views (SV_*) │
│ Base Views (V_*) │
│ ↑ SNOWFLAKE.ACCOUNT_USAGE │
└─────────────────────────────────────┘
Key Things to Know
- Account Usage latency: `SNOWFLAKE.ACCOUNT_USAGE` is typically delayed by up to a few hours — not real-time.
- Rate sheet: `ORGANIZATION_USAGE.RATE_SHEET_DAILY` is optional and depends on your edition/region.
- Auth flexibility: The MCP server supports SSO (`externalbrowser`), OAuth, and password auth — configure once in `.env`.
Get the Code
The full template lives at https://github.com/LALITHASWAROOPK/agent_snowflake_admin_assistant. PRs and issues welcome. If you build on top of this, tag me — I'd love to see what you add.