If you're building AI applications with Azure OpenAI, you know the drill: costs can spiral fast. One experimental feature using o1-preview, a few hundred test runs, and suddenly your bill looks very different from last month.
The Azure portal shows you the numbers eventually, but when you're iterating quickly on AI features, you need real-time visibility. That's exactly what azurecost delivers - instant Azure OpenAI cost tracking from your terminal.
The Azure OpenAI Cost Challenge
Unlike traditional cloud services with predictable pricing, Azure OpenAI costs vary wildly based on:
- Model choice (GPT-4o-mini vs o1-preview is a 5-15x difference)
- Token usage (both prompt and completion tokens)
- Deployment scaling and throughput
- Testing and development cycles
The questions you need answered daily:
- "How much did my o1-preview deployment cost yesterday?"
- "Which resource group is burning through credits?"
- "Did that new feature spike my OpenAI spend?"
- "How does dev environment cost compare to production?"
Checking this through the Azure portal means multiple clicks, page loads, and waiting. When you're checking costs multiple times a day during active development, this friction adds up.
The Solution
azurecost is a Python CLI tool built specifically for developers who need fast answers about their Azure spending. For Azure OpenAI users, it's the fastest way to track Cognitive Services costs without touching the Azure portal.
Why It Works for Azure OpenAI
- Instant visibility: See your OpenAI costs in 2 seconds, not 2 minutes
- Daily granularity: Catch cost spikes the day they happen, not at month-end
- Resource-level tracking: Monitor individual Azure OpenAI accounts separately
- Resource group isolation: Separate dev, staging, and production costs effortlessly
- Multi-dimensional views: Break down by service, location, resource group, or resource ID
- Automation-ready: Python API for integrating into your CI/CD or daily reports
- Auto-currency: Works with any billing currency
Getting Started
Installation takes one line:
pip install azurecost
Log in with Azure CLI (if you haven't already):
az login
Check your Azure OpenAI costs:
azurecost -s your-subscription-name
Output looks like this:
(USD)                 2025-11    2025-12
------------------  ---------  ---------
total                 1247.83    2891.45
Cognitive Services    1247.83    2891.45
That's your Azure OpenAI spend right there - Cognitive Services is the billing category for Azure OpenAI. Notice the spike in December? Now you can investigate what changed.
Real-World Azure OpenAI Use Cases
Scenario 1: Daily Cost Monitoring During Development
You're building a new reasoning agent with o1-preview. Check costs every morning:
azurecost -s prod-subscription -g DAILY -a 7
Output:
(USD)                 2025-12-15    2025-12-16    2025-12-17    2025-12-18
------------------  ------------  ------------  ------------  ------------
total                      45.23         52.18        178.45         51.20
Cognitive Services         45.23         52.18        178.45         51.20
Whoa, December 17th spiked to $178. That's the day you started load testing with o1-preview. Now you know exactly when it happened and what it cost.
Scenario 2: Environment-Based Cost Breakdown
You have separate resource groups for dev, staging, and production. See costs side by side:
azurecost -s ai-subscription -d ResourceGroup -d ServiceName
Output:
(USD)                               2025-11    2025-12
--------------------------------  ---------  ---------
total                               1247.83    2891.45
ai-dev-rg/Cognitive Services         342.15     456.32
ai-staging-rg/Cognitive Services     198.42     287.89
ai-prod-rg/Cognitive Services        707.26    2147.24
Production jumped from $707 to $2147. Time to optimize those prompts or consider GPT-4o-mini for some use cases.
Scenario 3: Focused Investigation on Production
Something's wrong with production costs. Drill down to just that resource group:
azurecost -s ai-subscription -r ai-prod-rg -g DAILY -a 14
See two weeks of daily costs for production only. Spot the pattern, correlate with deployments or feature releases.
Scenario 4: Multi-Region Cost Analysis
Running Azure OpenAI deployments in multiple regions? Group by location:
azurecost -s global-ai-sub -d Location -d ServiceName
Output:
(USD)                             2025-11    2025-12
------------------------------  ---------  ---------
total                             1247.83    2891.45
East US/Cognitive Services         823.14    1923.87
West Europe/Cognitive Services     424.69     967.58
East US is handling most of the load. Maybe redistribute traffic or consider regional pricing differences.
Scenario 5: Resource-Level Cost Analysis
Running multiple Azure OpenAI accounts for different teams or use cases? Track costs at the individual resource level:
azurecost -s ai-subscription -d ResourceId
Output:
(USD)                                                                                  2025-12     2026-01
-----------------------------------------------------------------------------------  ---------  ----------
total                                                                                   5741.44    16571.60
/resourcegroups/ai/providers/microsoft.cognitiveservices/accounts/chatbot-prod          3401.80    16390.44
/resourcegroups/ai/providers/microsoft.cognitiveservices/accounts/analytics-engine      2194.17      131.82
/resourcegroups/ai/providers/microsoft.cognitiveservices/accounts/internal-tools         145.47       49.34
This shows exactly which Azure OpenAI account is consuming credits. Perfect for:
- Cost attribution: Charge back costs to specific teams or projects
- Identifying cost anomalies: Spot which deployment suddenly increased spend
- Multi-tenant environments: Track costs per customer or tenant
- Budget allocation: Distribute budget based on actual usage patterns
Combine with daily granularity to investigate when a specific resource started costing more:
azurecost -s ai-subscription -d ResourceId -g DAILY -a 7
Why This Matters for Azure OpenAI Users
Building with Azure OpenAI is different from traditional cloud infrastructure. With VMs or databases, costs are fairly predictable. But with modern language models like GPT-4o and o1-preview, costs depend on how users interact with your application (see the quick token-math sketch after this list):
- Long conversations = more tokens = higher costs
- Complex reasoning tasks = more input tokens
- Detailed responses = more completion tokens
- Testing and iteration = multiplied costs
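To put rough numbers on those bullets, here's a minimal back-of-the-envelope sketch. The per-1K-token rates in it are placeholders I picked purely for illustration, not actual Azure OpenAI pricing; plug in the current rates for your model and region.

# Rough cost model: tokens in each direction times the per-token rate.
# The rates used below are ILLUSTRATIVE placeholders, not real Azure OpenAI pricing.
def estimate_request_cost_usd(prompt_tokens: int, completion_tokens: int,
                              usd_per_1k_prompt: float, usd_per_1k_completion: float) -> float:
    return (prompt_tokens / 1000) * usd_per_1k_prompt + \
           (completion_tokens / 1000) * usd_per_1k_completion

# A long multi-turn conversation vs. a short single-turn request, same made-up rates
long_turn = estimate_request_cost_usd(6000, 1500, usd_per_1k_prompt=0.01, usd_per_1k_completion=0.03)
short_turn = estimate_request_cost_usd(400, 150, usd_per_1k_prompt=0.01, usd_per_1k_completion=0.03)
print(f"long turn ≈ ${long_turn:.4f}, short turn ≈ ${short_turn:.4f}")

Multiply that per-request gap by a few thousand requests during a load test and the "testing and iteration" multiplier above becomes obvious.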
During active development, you need to check costs frequently. Not once a month when the bill arrives, but daily or even multiple times a day.
The Azure portal workflow kills this feedback loop:
- Open browser
- Navigate to Azure portal (wait for load)
- Find the right subscription
- Click through to Cost Management
- Configure time range and filters
- Wait for data to render
By the time you see the numbers, you've burned 2-3 minutes. When you're doing this multiple times daily, the friction discourages you from checking at all.
Then you're surprised at month-end when the bill is 3x what you expected.
azurecost fixes this. Checking costs becomes as fast as checking git status:
azurecost -s ai-subscription -g DAILY -a 7
Two seconds. Real-time feedback. No context switching.
This speed changes behavior. When checking costs is instant, you actually do it. You catch issues early. You experiment with confidence because you're monitoring the impact.
The tool emerged from my own need while building AI features. I was spending too much time in the portal doing the same query repeatedly. I wanted something terminal-based that integrated into my development workflow.
What started as a personal script evolved into a proper tool that my team adopted, then others in the community found useful. The philosophy is simple: do one thing well - show Azure costs fast and clearly.
Automate Azure OpenAI Cost Monitoring
The Python API lets you integrate cost tracking into your workflows. Send daily Azure OpenAI cost reports to Slack:
from azurecost import Azurecost
import requests
import os

# Query the last 7 days of costs, broken down by resource group and service
core = Azurecost(
    debug=False,
    granularity="DAILY",
    dimensions=["ResourceGroup", "ServiceName"],
    subscription_name="ai-production"
)
total_results, results = core.get_usage(ago=7)
text = core.convert_tabulate(total_results, results)

# Post the tabulated report to Slack via an incoming webhook
webhook_url = os.getenv("SLACK_WEBHOOK_URL")
message = f"🤖 Azure OpenAI Cost Report (Last 7 Days)\n```\n{text}\n```"
requests.post(webhook_url, json={"text": message})
Run this as a daily cron job or GitHub Action. Your team gets automatic cost visibility without anyone checking the portal.
Other use cases:
- Budget alerts: Trigger warnings when daily costs exceed thresholds (see the sketch after this list)
- Cost attribution: Track which team or feature is using OpenAI credits
- Pre-deployment checks: Validate costs before promoting to production
- Finance automation: Generate reports for accounting without manual exports
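For the budget-alert case, here's a minimal sketch that reuses the same Azurecost calls as the Slack report above. The DAILY_LIMIT_USD threshold is arbitrary, and the parsing of the "total" row assumes the tabulated layout shown earlier in this post (newest date in the right-most column); adjust both to what you actually see in your environment.

from azurecost import Azurecost
import os
import requests

# Arbitrary daily budget, purely for illustration
DAILY_LIMIT_USD = 100.0

# Same constructor and calls as the Slack example above
core = Azurecost(
    debug=False,
    granularity="DAILY",
    dimensions=["ServiceName"],
    subscription_name="ai-production"
)
total_results, results = core.get_usage(ago=7)
text = core.convert_tabulate(total_results, results)

# Pull the most recent day's total out of the tabulated text.
# Assumes the layout shown earlier: a "total" row, newest date in the last column.
latest_total = None
for line in text.splitlines():
    if line.strip().lower().startswith("total"):
        latest_total = float(line.split()[-1])
        break

if latest_total is not None and latest_total > DAILY_LIMIT_USD:
    webhook_url = os.getenv("SLACK_WEBHOOK_URL")
    if webhook_url:
        requests.post(webhook_url, json={
            "text": f"⚠️ Azure OpenAI spend hit ${latest_total:.2f} yesterday "
                    f"(limit ${DAILY_LIMIT_USD:.2f})"
        })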
Configuration Tips for AI Workloads
Set environment variables to streamline your daily checks:
# Add to your ~/.zshrc or ~/.bashrc
export AZURE_SUBSCRIPTION_ID=your-ai-subscription-id
export AZURE_RESOURCE_GROUP=ai-prod-rg
Now you can run azurecost without the subscription and resource group flags:
azurecost -g DAILY -a 7
Create shell aliases for common queries:
# Daily OpenAI costs
alias aicosts='azurecost -s ai-subscription -g DAILY -a 7'
# Production environment only
alias prodcosts='azurecost -s ai-subscription -r ai-prod-rg -g DAILY -a 7'
# Compare all environments
alias envcosts='azurecost -s ai-subscription -d ResourceGroup'
# Resource-level breakdown
alias resourcecosts='azurecost -s ai-subscription -d ResourceId'
Type aicosts each morning as part of your routine. Takes 2 seconds, keeps you informed. Use resourcecosts when you need to drill down to individual Azure OpenAI accounts.
Best Practices for Azure OpenAI Cost Management
Based on using azurecost with AI workloads, here are patterns that work:
1. Daily morning check
azurecost -s ai-sub -g DAILY -a 7
Catch anomalies before they compound. One day of unexpected costs is manageable. A month is not.
2. Before and after feature releases
# Before deployment
azurecost -s ai-sub -r ai-prod-rg -g DAILY -a 3
# Deploy new feature
# After 24 hours
azurecost -s ai-sub -r ai-prod-rg -g DAILY -a 3
Measure the cost impact of new features. Make data-driven decisions about o1-preview vs GPT-4o-mini.
3. Set up automated alerts
Use the Python API to send daily reports. Don't rely on remembering to check manually.
4. Isolate environments
Use separate resource groups for dev, staging, production. Makes cost attribution trivial.
5. Monitor during load testing
Run azurecost before and after load tests. Understand your cost-per-request at scale before going live.
Start Tracking Your Azure OpenAI Costs Now
Install the tool:
pip install azurecost
Check your costs:
azurecost -s your-subscription -g DAILY -a 7
That's it. You now have instant visibility into your Azure OpenAI spending.
Check the GitHub repository for full documentation, API examples, and to report issues or contribute.
Building AI features is expensive enough. Don't let invisible costs surprise you. Make cost visibility effortless.
Quick Links
- GitHub: toyama0919/azurecost
- PyPI: azurecost
- Python 3.8+ required
Building with Azure OpenAI? Drop a comment on how you're managing costs or share your use case. Always looking for feedback and ideas.