Microsoft and OpenAI Break Their Exclusive Deal: What My API Usage Logs Actually Say About Who Benefits
Back in 2009, at 18, I was studying for my CCNA at night after eight hours working at a cybercafé. Cisco had a near-monopoly on enterprise networking in Argentina. I remember thinking: "if Cisco breaks something with a partner, why should I care? I still need to know OSPF." Today, reading that Microsoft and OpenAI dissolved their exclusivity agreement — the big news that dominated Hacker News with 880 points in just a few hours — I got exactly the same feeling. Tons of noise at the top. Down in my logs, the story is more boring and more honest.
But there's one exception. And I found it on my March invoice.
Microsoft and OpenAI Exclusivity Deal: What Actually Changed and What Didn't
For anyone who already has the week's context: I'm not going to rehash the 2019 deal, the $13 billion, or the distribution rights. That's all out there. What's new is that Microsoft no longer holds exclusivity over the OpenAI API for cloud providers. Any other vendor — Google Cloud, AWS, Oracle — can now offer direct access to GPT-4o, o1, or whatever comes next, without going through Azure OpenAI Service.
My concrete thesis: this change benefits almost exclusively OpenAI, marginally benefits competing hyperscalers, and for the independent dev calling the API directly it's noise — except for one pricing variable that's actually worth understanding.
This isn't a hot take. It's what I read in my own numbers.
What My Last 90 Days of Logs Say
I run my projects on Railway. I call the OpenAI API directly — I never went through Azure OpenAI Service because the setup overhead never made sense for small projects. When I looked at my logs from the last 90 days, the pattern was clear:
#!/bin/bash
# File: analyze_api_logs.sh
# Extract calls by model and cost - last 90 days
LOGS_DIR="./logs/api"

echo "=== Call distribution by model ==="
# -h suppresses filename prefixes so jq receives clean JSON lines
grep -h '"model"' "$LOGS_DIR"/*.jsonl | \
  jq -r '.model' | \
  sort | uniq -c | sort -rn

echo ""
echo "=== Estimated cost by model (USD) ==="
# Each line of summary.csv has: timestamp, model, input_tokens, output_tokens, cost_usd
awk -F',' '
  NR > 1 {
    cost[$2] += $5
    calls[$2]++
  }
  END {
    for (m in cost)
      printf "%-25s calls: %d total: $%.4f\n", m, calls[m], cost[m]
  }
' "$LOGS_DIR/summary.csv" | sort -t'$' -k2 -rn
Actual output from my last 90 days:
=== Call distribution by model ===
4821 gpt-4o
2103 gpt-4o-mini
847 o1-mini
312 gpt-4-turbo (legacy, migrating)
=== Estimated cost by model (USD) ===
gpt-4o calls: 4821 total: $38.4200
o1-mini calls: 847 total: $14.9300
gpt-4-turbo calls: 312 total: $4.8800
gpt-4o-mini calls: 2103 total: $2.1800
Total over 90 days: ~$60.41 USD calling api.openai.com directly.
Would I have paid something different using Azure OpenAI Service? Yes. Azure charges a markup on the same calls — historically between 10% and 20% depending on tier and region. On $60 over 90 days, that's between $6 and $12 of difference. Nothing. For a company spending $60,000 in 90 days, that's between $6,000 and $12,000. That's where the real game is.
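The arithmetic above generalizes to any spend level. A minimal sketch, with the caveat that the 10–20% markup range is my historical estimate, not an official Azure figure:

```javascript
// Rough delta between direct API billing and Azure OpenAI Service billing,
// assuming a 10-20% markup range (my estimate, not an official figure).
function azureMarkupDelta(directSpendUsd, lowPct = 0.10, highPct = 0.20) {
  return {
    low: directSpendUsd * lowPct,
    high: directSpendUsd * highPct,
  };
}

console.log(azureMarkupDelta(60));     // my 90 days: { low: 6, high: 12 }
console.log(azureMarkupDelta(60000));  // a mid-size company: { low: 6000, high: 12000 }
```

The delta scales linearly with spend, which is why the same announcement is noise for me and a line item worth a negotiation for an enterprise.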
Why This Change Benefits OpenAI More Than Microsoft
The original deal was brilliant for Microsoft in 2019: OpenAI needed compute and money, Microsoft needed AI credibility. But the world changed. Today OpenAI has:
- Direct revenue from subscriptions (ChatGPT Plus, Team, Enterprise)
- Its own API with millions of devs calling it directly
- Negotiating leverage that simply didn't exist in 2019
Distribution exclusivity gave Microsoft a privileged channel to enterprise customers. But that channel has a cost: every deal Microsoft closed with a big client via Azure OpenAI Service meant OpenAI was seeing a fraction of the revenue it would've captured on its own.
Breaking exclusivity means OpenAI can now negotiate direct deals with Google Cloud, with AWS, with any enterprise consultancy that wants to offer its models. More channels, more revenue, more control.
For Microsoft, the cost is real but bounded: Azure is still the cloud provider with the deepest integration, with Copilot, with the M365 ecosystem. They don't lose everything. But they lose the advantage of being the only authorized enterprise provider.
The uncomfortable thing nobody says in the HN threads: Microsoft knew this was coming. OpenAI's current valuation makes it indefensible to charge a markup for exclusive access when the product owner can walk and sell directly. The end of the exclusive was negotiated, not ripped away.
The Invoice Detail That Actually Matters for Independent Devs
I promised there was an exception. Here it is.
When I reviewed my March invoice I found something: I had started receiving usage credits from an Azure program tied to my dev subscription. Credits that only apply if you call OpenAI via Azure OpenAI Service, not via the direct API.
// Endpoint comparison - same call, different billing
// api.openai.com              = direct billing, no Azure credits
// <resource>.openai.azure.com = Azure Dev/Startup credits apply if you have them

// Placeholders - replace with your own resource and deployment names
const AZURE_RESOURCE = 'my-resource';
const DEPLOYMENT = 'gpt-4o';

const OPENAI_DIRECT = {
  endpoint: 'https://api.openai.com/v1/chat/completions',
  // No Azure credits
  // Lower latency in some cases (no extra hop)
  // Setup: 2 minutes
};

const AZURE_OPENAI = {
  // Azure also requires an api-version query parameter on every call
  endpoint: `https://${AZURE_RESOURCE}.openai.azure.com/openai/deployments/${DEPLOYMENT}/chat/completions?api-version=2024-06-01`,
  // Azure credits apply if you have them active
  // Easier enterprise compliance (data residency, VNet, etc.)
  // Setup: 20 minutes minimum, more if you use managed identity
};
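To make the split concrete, here is a sketch of how the same chat completion would be wired against each endpoint. The resource name, deployment name, and api-version are placeholders for your own setup; the point is that only the request shape differs: direct OpenAI takes the model in the body with a Bearer token, while Azure encodes the model in the deployment path and uses an api-key header.

```javascript
// Builds (but does not send) the same chat completion request for each provider.
// Resource, deployment, and api-version are placeholders - adjust to your setup.
function buildChatRequest(provider, messages, apiKey) {
  if (provider === 'openai') {
    return {
      url: 'https://api.openai.com/v1/chat/completions',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${apiKey}`, // direct API uses a Bearer token
      },
      body: JSON.stringify({ model: 'gpt-4o', messages }),
    };
  }
  // Azure: the model is chosen by the deployment in the path, not in the body
  const resource = 'my-resource';
  const deployment = 'gpt-4o';
  return {
    url: `https://${resource}.openai.azure.com/openai/deployments/${deployment}/chat/completions?api-version=2024-06-01`,
    headers: {
      'Content-Type': 'application/json',
      'api-key': apiKey, // Azure uses an api-key header instead of Bearer
    },
    body: JSON.stringify({ messages }),
  };
}

const messages = [{ role: 'user', content: 'ping' }];
console.log(buildChatRequest('openai', messages, 'sk-...').url);
console.log(buildChatRequest('azure', messages, 'azure-key').url);
```

Either object can be passed straight to fetch(url, { method: 'POST', headers, body }), and the response shape is the same. That is what makes switching endpoints a billing decision rather than a code rewrite.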
If you have active Azure credits — startup programs, Visual Studio subscriptions, Microsoft for Startups — and you're calling OpenAI directly, you're leaving money on the table. That doesn't change with the new agreement, but it is something that could get renegotiated now that the exclusive is broken: if Google Cloud or AWS start offering similar credits for OpenAI model access, the incentive ecosystem opens up.
For now, in my specific case: I evaluated moving $30–$40 per month to Azure for the credits. The setup overhead stopped me. I'm staying on direct.
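For the curious, a sketch of the math I ran. The setup-hours figure and hourly rate are my own assumptions, not anything from Azure's docs; if credits cover your full API spend, the question reduces to whether a one-time setup cost is worth the monthly savings.

```javascript
// Months until the Azure setup time pays for itself in credit-covered spend.
// hoursOfSetup and hourlyRateUsd are assumptions - plug in your own numbers.
function monthsToBreakEven(monthlySpendUsd, hoursOfSetup, hourlyRateUsd) {
  const setupCost = hoursOfSetup * hourlyRateUsd;
  return setupCost / monthlySpendUsd; // assumes credits cover 100% of spend
}

// My case: ~$35/month, say 3 hours of setup valued at $50/hour
console.log(monthsToBreakEven(35, 3, 50).toFixed(1)); // "4.3" months - why I stayed on direct
```

At $100+/month the same assumptions break even in about six weeks, which is where the calculation flips.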
The Most Common Misreads I Saw in the HN Threads
The 880-point thread has some error patterns worth dismantling:
"Now OpenAI can go with Google Cloud and everyone migrates" — Not so fast. Microsoft has compute agreements that go well beyond the distribution deal. OpenAI runs on Azure infrastructure. That doesn't change overnight.
"Microsoft lost its $13B bet" — The $13B wasn't a payment for exclusivity; it was structured investment in a company now worth exponentially more. The investment is still an investment.
"Devs will get cheaper prices now" — Why? OpenAI's API has its own pricing. Azure was charging a markup on top of that. Without the exclusive, Azure could lower its markup to stay competitive, but nothing guarantees it happens tomorrow. Competition takes time.
"This is the beginning of the end for Azure" — Azure has 200+ services. OpenAI is one. Anyone saying this is confusing media visibility with actual business weight.
On the architecture topics I've been working through this week — the agent that deleted my production database, the Notion to Markdown migration, the supply chain issues I dug into with Bitwarden — there's a pattern that keeps repeating: the changes that hit hardest aren't the ones with 880 upvotes on HN. They're the quiet ones. The breaking of the exclusivity deal is loud. What actually changes my real workflow is somewhere else.
FAQ: Microsoft and OpenAI Exclusivity Deal
What exactly was the exclusivity agreement between Microsoft and OpenAI?
Microsoft had exclusive rights to distribute and commercialize OpenAI's models through its Azure cloud platform. Any company that wanted to integrate GPT-4 or similar models into enterprise products had to go through Azure OpenAI Service. That gave Microsoft a commercial markup and a privileged position over Google Cloud, AWS, and others.
What changes for an independent developer already using the OpenAI API directly?
Almost nothing in the short term. The prices at api.openai.com don't change because of this announcement. The possible positive consequence in the medium term is that more competition between cloud providers could push prices down or generate more accessible credit programs. For now, if you're not using Azure, the only change is strategic context.
Should I migrate to Azure OpenAI Service after this change?
Depends on whether you have active Azure credits. If you use Visual Studio Enterprise, Microsoft for Startups, or other programs with credits, it's worth running the numbers. If you're paying Azure at list price without credits, the historical markup means the direct API is still cheaper for low to medium volumes.
Can OpenAI now run its models on Google Cloud or AWS?
Technically yes, the distribution agreement no longer blocks it. But OpenAI has all its training and inference infrastructure on Azure. Moving that is a years-long decision, not months. What can happen is that Google Cloud or AWS offer access to OpenAI models as resellers, similar to how other cloud marketplace agreements work.
Does this affect the pricing of Copilot or Microsoft's AI-integrated products?
Not directly. Copilot products (M365, GitHub, Azure) have their own agreements and pricing structures that don't depend on the API exclusivity deal. Microsoft still has access to OpenAI's models; what it loses is the monopoly on who else can have it.
How do I know if I should switch endpoints in my current projects?
Open your logs from the last 30–60 days, calculate what you're spending on api.openai.com, and check if you have available Azure credits. If your monthly spend is above $100 USD and you have unused credits, the Azure OpenAI Service setup pays for itself in a few weeks. Below that, the configuration overhead doesn't pencil out.
My Final Take, Unfiltered
The end of the exclusivity agreement is an important corporate news story. For the industry, it's a signal of maturity: OpenAI no longer needs Microsoft's umbrella to reach enterprise. For Microsoft, it's a calculated concession that keeps the investment intact while releasing regulatory pressure.
For me, looking at my $60 in 90 days of logs: irrelevant — unless someone activates a credits program that justifies switching endpoints.
What I do find relevant — and this is something I didn't read in any of the 400+ comments in that thread — is that this move opens the door for OpenAI to build direct deals with companies that until now had to negotiate through Microsoft. That concentrates more power in OpenAI, not less. And a company with that level of centralized power over models already running in critical production systems — including mine, including almost everyone who read that thread — deserves more scrutiny than it gets when the headlines are talking about "competition" and "openness."
The openness that matters isn't between cloud providers. It's between models, between vendors, between architectures. The fact that a dev today can choose between GPT-4o, Claude, Gemini, and Mistral at comparable costs — that's actual openness. The rest is just reorganizing who collects the markup.
If you're interested in multi-provider architecture decisions, last week I wrote about TypeScript 7.0 Beta on a real codebase — there's an abstraction pattern for clients that applies directly to this. And if you want to see how I think about owning my infra before trusting third-party services with critical decisions, start with the Asahi Linux on Apple Silicon post: the philosophy is the same.
This article was originally published on juanchi.dev