The AI vendor lock-in problem just got real
Yesterday, Amazon announced OpenAI models are coming to Amazon Bedrock. That means GPT-4o, o1, and friends will be available through AWS's managed AI service — alongside Claude, Llama, Mistral, and a dozen others.
On the surface this sounds like more choice. More competition. Good for developers, right?
Let me explain why it's actually the opposite.
What Bedrock is, and what it costs you
Amazon Bedrock is AWS's "model as a managed service" layer. You pay AWS's markup on top of the model's base API cost, you get AWS's billing, and your usage gets bundled into your AWS bill.
For enterprise procurement teams, this is appealing — one vendor, one invoice, existing AWS credits.
For individual developers and small teams in emerging markets, it's a disaster:
- AWS pricing is in USD — no local currency, no local payment methods
- Minimum viable setup requires AWS account + IAM + Bedrock access — 3 friction layers before your first API call
- Bedrock adds a per-token markup on top of Anthropic/OpenAI base pricing
- No flat monthly rate — you're paying per-token, which means unpredictable bills
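The unpredictability of per-token billing is easy to see with some back-of-the-envelope arithmetic. The sketch below uses a placeholder rate of $5 per million tokens, not any actual AWS or OpenAI list price:

```python
# Hypothetical per-token cost estimator. The $5/1M-token rate below is a
# placeholder for illustration, not a real AWS or OpenAI list price.
def monthly_cost(requests_per_day: int,
                 tokens_per_request: int,
                 usd_per_million_tokens: float,
                 days: int = 30) -> float:
    """Estimate a month of pay-per-token spend in USD."""
    total_tokens = requests_per_day * tokens_per_request * days
    return total_tokens / 1_000_000 * usd_per_million_tokens

# A small side project: 200 requests/day at ~1,500 tokens each.
light = monthly_cost(200, 1_500, usd_per_million_tokens=5.0)

# The same project after a burst of traffic: 10x the requests.
heavy = monthly_cost(2_000, 1_500, usd_per_million_tokens=5.0)

print(f"${light:.2f} -> ${heavy:.2f}")  # prints "$45.00 -> $450.00"
```

The point isn't the exact numbers; it's that the bill scales linearly with usage, so a traffic spike translates directly into a 10x invoice with no cap.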
The consolidation trap
Here's the pattern I've been watching:
- Model labs release APIs directly (cheap, direct, low friction)
- Cloud hyperscalers (AWS, Azure, GCP) offer the same models as managed services
- Enterprise buyers prefer the managed service (IAM, compliance, one vendor)
- Model labs quietly deprioritize their direct API tier for individual developers
- Individual developers are left paying hyperscaler markup or switching models
We've seen this with Azure OpenAI (GPT-4 can work out cheaper through Azure for enterprises, but the setup is more complex for individuals). We're seeing it again with Bedrock.
OpenAI joining Bedrock is the clearest signal yet that OpenAI's primary customer is AWS enterprise procurement, not the individual developer.
What this means practically for your code
If you're currently using the OpenAI Python SDK directly:
```python
from openai import OpenAI

client = OpenAI(api_key="your-key")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
```
The Bedrock version of a comparable call (shown here with Claude, since OpenAI model IDs on Bedrock aren't live yet) requires:
```python
import boto3
import json

client = boto3.client("bedrock-runtime", region_name="us-east-1")
body = json.dumps({
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 1024,
    "anthropic_version": "bedrock-2023-05-31",
})
response = client.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",
    body=body,
)
result = json.loads(response["body"].read())
```
That's not a simplification. That's AWS-ification — more boilerplate, more IAM dependencies, more AWS lock-in.
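One defense against that lock-in is a thin routing layer so your call sites never depend on a specific provider's SDK. The sketch below is a minimal illustration of the idea, not a production library; the backend callables are stubs standing in for real OpenAI SDK and boto3 calls, so the routing logic stays testable without either dependency installed:

```python
# A minimal provider-agnostic chat interface (sketch, not a real library).
# Backends are plain callables mapping a prompt string to a reply string;
# the lambdas below are stubs where real SDK wrappers would go.
from typing import Callable, Dict

Backend = Callable[[str], str]

class ChatClient:
    """Route chat calls through a named backend, so swapping providers
    means changing one registration instead of every call site."""

    def __init__(self) -> None:
        self._backends: Dict[str, Backend] = {}

    def register(self, name: str, backend: Backend) -> None:
        self._backends[name] = backend

    def chat(self, backend: str, prompt: str) -> str:
        if backend not in self._backends:
            raise KeyError(f"no backend registered under {backend!r}")
        return self._backends[backend](prompt)

client = ChatClient()
client.register("openai", lambda prompt: f"[openai] {prompt}")
client.register("bedrock", lambda prompt: f"[bedrock] {prompt}")

print(client.chat("openai", "Hello"))   # prints "[openai] Hello"
print(client.chat("bedrock", "Hello"))  # prints "[bedrock] Hello"
```

In real code, the `"openai"` stub would wrap `client.chat.completions.create(...)` and the `"bedrock"` stub would wrap `invoke_model(...)` from the examples above; the rest of your application only ever sees `chat()`.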
The flat-rate alternative
The consolidation happening at the enterprise level is creating an opportunity at the individual developer level.
If your use case is:
- Personal projects
- Small team tools
- Side projects and experiments
- Building in markets where AWS billing is a barrier
...then a flat $2/month API-included tier makes more sense than pay-per-token Bedrock markup.
SimplyLouie runs on Anthropic's Claude API, charges a flat rate, accepts local payment methods in 10+ countries, and doesn't require an AWS account to get your first API call working.
For Filipino developers (₱112/month), Indonesian developers (Rp32,000/month), Indian developers (Rs165/month) — the Bedrock announcement changes nothing. AWS USD billing was never accessible anyway.
The deeper question
When OpenAI joined Bedrock, the headline was "more AI choice on AWS."
The real story is: AI infrastructure is consolidating into three hyperscalers, and the individual developer is being priced and complexity-moated out of the market.
The counter-trend — flat-rate, local-currency, low-friction AI APIs — is the most important thing happening in AI accessibility right now. It just doesn't get the press.
Building in Nigeria, Philippines, Indonesia, Kenya, India, or Brazil? SimplyLouie.com has local pricing and a 7-day free trial. No AWS account required.