OpenAI Codex CLI on Amazon Bedrock Models: Why Bother?
Here’s why pairing Codex with Amazon Bedrock models makes sense in some cases:
- Pay as you go: No fixed cost; you pay only for Bedrock tokens and Lambda invocations, with no monthly minimum. Amazon Q Developer CLI, by contrast, has a free-tier quota: once you breach it, you must upgrade to the US$19/month paid plan, which still enforces usage caps.
- Use your own fine-tuned models: Swap model endpoints easily; the gateway can even route to your own Amazon Bedrock fine-tunes (e.g. Nova) without friction.
- Transparent logging: Codex’s request/response logs give you full visibility — a plus for debugging and cost tracking.
- No AWS IAM/identity required, perfect for headless workloads: You only need your Bedrock Access Gateway API key; there is no inconvenient browser/console authentication with an AWS Identity Center user or Builder ID (great for CI/CD and ephemeral cloud instances).
- Regional flexibility: Yes, you could use Claude Code with Amazon Bedrock, but I live in Hong Kong, where Claude model usage is not allowed.
- Amazon Nova Micro, the price king: For pure, simple text LLM tasks, swapping Sonnet 4 for Nova Micro cuts costs by a factor of about 85.
- If you have a bunch of AWS credits from AWS events, you're covered for your usage of the gpt-oss / Nova families of models!
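To put the Nova Micro claim in rough numbers, here is a quick back-of-the-envelope check. The prices below are assumed on-demand input-token rates at the time of writing (US$3.00 per 1M input tokens for Claude Sonnet 4, US$0.035 per 1M for Nova Micro); verify them against the current Amazon Bedrock pricing page before relying on them:

```shell
# Assumed on-demand prices in USD per 1M input tokens (check the Bedrock pricing page):
sonnet_price=3.00      # Claude Sonnet 4
nova_micro_price=0.035 # Amazon Nova Micro

# Plain shell has no floating point, so use awk for the division.
awk -v s="$sonnet_price" -v n="$nova_micro_price" \
    'BEGIN { printf "Sonnet 4 input tokens cost ~%.1fx more than Nova Micro\n", s / n }'
```

Output-token prices differ by an even larger factor, so for simple text tasks the savings only get better.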
Setup: Codex CLI + Bedrock Gateway
Get your Lambda Gateway Function URL and API Key after deployment. (Check my earlier article for a step-by-step guide to get it running on Lambda via AWS SAM: https://dev.to/aws-builders/use-amazon-bedrock-models-via-an-openai-api-compatible-serverless-endpoint-now-without-fixed-cost-5hf5)
If you want to skip the article and deploy right away, here's a one-shot command:
(
cd /tmp && \
git clone --depth=1 https://github.com/gabrielkoo/bedrock-access-gateway-function-url && \
cd bedrock-access-gateway-function-url && \
./prepare_source.sh && \
sam build && \
sam deploy --guided
)
Now install Codex:
npm i -g @openai/codex
Configure Codex like so:
# ~/.codex/config.toml
profile = 'bedrock'
[profiles.bedrock]
model = 'openai.gpt-oss-120b-1:0'
# OR
# model = 'us.amazon.nova-premier-v1:0'
model_provider = 'bedrock'
model_reasoning_effort = "low"
[model_providers.bedrock]
name = 'bedrock'
base_url = 'https://RANDOM_HASH_HERE.lambda-url.AWS_REGION.on.aws/api/v1'
env_key = 'CODEX_OPENAI_API_KEY'
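Codex reads the gateway API key from the environment variable named by `env_key`, so export it before launching Codex. The value below is a placeholder; use whatever API key you chose when you deployed the Bedrock Access Gateway:

```shell
# Placeholder value: substitute the API key you configured during `sam deploy`.
# This is the gateway's own key, not an AWS credential.
export CODEX_OPENAI_API_KEY="your-gateway-api-key"

# Sanity check that the variable is visible to child processes such as codex
# (prints a confirmation rather than the secret itself).
[ -n "$CODEX_OPENAI_API_KEY" ] && echo "CODEX_OPENAI_API_KEY is set"
```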
Alternatively, if you want to stick to gpt-oss models only (rather than, e.g., the Claude/Nova model families), you can use the latest official OpenAI-compatible endpoint with an Amazon Bedrock API key instead; there is no need to host the Bedrock Access Gateway:
...
[model_providers.bedrock]
name = "AmazonBedrock"
base_url = "https://bedrock-runtime.us-west-2.amazonaws.com/openai/v1"
env_key = "ENV_KEY_FOR_YOUR_BEDROCK_API_KEY"
Query the LLM:
codex --profile bedrock "What is my public IP address?"
Model Support
Note that not all Bedrock models work over the gateway. Models must support tool calls.
- GPT OSS (20b/120b): Optimized for Codex.
- Nova family (Premier, Pro, Lite, Micro): All tested and working.
- Claude, Llama, Mistral, Command R: Working, subject to regional restrictions (e.g. Claude is unavailable in Hong Kong).
Amazon Q Developer CLI vs Codex CLI on Bedrock
Amazon Q Developer CLI is indeed officially supported in Hong Kong — but after your free usage (50 agentic chats/month), you'll need the $19/month paid plan, and may hit quotas even then.
Codex CLI via Amazon Bedrock gives unmetered usage (subject only to whatever quotas you have on Amazon Bedrock itself and Lambda), with no AWS login required: you just need the API key that you defined yourself when you deployed the Bedrock Access Gateway.
Why Don't I Just Use the new OpenAI Compatible Endpoint?
As covered in my other blog article, AWS Launches OpenAI-Compatible API for Bedrock (and I Did Some Tests!), the new OpenAI-compatible Amazon Bedrock API endpoint supports gpt-oss 20b and 120b out of the box, but other models like Nova or Claude are not supported.
So by wrapping the calls via a Bedrock Access Gateway, you can switch to any other supported model whenever you want.
Summary
Codex CLI + Amazon Bedrock (via an OpenAI-compatible gateway) gives developers pay-as-you-go agentic CLI usage, easy swapping of fine-tuned models, and a way around the region and pricing issues of other AWS or Anthropic tooling. For minimal cost, Nova Micro is unbeatable for text workloads. And yes, my serverless gateway solution is the backbone, but more about that in my previous blog!