DEV Community


🍳 From Convex's Kitchen to OpenRouter's Gateway: My Journey Integrating 200+ AI Models into Chef

How I replaced Chef's complex provider system with a single gateway to 200+ AI models


What is Chef?

Chef is Convex's AI coding assistant that goes beyond simple code completion. Unlike other tools that just suggest the next line, Chef understands your entire application context and can help build full-stack features from database schemas to UI components.

What sets Chef apart:

  • Real-time collaboration with AI on complete applications
  • Built-in deployment through Convex's infrastructure
  • Context-aware assistance that knows your project structure
  • Full-stack capabilities spanning frontend and backend

The big news? Convex recently open-sourced Chef, giving developers complete control over their AI development workflow.


Why Open-Sourcing Chef Matters

Convex's decision to open-source Chef wasn't just about sharing code. It represents a shift in how we think about AI development tools. Instead of being locked into a vendor's ecosystem, developers can now:

  • Control their infrastructure - deploy on their own terms
  • Customize the experience - modify the tool to fit their workflow
  • Choose their AI providers - no artificial limitations on model access
  • Contribute improvements - help shape the tool's future

This puts Chef alongside other successful open-source developer tools that have become industry standards through community adoption rather than corporate mandate.


The Problem: Provider Lock-In

Diving into Chef's codebase, I found it was built around four specific AI providers: OpenAI, Anthropic, Google, and xAI. Each provider required its own integration logic, API handling, and configuration. While these are excellent models, this approach had limitations.

The architecture meant that adding new providers required significant code changes. More importantly, it limited users to whatever models the Chef team had time to integrate. What about the dozens of other innovative models appearing constantly?

This led me to OpenRouter, which solves this problem elegantly. Instead of integrating with dozens of providers individually, OpenRouter provides a single API that routes to over 200 models from various providers:

  • Anthropic's Claude models (3.5 Sonnet, Haiku, Opus)
  • Meta's Llama family (including fine-tuned variants)
  • Google's Gemini series
  • Experimental models from smaller labs
  • Cost-optimized alternatives for different use cases

The Technical Implementation

I rebuilt Chef's provider system from scratch to work exclusively through OpenRouter. This required changes across the entire stack, from the database schema to the UI components.

Dynamic Model Discovery

The original Chef used hardcoded model lists. My implementation fetches available models dynamically:

```javascript
// Before: Static model list
const models = ['gpt-4', 'claude-3-sonnet', 'gemini-pro'];

// After: Dynamic API-driven discovery
const response = await fetch('https://openrouter.ai/api/v1/models');
const { data: models } = await response.json();
```

This means new models appear automatically without code changes.

*(Screenshot: selection of models)*

Real-time Model Sync

I implemented a caching system that:

  • Fetches model data every few hours via cron jobs
  • Stores model metadata in Convex for fast access
  • Provides a manual refresh button for instant updates
  • Handles API failures gracefully with cached fallbacks
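The core of that sync logic can be sketched as a small staleness check plus a fetch that falls back to the cache on failure. This is an illustrative sketch, not Chef's actual code: `ModelCache`, `syncModels`, and the TTL constant are names I'm using for this example.

```typescript
// Hypothetical sketch of cache-with-fallback model syncing.
interface ModelCache {
  models: string[];
  fetchedAt: number; // epoch ms
}

const CACHE_TTL_MS = 4 * 60 * 60 * 1000; // refresh every few hours

function isStale(cache: ModelCache | null, now: number): boolean {
  return cache === null || now - cache.fetchedAt > CACHE_TTL_MS;
}

// Returns fresh data when the fetch succeeds; on API failure,
// serves the stale cache rather than erroring out.
async function syncModels(
  cache: ModelCache | null,
  fetchModels: () => Promise<string[]>,
  now: number,
): Promise<ModelCache> {
  if (!isStale(cache, now)) return cache!;
  try {
    const models = await fetchModels();
    return { models, fetchedAt: now };
  } catch {
    if (cache) return cache;
    throw new Error('No cached models available');
  }
}
```

In the real integration this runs inside a Convex cron job, with the cache stored in a Convex table; the manual refresh button simply invokes the same sync with the TTL ignored.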

Flexible API Key Management

The system supports multiple authentication scenarios:

  • Environment variables for production deployments
  • User-provided API keys for personal accounts
  • Automatic fallback between user and system keys
  • Secure key validation before saving
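The fallback order reduces to a few lines. A minimal sketch, assuming the function names are mine and that OpenRouter keys carry the `sk-or-` prefix (true of current `sk-or-v1-...` keys, but worth re-checking against their docs):

```typescript
// Illustrative key resolution: prefer the user's personal key,
// fall back to the deployment-wide environment key.
function resolveApiKey(
  userKey: string | undefined,
  systemKey: string | undefined,
): string {
  const key = userKey?.trim() || systemKey?.trim();
  if (!key) {
    throw new Error('No OpenRouter API key configured');
  }
  return key;
}

// Lightweight format check before saving a user-provided key.
function looksLikeOpenRouterKey(key: string): boolean {
  return key.startsWith('sk-or-') && key.length > 10;
}
```

A format check like this only catches obvious mistakes; the actual validation in the integration makes a test request to OpenRouter before persisting the key.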

Enhanced User Experience

The UI now displays:

  • Real-time pricing information (cost per 1M tokens)
  • Model descriptions and capabilities
  • Search and filtering across 200+ models
  • Visual indicators for recommended models
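Search across 200+ models is a simple case-insensitive filter over the cached metadata. A sketch with an assumed `ModelInfo` shape (the field names here are illustrative, not Chef's actual types):

```typescript
interface ModelInfo {
  id: string;
  name: string;
  description: string;
  promptPricePerMTokens: number; // USD per 1M input tokens
}

// Case-insensitive search over id, name, and description.
// An empty query returns the full list.
function searchModels(models: ModelInfo[], query: string): ModelInfo[] {
  const q = query.trim().toLowerCase();
  if (!q) return models;
  return models.filter((m) =>
    [m.id, m.name, m.description].some((f) => f.toLowerCase().includes(q)),
  );
}
```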

Architecture Overview

The new system has four main components working together:

Model Fetcher: A Convex action that queries OpenRouter's API to discover available models and their metadata (pricing, context length, capabilities).
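The fetcher's main job is normalizing the API payload into the metadata Chef stores. A sketch, assuming the response shape OpenRouter's `/api/v1/models` endpoint currently documents (pricing reported as USD-per-token strings); the `ModelMetadata` type is my own naming:

```typescript
// Assumed shape of one entry in OpenRouter's /models response.
interface OpenRouterModel {
  id: string;
  name: string;
  context_length: number;
  pricing: { prompt: string; completion: string }; // USD per token, as strings
}

interface ModelMetadata {
  id: string;
  name: string;
  contextLength: number;
  promptPricePerMTokens: number;     // USD per 1M input tokens
  completionPricePerMTokens: number; // USD per 1M output tokens
}

// Convert per-token string prices into the per-1M-token numbers shown in the UI.
function toMetadata(m: OpenRouterModel): ModelMetadata {
  return {
    id: m.id,
    name: m.name,
    contextLength: m.context_length,
    promptPricePerMTokens: parseFloat(m.pricing.prompt) * 1_000_000,
    completionPricePerMTokens: parseFloat(m.pricing.completion) * 1_000_000,
  };
}
```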

Convex Cache: Stores model information locally to avoid API calls on every page load. This cache refreshes automatically and can be manually triggered.

Provider Adapter: Translates Chef's requests into OpenRouter's format. Since OpenRouter uses an OpenAI-compatible API, this was simpler than expected.

UI Components: Dynamic interfaces that let users browse, search, and select from hundreds of models with real-time pricing information.


Code Simplification

The difference in complexity is dramatic. Here's what provider management looked like before:

```javascript
// Managing multiple providers
if (provider === 'openai') { /* OpenAI logic */ }
else if (provider === 'anthropic') { /* Anthropic logic */ }
else if (provider === 'google') { /* Google logic */ }
// ... and so on
```

Now it's much cleaner:

```javascript
import { createOpenAI } from '@ai-sdk/openai';

// One provider, infinite possibilities
const provider = createOpenAI({
  baseURL: 'https://openrouter.ai/api/v1',
  apiKey: userApiKey || process.env.OPENROUTER_API_KEY,
});

const model = provider(selectedModelId); // Any model!
```

What This Enables

Model Flexibility: Developers can switch between models based on task requirements. Use GPT-4 for complex reasoning, Llama for simple tasks, or Claude for long-form content.

Cost Optimization: Real-time pricing comparison lets you choose the most cost-effective model for each use case. Some tasks don't need the most expensive model.
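With normalized per-1M-token prices in hand, cost comparison is a one-line reduction. A hypothetical helper (the function name and price map are illustrative, not part of Chef):

```typescript
// Given prompt prices in USD per 1M tokens, return the cheapest model id.
function cheapestModel(prices: Record<string, number>): string {
  return Object.entries(prices).reduce((best, cur) =>
    cur[1] < best[1] ? cur : best,
  )[0];
}
```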

Future-Proofing: New models appear automatically. When a provider releases something new on OpenRouter, it shows up in Chef without any code changes.

Experimentation: Easy access to experimental models from smaller labs lets developers try cutting-edge capabilities without complex integrations.


Try It Yourself

If you want to test this integration, here's how to get started:

Get the Code

```bash
git clone https://github.com/turazashvili/chef-openrouter.git
cd chef-openrouter
git checkout openrouter-integration
```

Set Up Your Environment

```bash
# Get your OpenRouter API key (free tier available)
echo "OPENROUTER_API_KEY=your_key_here" > .env.local

# Install and run
npm install
npm run dev
```

What You'll See

  • A model selector with 200+ options
  • Real-time pricing for each model
  • Instant model switching without restart
  • The same Chef experience with unlimited model choice

Why This Matters

This isn't just about having more model options. It represents a shift toward more open, flexible AI development tools.

Instead of being locked into whatever models a vendor chooses to support, developers get:

  • Transparent pricing across providers
  • Access to experimental and specialized models
  • The ability to optimize for different use cases
  • Freedom from vendor lock-in

Contributing

Since Chef is open-source, anyone can contribute. I'm planning to submit this OpenRouter integration as a pull request to the main repository. If it gets merged, every Chef user would have access to this expanded model selection.

The code is available on my fork if you want to review the implementation or suggest improvements. I tried to keep the changes minimal while maximizing the benefit.


What's Next

I'll be submitting this integration back to the main Chef repository soon. The goal is to give every Chef user access to the full spectrum of available AI models without the complexity of managing multiple providers.

Links:

  • GitHub: turazashvili/chef-openrouter (Chef with OpenRouter configuration)

  • Branch: openrouter-integration

Thanks to the Convex team for open-sourcing Chef, and to OpenRouter for creating such a useful abstraction layer. The combination makes AI development tools more accessible and flexible.
