Guillaume Sere
🚀 I Got Tired of AI APIs… So I Built My Own SDK in 24h (multi-ai-client)

Working with AI APIs is powerful… but let's be honest:

👉 it's also a mess.

  • OpenAI → one format
  • Claude → another
  • Gemini → completely different
  • Mistral → again something else

Every time you switch providers, you rewrite everything.

So yesterday, I decided to fix that.

🧠 The Idea

What if you could use one single API for every AI provider?
No matter whether it's:

  • OpenAI
  • Claude (Anthropic)
  • Gemini
  • Mistral

👉 Same code. Same interface. Same logic.

⚡ Meet multi-ai-client

A lightweight unified AI SDK for JavaScript & TypeScript.

🔥 The Goal

Instead of this 👇

// different code for each provider 😩

You do this 👇

import { AIClient } from "multi-ai-client"

const ai = new AIClient({
  apiKey: "YOUR_API_KEY",
  provider: "openai"
})

const res = await ai.chat("Explain AI simply")
console.log(res)

💥 Done.

🔌 Switching Providers = 1 Line

const ai = new AIClient({
  apiKey: "YOUR_API_KEY",
  provider: "anthropic"
})

👉 No rewrite. No headache.

⚡ Streaming (Real-Time Output)

Modern AI apps need streaming.
Here’s how simple it is:

for await (const chunk of ai.stream("Tell me a story")) {
  process.stdout.write(chunk.content)
}
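Under the hood, a unified stream() like this is naturally expressed as an async generator. Here's a minimal, self-contained sketch of the idea (the `fakeStream` function and `Chunk` type are hypothetical stand-ins, not the library's real internals):

```typescript
// Unified chunk shape, matching the { content, done } format shown below
type Chunk = { content: string; done: boolean };

// A toy async generator: yields one word at a time, flags the last chunk.
// A real provider adapter would yield chunks as they arrive over the network.
async function* fakeStream(text: string): AsyncGenerator<Chunk> {
  const words = text.split(" ");
  for (let i = 0; i < words.length; i++) {
    const isLast = i === words.length - 1;
    yield { content: words[i] + (isLast ? "" : " "), done: isLast };
  }
}

// Consumed exactly like ai.stream():
// for await (const chunk of fakeStream("Tell me a story")) { ... }
```

Because it's just an AsyncIterator, the same `for await` loop works no matter which provider produced the chunks.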

🧠 Clean & Unified Output

Every provider returns the same format:

{
  content: "Hello",
  done: false
}

👉 No parsing nightmares.
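To get that unified shape, each adapter has to map its provider's raw response onto `{ content, done }`. Here's a hedged sketch of what that normalization might look like; the raw chunk types below are loosely modeled on the providers' streaming formats and the function names are my own, not the library's actual code:

```typescript
// Hypothetical raw shapes, loosely modeled on each provider's streaming output
type OpenAIChunk = {
  choices: { delta: { content?: string }; finish_reason: string | null }[];
};
type AnthropicChunk = { type: string; delta?: { text: string } };

// The unified shape every provider gets mapped to
type UnifiedChunk = { content: string; done: boolean };

function fromOpenAI(chunk: OpenAIChunk): UnifiedChunk {
  const choice = chunk.choices[0];
  return {
    content: choice.delta.content ?? "",
    done: choice.finish_reason !== null,
  };
}

function fromAnthropic(chunk: AnthropicChunk): UnifiedChunk {
  return {
    content: chunk.delta?.text ?? "",
    done: chunk.type === "message_stop",
  };
}
```

The point: the mapping mess lives in the adapters once, instead of in every app you build.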

💡 Why This Matters

If you’ve built with AI APIs, you know:

  • switching providers = pain
  • maintaining multiple integrations = worse
  • streaming = inconsistent

👉 This SDK removes all that friction.

🚀 Key Features

  • 🤖 Multi-provider support (OpenAI, Claude, Gemini, Mistral)
  • ⚡ Unified API (chat() + stream())
  • 🔄 Real-time streaming (AsyncIterator)
  • 🧠 TypeScript support
  • 📦 Lightweight
  • 🔌 Extensible architecture

⚙️ Advanced Usage

await ai.chat("Explain AI", {
  temperature: 0.7,
  maxTokens: 500
})

🏗️ Under the Hood

AIClient 
├── OpenAI
├── Anthropic
├── Gemini
└── Mistral

👉 Each provider is isolated but exposed through one clean interface.
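One way to picture that isolation: each provider implements a small shared contract, and the client only ever talks to the contract. This is a hypothetical sketch of such a design (the `Provider` interface, `EchoProvider`, and `createProvider` are illustrative names, not the library's real internals):

```typescript
// The shared contract every provider adapter implements
interface Provider {
  chat(prompt: string): Promise<string>;
}

// A mock adapter standing in for a real one (OpenAI, Anthropic, ...)
class EchoProvider implements Provider {
  async chat(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

// Adding a provider = writing one adapter and registering it here
const registry: Record<string, () => Provider> = {
  echo: () => new EchoProvider(),
};

function createProvider(name: string): Provider {
  const factory = registry[name];
  if (!factory) throw new Error(`Unknown provider: ${name}`);
  return factory();
}
```

This is also what makes the architecture extensible: new providers slot in without touching the public API.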

🔥 Why I Built This

I wanted something:

  • simple
  • fast
  • clean
  • reusable across projects

👉 Basically: a tool I wouldn't hate using.

📦 Try It Yourself

npm install multi-ai-client

🧪 Real Use Case

Switch providers dynamically:

const providers = ["openai", "anthropic", "gemini"]

for (const provider of providers) {
  const ai = new AIClient({ apiKey: "...", provider })
  console.log(await ai.chat("Explain quantum computing"))
}

🔮 What’s Coming Next

  • smarter streaming (token-level)
  • retry system
  • rate limiting
  • plugin system
  • caching
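Since retries are on the roadmap rather than shipped, here's only a sketch of one way a retry helper could work, with exponential backoff (the `withRetry` name and signature are my own assumption, not the library's API):

```typescript
// Retry a flaky async call with exponential backoff.
// attempts = max tries; baseDelayMs doubles after each failure.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 200
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off before the next attempt: 200 ms, 400 ms, 800 ms, ...
      if (i < attempts - 1) {
        await new Promise(res => setTimeout(res, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}

// Usage sketch: withRetry(() => ai.chat("Explain AI"))
```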

💬 Final Thought

AI is evolving fast.

But the developer experience? Not always.

👉 multi-ai-client is my attempt to fix that.

npm: https://www.npmjs.com/package/multi-ai-client
GitHub: https://github.com/GuillaumeSere/multi-ai-client

If you find this useful, feedback is welcome 🙌
