Hugues Chocart
Abso: The Universal TypeScript SDK for AI - One Interface, All LLMs 🐠

Abso banner image

Ever wished you could switch between AI providers without rewriting your code? Meet Abso, an innovative TypeScript SDK that unifies multiple LLM providers behind a single, OpenAI-compatible interface. Whether you're using GPT-4, Claude, or any other LLM, Abso keeps your code clean, typed, and future-proof.

⭐️ If you like type-safe AI integration, consider starring our repo to support the project! ⭐️

Why Developers Love Abso

  • Write Once, Run Anywhere 🎯: Switch between OpenAI, Anthropic, or any other provider with a single line change. Format differences (for example with tool calls) are handled for you automatically.
  • 100% Type-Safe 🛡️: Full TypeScript support with auto-completion for all methods, requests, and responses
  • Lightweight & Blazing Fast 🪶: Zero unnecessary dependencies, minimal overhead
  • Streaming support 🌊: Both event-based and async iteration modes out of the box
  • Extensible by Design 🔌: Easily add new providers as they emerge
  • Built-in Embeddings support 🔍: Ready for semantic search and advanced text analysis (see the sketch after this list)
  • Smart Token Management 📊: Accurate token counting and cost estimation included
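
For example, embeddings can power semantic search. The snippet below is a minimal sketch: because Abso mirrors the OpenAI interface, it assumes an abso.embeddings.create method and an illustrative model name, so check the repo for the exact signature.

import { abso } from "abso-ai";

// Hypothetical sketch: the method and model names assume an
// OpenAI-compatible embeddings API and may differ in Abso itself.
const embeddings = await abso.embeddings.create({
  model: "text-embedding-3-small",
  input: ["Abso unifies LLM providers", "One interface, all LLMs"],
});

console.log(embeddings.data[0].embedding.length); // dimensionality of the vector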

Getting Started in 30 Seconds

First, install Abso:

npm install abso-ai

Then start chatting with AI:

import { abso } from "abso-ai";

const result = await abso.chat.create({
  messages: [{ role: "user", content: "What's the meaning of life?" }],
  model: "gpt-4",
});

console.log(result.choices[0].message.content);
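
Because responses follow the OpenAI format, token usage is typically available on the result as well; the field names below are a sketch that assumes the standard OpenAI response shape, so verify them against Abso's typings.

// Assumes an OpenAI-style usage object on the response.
console.log(result.usage?.prompt_tokens, result.usage?.completion_tokens);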

Advanced Features

Provider Selection

Explicitly choose your preferred provider:

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Hello!" }],
  model: "openai/gpt-4",
  provider: "openrouter",
});
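
You can also switch providers by changing only the model identifier, as in the quick-start example. The snippet below is a sketch that assumes Abso routes Claude models to Anthropic automatically; the model name is illustrative, and you can pass an explicit provider if needed.

const result = await abso.chat.create({
  messages: [{ role: "user", content: "Hello!" }],
  // Illustrative model name; routing to Anthropic is assumed here.
  model: "claude-3-5-sonnet-20241022",
});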

Streaming Support

Get real-time responses:

const stream = await abso.chat.stream({
  messages: [{ role: "user", content: "Tell me a story..." }],
  model: "gpt-4",
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0].delta.content || "");
}
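
Instead of only printing, a common pattern is to also accumulate the chunks into the full reply. This variant of the loop above relies only on the delta shape already shown.

let fullText = "";
for await (const chunk of stream) {
  const delta = chunk.choices[0].delta.content || "";
  fullText += delta;           // build the complete reply
  process.stdout.write(delta); // and still print tokens as they arrive
}
console.log("\nDone:", fullText.length, "characters");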

Why Star ⭐️ Abso?

  1. Stay Updated: Be the first to know about new features and providers
  2. Support Open Source: Help us make AI integration easier for everyone
  3. Join the Community: Connect with other developers building with AI
  4. Shape the Future: Your star helps us understand what the community needs

Coming Soon

  • More providers and models
  • Advanced caching mechanisms
  • Cost optimization features
  • Enhanced error handling
  • And much more!

If you find Abso useful, please consider giving us a ⭐️ on GitHub. It helps us know what the community needs and motivates us to keep improving!

Top comments (1)

Vince Lwt

Yes! Abso is really cool 🔥