Dariush Abbasi
GoAI - A Clean, Multi-Provider LLM Client for Go

If you’re building AI-powered features in Go, you’ve probably noticed a pattern:

Every LLM provider has slightly different APIs, request formats, error handling, and configuration styles. Switching models often means rewriting glue code instead of focusing on your product.

That’s where GoAI comes in.

GoAI is a lightweight Go library that gives you one unified interface for chatting with multiple LLM providers — without pulling in heavy dependencies or hiding important details.


What is GoAI?

GoAI is a minimal, dependency-free Go library for building chat-based applications with modern LLMs.

Its design goals are simple:

  • One clean API across providers
  • Native Go patterns (context, errors, options)
  • No external dependencies
  • Easy to extend

If you like libraries that do one thing well, this one will feel right at home.


Supported LLM Providers

Out of the box, GoAI supports a growing list of popular providers:

  • OpenAI (GPT-4o, GPT-4-Turbo, etc.)
  • Anthropic (Claude)
  • Google Gemini
  • xAI (Grok)
  • Mistral
  • Perplexity
  • Ollama (for local models)

Switching providers is mostly a configuration change, not a rewrite.


Installation

```
go get github.com/dariubs/goai
```

That’s it — no transitive dependency explosion.


Basic Usage

Here’s how simple it is to send a chat prompt:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/dariubs/goai"
)

func main() {
	client, err := goai.New(
		"openai",
		"gpt-4o",
		goai.WithAPIKey(os.Getenv("OPENAI_API_KEY")),
	)
	if err != nil {
		log.Fatal(err)
	}

	ctx := context.Background()
	response, err := client.Chat(ctx, "Hello from Go!")
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(response.Content)
}
```

The same Chat call works across providers — only the provider name and model change.


Why GoAI Instead of Provider SDKs?

1. A Consistent API

Every provider has its own quirks. GoAI normalizes them into a single, predictable interface.

2. First-Class Context Support

Timeouts, cancellations, and request lifetimes are handled the Go way — via context.Context.

3. Typed Errors

You’re not stuck string-matching error messages. GoAI exposes structured errors you can reason about.

4. Zero Dependencies

No HTTP wrappers, no magic middleware, no surprises. Just the Go standard library.


Configuration via Options

GoAI uses functional options for configuration:

```go
client, _ := goai.New(
    "anthropic",
    "claude-3-opus",
    goai.WithAPIKey(os.Getenv("ANTHROPIC_API_KEY")),
    goai.WithTimeout(30*time.Second),
    goai.WithTemperature(0.7),
)
```

Clean, explicit, and easy to extend.


Great Fit For

  • AI-powered Go services
  • Prototyping across multiple LLMs
  • Switching providers without refactors
  • Internal tools and CLIs
  • Local + cloud model workflows (via Ollama)

Extending GoAI

Want to add a new provider?

The codebase is intentionally small and readable. Implement the provider interface, plug it in, and you’re good to go. No framework archaeology required.


Final Thoughts

GoAI doesn’t try to be clever — and that’s its strength.

If you’re a Go developer who wants:

  • control instead of magic,
  • clarity instead of abstractions,
  • and flexibility without lock-in,

then GoAI is worth a look.

GoAI on GitHub
