
MUHAMMAD USMAN AWAN

The Switzerland of AI Tooling: Inside TanStack AI’s Bold New Approach

TanStack AI: The Open-Source, Framework-Agnostic AI SDK Developers Have Been Waiting For

Introduction

TanStack AI is a new open-source SDK designed to help developers build AI-powered applications in any JavaScript environment. Whether you’re using React, Solid, a vanilla JS setup, or even a backend framework like Node, PHP, or Python, TanStack AI plugs into your existing stack without forcing you into a new platform or workflow.

Announced in late 2025 (currently in alpha), TanStack AI aims to solve a major issue in the AI ecosystem today: vendor lock-in. Instead of pushing developers into a walled garden, TanStack AI brands itself as the “Switzerland of AI tooling”—neutral, transparent, and fully open-source. It provides a type-safe, unified interface across providers like OpenAI, Anthropic, Google Gemini, and Ollama, giving developers total flexibility and control.


Why TanStack AI?

No Vendor Lock-In

Most AI platforms tie you tightly to a single provider. TanStack AI does the opposite: it lets you connect directly to the providers you choose. No middlemen. No platform fees. No proprietary formats. No enforced migrations.

TanStack’s philosophy is simple:
Developers deserve control—not upsells or hidden constraints.


Core Features

1. Multi-Provider Support & Framework Agnostic

TanStack AI works with major LLM providers out of the box and lets you switch models at runtime with zero code changes. The SDK is fully environment-agnostic:

  • Works in the browser or Node
  • Works with React, Solid, Vue, Next.js, Express, Remix, or TanStack Start
  • Supports multiple languages, including experimental PHP and Python adapters
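Because provider choice is just a value, runtime switching can be as simple as looking a factory up in a map. The sketch below is conceptual, not the real TanStack AI API: the stub adapters stand in for the real imports from packages like `@tanstack/ai-openai` and `@tanstack/ai-ollama`.

```typescript
// Conceptual sketch: selecting a provider adapter at runtime from config.
// The stubs below stand in for the real adapter packages.

type Adapter = { provider: string; complete: (prompt: string) => string };

// Stand-ins for the real openai() / ollama() adapter factories.
const openai = (): Adapter => ({ provider: "openai", complete: (p) => `[openai] ${p}` });
const ollama = (): Adapter => ({ provider: "ollama", complete: (p) => `[ollama] ${p}` });

const adapters: Record<string, () => Adapter> = { openai, ollama };

// Swapping providers becomes a config change, not a code change.
function getAdapter(name: string): Adapter {
  const factory = adapters[name];
  if (!factory) throw new Error(`Unknown provider: ${name}`);
  return factory();
}
```

Reading the provider name from an environment variable, e.g. `getAdapter(process.env.AI_PROVIDER ?? "openai")`, keeps the rest of the code identical across providers.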

2. End-to-End Type Safety

Every function, model option, and tool definition is strongly typed using TypeScript and Zod. Your IDE understands exactly what your inputs and outputs should look like, catching errors at compile time instead of during production.

3. Isomorphic Tools & Automatic Execution

One of the standout innovations is TanStack AI’s isomorphic tool system:

  • Define a tool once with toolDefinition()
  • Attach .server() or .client() implementations
  • The SDK automatically executes the correct version during a chat session
  • Built-in agentic loops mean the LLM can call tools autonomously
  • Optional approval workflows let users confirm tool calls for safety

This turns AI interactions into dynamic, interactive workflows without manual glue code.
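The pattern above can be modeled in a few lines. This is a self-contained sketch of the isomorphic-tool idea, not TanStack AI's actual implementation (the real `toolDefinition()` adds Zod schemas, streaming, and approval hooks); it only shows "define once, attach an environment-specific implementation".

```typescript
// Minimal model of the isomorphic-tool pattern: one definition, with a
// server-side or client-side implementation attached later.

type ToolImpl<I, O> = (input: I) => O;

function toolDefinition<I, O>(name: string) {
  return {
    name,
    server(fn: ToolImpl<I, O>) {
      return { name, side: "server" as const, run: fn };
    },
    client(fn: ToolImpl<I, O>) {
      return { name, side: "client" as const, run: fn };
    },
  };
}

// Define the tool's contract once...
const getTimeDef = toolDefinition<{ tz: string }, string>("getTime");

// ...then attach the implementation for the environment it runs in.
const getTimeServer = getTimeDef.server(({ tz }) => `server time for ${tz}`);
```

In the real SDK, the chat runtime inspects which side's implementation exists and dispatches the call accordingly.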

4. Real-Time Streaming & Multimodal Output

TanStack AI includes first-class support for streaming:

  • Receive partial tokens immediately
  • View intermediate “thinking” tokens for debugging
  • Stream multiple content types (text, code, images, audio, JSON, etc.)

This creates fast, interactive chat experiences with maximum transparency.
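Consuming a token stream typically means iterating an async iterable and appending partial tokens as they arrive. The generator below is a stand-in for illustration; it is not TanStack AI's actual stream type.

```typescript
// Stand-in for a model token stream: yields partial tokens in order.
async function* fakeTokenStream(): AsyncGenerator<string> {
  for (const token of ["Hello", ", ", "world", "!"]) {
    yield token; // a real stream yields tokens as the model produces them
  }
}

// Append each partial token to the rendered output as it arrives.
async function renderStream(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const token of stream) {
    text += token; // in a UI, this is where you'd update state
  }
  return text;
}
```

In a React client the same loop would drive state updates, so the user sees the reply grow token by token.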

5. AI DevTools for Debugging

Built on TanStack DevTools, this panel provides:

  • Full message inspection
  • Tool call tracing
  • Token streaming visualizations
  • Replay capabilities

It’s one of the most advanced debugging tools in the AI library ecosystem today.

6. Pure Open Source Ecosystem

TanStack AI is not a hosted service. It’s a collection of libraries maintained by the TanStack community and backed by sponsors like Cloudflare and Prisma. You own your data and vendor choices.


Integration with the TanStack Ecosystem

React, Solid, and Framework-Agnostic Clients

TanStack AI provides official packages such as:

  • @tanstack/ai-react → with a powerful useChat hook
  • @tanstack/ai-solid → Solid-friendly chat management
  • @tanstack/ai-client → Vanilla JS and framework-agnostic

These make rendering chat UIs and managing AI state extremely simple.

Server-Side Flexibility

The server layer is fully agnostic and works with:

  • Node / TypeScript
  • PHP (Slim)
  • Python (FastAPI)
  • Next.js / TanStack Start / Express

Adapters handle protocol details so you can focus on building features.

Examples and Starter Projects

The TanStack team provides real-world examples, including:

  • Full-stack React & Solid apps using TanStack Start
  • PHP Slim + vanilla JS integrations
  • Python FastAPI backend examples
  • Multi-user group chat with websockets

These examples serve as excellent blueprints for your own applications.

Adapters and Extensibility

First-party adapters include:

  • @tanstack/ai-openai
  • @tanstack/ai-anthropic
  • @tanstack/ai-gemini
  • @tanstack/ai-ollama

If you use a niche or self-hosted model, building a custom adapter is straightforward.
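The essence of an adapter is normalizing one provider's request/response shapes behind a common interface. The sketch below is hypothetical (the real TanStack AI adapter contract may differ) and only illustrates that idea; `myLocalModel` and its echo behavior are invented for the example.

```typescript
// Hypothetical shape of a custom adapter for a self-hosted model.

interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

interface CustomAdapter {
  name: string;
  // Translate the SDK's message format into the provider's request,
  // then translate the provider's reply back into a ChatMessage.
  send: (messages: ChatMessage[]) => Promise<ChatMessage>;
}

function myLocalModel(baseUrl: string): CustomAdapter {
  return {
    name: "my-local-model",
    async send(messages) {
      // A real adapter would POST to `${baseUrl}/chat` and parse the
      // provider-specific response; here we just echo the last message.
      const last = messages[messages.length - 1];
      return { role: "assistant", content: `echo: ${last.content}` };
    },
  };
}
```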


Getting Started

Installation Example (React + OpenAI)

npm install @tanstack/ai @tanstack/ai-react @tanstack/ai-openai

Basic Server Chat Example

import { chat, toStreamResponse } from "@tanstack/ai";
import { openai } from "@tanstack/ai-openai";

export async function POST(req: Request) {
  const { messages, conversationId } = await req.json();
  const stream = chat({
    adapter: openai(),
    messages,
    model: "gpt-4o"
  });
  return toStreamResponse(stream);
}

Basic React Client Example

import { useChat, fetchServerSentEvents } from "@tanstack/ai-react";

function Chat() {
  const { messages, sendMessage } = useChat({
    connection: fetchServerSentEvents("/api/chat")
  });
  // ...
}

With these two pieces, you already have a fully working streaming chat.


Defining and Using Tools

What Are Tools?

Tools are functions the LLM can call during a conversation—like searching a database, fetching products, or running calculations.

Declaring a Tool

const getProductsDef = toolDefinition({
  name: 'getProducts',
  inputSchema: z.object({ query: z.string() }),
  outputSchema: z.array(z.object({ id: z.string(), name: z.string() })),
});

Server Implementation

const getProducts = getProductsDef.server(async ({ query }) => {
  return await db.products.search(query);
});

Using Tools in Chat

chat({
  model: "gpt-4o",
  messages: [...],
  tools: [getProducts]
});

When the AI triggers the tool, the SDK executes it automatically—no manual routing needed.


Advanced Capabilities

Streaming & Real-Time Updates

Stream tokens, reasoning steps, and even non-text payloads as they’re generated.

Agentic Flows & Approval

Enable multi-step workflows where the LLM:

  1. Calls tools
  2. Evaluates results
  3. Decides the next action

Add user approval for safer execution.
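An approval gate boils down to pausing before execution and letting the user confirm or reject. This sketch models that flow generically; it is not the actual TanStack AI approval API, and the function names are illustrative.

```typescript
// Generic approval gate: ask before executing a tool call.

type ToolCall = { tool: string; input: unknown };

async function runWithApproval(
  call: ToolCall,
  approve: (call: ToolCall) => Promise<boolean>, // e.g. a UI confirm dialog
  execute: (call: ToolCall) => Promise<string>,
): Promise<string> {
  if (!(await approve(call))) {
    // Rejected calls never reach the tool implementation.
    return `Tool call "${call.tool}" rejected by user.`;
  }
  return execute(call);
}
```

In an agentic loop, the rejection message can be fed back to the LLM so it can choose a different action.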

Multimodal Outputs

Handle text, images, audio, video, code blocks, JSON, and more—all in a type-safe way.

Summarization & Embeddings

Server libraries include convenient helpers for common NLP tasks like summarization or embedding generation.
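Whatever helper produces the embeddings, comparing them afterwards is plain math. Assuming the helpers return numeric vectors, a common choice is cosine similarity:

```typescript
// Cosine similarity between two embedding vectors: 1 = same direction,
// 0 = orthogonal (unrelated), -1 = opposite.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Ranking documents by similarity to a query embedding is the core of the RAG use case mentioned later in this article.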

AI DevTools

Diagnose tool calls, watch token streams, and replay events for debugging.


Key Differentiators

1. 100% Open Source

No fees, no platform lock-in, no hosted middle layer.

2. Multi-Provider Freedom

Switch from OpenAI to Anthropic, Gemini, or a self-hosted model by simply swapping adapters.

3. Strong Type Safety

TypeScript + Zod = incredibly robust, predictable API interactions.

4. Deep Framework Integrations

Use React, Solid, or any environment without rewriting your app.

5. Community-Driven

Built by the creators of TanStack Query and Router—trusted names in the JS ecosystem.


Real-World Use Cases

Chatbots and Assistants

Build full UI chat experiences, customer support bots, or internal assistants.

AI-Augmented Web Apps

Enhance existing apps with product search, recommendations, or automated help.

RAG & Data Processing

Use embeddings, summarization, or document retrieval with fully typed tools.

Multi-User Apps

Create collaborative group assistants or AI-enhanced team tools.

Extend Existing Backends

Expose REST APIs or databases as tools the AI can call.


Open Source and Community

The TanStack AI project is hosted publicly on GitHub with an open invitation for contributions. The team encourages developers to give feedback, suggest features, or contribute adapters.

The ecosystem includes:

  • GitHub repository
  • Active Discord community
  • Backing from major tech companies
  • Frequent updates during the alpha phase

By adopting TanStack AI, you’re joining a vibrant open-source effort shaping the future of developer-friendly AI tooling.


Conclusion

TanStack AI is a major step toward transparent, flexible, and open AI development. With its type-safe tooling, framework-agnostic design, and multi-provider support, it offers a refreshing alternative in a landscape dominated by proprietary platforms.

Even in alpha, it’s remarkably capable—and now is the perfect time to experiment, provide feedback, or contribute.


Next Steps

  • Read the Quick Start Guide
  • Explore the Tools Guide for advanced workflows
  • Clone the example apps
  • Join the TanStack Discord
  • Contribute to the GitHub repo if you have ideas or want to help

Whether you’re building a chatbot, automating internal processes, or adding AI features to an existing app, TanStack AI is designed to fit seamlessly into your workflow.

const signOffMessage = {
  title: "Thanks for reading! 🙌",
  vibe: "Until next time, 🫡",
  author: {
    name: "Usman Awan",
    tagline: "your friendly dev 🚀",
  },
  message: "Stay curious and keep building!",
};

export default signOffMessage;
