Van1s1mys
I Built an AI-Powered Router That Understands What Your Users Mean (Not What They Type)

What if your search bar could understand "how do I reach support?" and route to /contact - without a single keyword rule?

I built AI Router - an open-source library that runs a HuggingFace embedding model inside a Web Worker in the browser. No API keys, no backend, no network latency. Just semantic understanding.

Live demo: ai-router-search.vercel.app
GitHub: IvanMalkS/ai-router


The Problem

Classic search in SPAs is keyword-based. User types "pricing", you match it to /pricing. Easy.

But what happens when they type:

  • "how much does it cost?"
  • "can I get a free trial?"
  • "tariffs and plans"

None of these have the word "pricing" in them. Keyword search returns nothing. User bounces.

The Solution: Embeddings in the Browser

AI Router turns every query into a vector embedding - basically a numerical representation of its meaning. Then it compares that meaning against your routes using cosine similarity.
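To make "comparing meanings" concrete, here's a minimal cosine-similarity sketch. The toy 3-dimensional vectors below are illustrative only - real embedding models output hundreds of dimensions, and these numbers aren't from the library:

```typescript
// Cosine similarity: how closely two embedding vectors point in the
// same direction. ~1.0 = same meaning, ~0 = unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy "embeddings" (hypothetical values for illustration):
const pricingQuery = [0.9, 0.1, 0.2]; // "how much does it cost?"
const pricingRoute = [0.8, 0.2, 0.3]; // /pricing description
const contactRoute = [0.1, 0.9, 0.1]; // /contact description

console.log(cosineSimilarity(pricingQuery, pricingRoute)); // high - good match
console.log(cosineSimilarity(pricingQuery, contactRoute)); // low - rejected
```

The query never has to share a single word with the route description - only a direction in embedding space.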

The cool part - the model runs entirely client-side in a Web Worker. Main thread stays free, zero server cost, and it works offline after the first model download (~22 MB, cached).

import { SmartRouter } from '@van1s1mys/ai-router';

const router = new SmartRouter({
  routes: [
    { path: '/pricing', title: 'Pricing', description: 'cost, plans, subscription' },
    { path: '/contact', title: 'Contact', description: 'support, phone, address' },
    { path: '/docs',    title: 'Docs',    description: 'documentation, API, guides' },
  ],
  threshold: 0.5,
});

await router.ready; // model loads once, cached after

const result = await router.search('how much does it cost?');
// { path: '/pricing', score: 0.87 }

That's it. A few lines of config and your search understands natural language.
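The threshold option acts as a cutoff: the best-scoring route only wins if its similarity clears it, otherwise you get no match. Here's a sketch of that gating logic - pickBestMatch is a hypothetical helper for illustration, not the library's internals:

```typescript
interface Match {
  path: string;
  score: number;
}

// Return the best-scoring route, or null if nothing clears the threshold.
function pickBestMatch(scores: Match[], threshold: number): Match | null {
  const best = scores.reduce((a, b) => (b.score > a.score ? b : a));
  return best.score >= threshold ? best : null;
}

const candidates = [
  { path: '/pricing', score: 0.87 },
  { path: '/docs', score: 0.31 },
];

console.log(pickBestMatch(candidates, 0.5)); // { path: '/pricing', score: 0.87 }
console.log(pickBestMatch([{ path: '/docs', score: 0.31 }], 0.5)); // null
```

Tune the threshold down for more forgiving matching, up to avoid sending users to a wrong-but-vaguely-similar page.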

It Even Handles Typos

Type "pricng" or "contcat" - the router still finds the right page. Embeddings capture meaning at a deeper level than characters, so misspellings don't break anything.

Try it yourself in the live demo - intentionally make typos and watch it still work.

How It Works Under the Hood

User query
    |
    v
[Web Worker] --> HuggingFace Transformers --> embedding vector
    |
    v
[Orama] --> hybrid search (text + vector) --> ranked results
    |
    v
Best match above threshold --> { path, score }
  1. SmartRouter spawns a Web Worker on init
  2. The worker loads an ONNX model via @huggingface/transformers
  3. Your routes get indexed in Orama - both as text and as vectors
  4. On search(), the query gets embedded and compared against all routes
  5. Best match above threshold is returned

Everything runs in the worker thread. The main thread only sends and receives messages.
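That main-thread/worker split can be sketched as a small message protocol. The message shapes, the WorkerLike interface, and embedInWorker below are illustrative assumptions, not AI Router's actual internal API:

```typescript
// Hypothetical request/response messages between main thread and worker.
type WorkerRequest =
  | { type: 'init'; modelId: string }
  | { type: 'embed'; id: number; text: string };

type WorkerResponse =
  | { type: 'ready' }
  | { type: 'embedding'; id: number; vector: number[] };

// Minimal structural type so the sketch also compiles outside the browser.
interface WorkerLike {
  addEventListener(type: 'message', fn: (e: { data: WorkerResponse }) => void): void;
  removeEventListener(type: 'message', fn: (e: { data: WorkerResponse }) => void): void;
  postMessage(msg: WorkerRequest): void;
}

// Main thread: post a query, resolve when the matching embedding comes back.
// A correlation id pairs each request with its response, so several
// searches can be in flight at once without blocking anything.
function embedInWorker(worker: WorkerLike, id: number, text: string): Promise<number[]> {
  return new Promise((resolve) => {
    const onMessage = (e: { data: WorkerResponse }) => {
      if (e.data.type === 'embedding' && e.data.id === id) {
        worker.removeEventListener('message', onMessage);
        resolve(e.data.vector);
      }
    };
    worker.addEventListener('message', onMessage);
    worker.postMessage({ type: 'embed', id, text });
  });
}
```

The heavy part - running the ONNX model - never touches the main thread, so the UI stays responsive even while a query is being embedded.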

Progressive Model Loading

Start fast, get better in the background:

const router = new SmartRouter({
  routes,
  model: ['Xenova/all-MiniLM-L6-v2', 'Xenova/multilingual-e5-small'],
  onModelUpgrade: (modelId) => console.log(`Upgraded to ${modelId}`),
});

await router.ready; // first model ready - search works immediately
// second model loads in the background, seamlessly upgrades

First model is small and fast. Second one is more accurate. Users get instant results while the better model loads behind the scenes.

Framework Plugins

Don't want to list routes manually? Plugins auto-scan your pages directory:

# Vite / SvelteKit / Vue
npm install @van1s1mys/ai-router-plugin-vite

# Next.js
npm install @van1s1mys/ai-router-plugin-next

# Webpack / CRA
npm install @van1s1mys/ai-router-plugin-webpack
// vite.config.ts
import { aiRouter } from '@van1s1mys/ai-router-plugin-vite';

export default defineConfig({
  plugins: [aiRouter()],
});

Routes are generated at build time from your file structure. Zero config.
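The core idea of file-based route generation can be sketched in a few lines. fileToRoute is a hypothetical helper for illustration - the real plugins' scanning rules and output format may differ:

```typescript
// Map a page file to a route entry: strip the pages/ prefix and the
// extension, collapse index files to their directory, derive a title.
function fileToRoute(file: string): { path: string; title: string } {
  const clean = file
    .replace(/^pages\//, '')
    .replace(/\.(tsx?|jsx?|vue|svelte)$/, '')
    .replace(/(^|\/)index$/, '');
  const name = clean.split('/').pop() || 'Home';
  return {
    path: '/' + clean,
    title: name.charAt(0).toUpperCase() + name.slice(1),
  };
}

console.log(fileToRoute('pages/pricing.tsx'));    // { path: '/pricing', title: 'Pricing' }
console.log(fileToRoute('pages/index.tsx'));      // { path: '/', title: 'Home' }
console.log(fileToRoute('pages/docs/index.tsx')); // { path: '/docs', title: 'Docs' }
```

Running something like this over the file tree at build time is what lets the plugins ship with zero config.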

SSR? Covered

AI Router is SSR-safe out of the box. On the server, ready resolves immediately and search() returns null. The Web Worker only spawns in the browser. Works with Next.js, Nuxt, Remix - whatever you use.
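The usual pattern behind this kind of SSR safety is an environment check before touching any browser-only API. A minimal sketch - SafeSearch is a hypothetical class mirroring the documented behavior, not the library's actual source:

```typescript
// Detect a browser environment before touching Web Worker APIs.
const isBrowser = typeof window !== 'undefined' && typeof Worker !== 'undefined';

class SafeSearch {
  ready: Promise<void>;

  constructor() {
    // On the server: resolve immediately, never spawn a worker.
    this.ready = isBrowser ? this.initWorker() : Promise.resolve();
  }

  private async initWorker(): Promise<void> {
    // new Worker(...), model loading, etc. would happen here (browser only)
  }

  async search(query: string): Promise<{ path: string; score: number } | null> {
    if (!isBrowser) return null; // matches the documented server behavior
    // ...real embedding + similarity search would run here in the browser
    return null;
  }
}
```

Because ready and search() degrade gracefully instead of throwing, server-rendered components can call them unconditionally without environment checks of their own.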

The Stack

  • @huggingface/transformers - model inference in the browser
  • Orama - hybrid text + vector search
  • Web Workers - off-main-thread execution
  • ONNX Runtime - model execution engine
  • TypeScript - full type safety
  • tsup - bundling

Try It

npm install @van1s1mys/ai-router

Live Demo | GitHub | npm | Docs

If you ever wished your SPA search was smarter - give it a shot. Stars on GitHub are always welcome, and I'd love to hear feedback or feature ideas in the issues.


What's your approach to search in SPAs? Ever tried client-side AI? Let me know in the comments!
