Rahul Nagarwal
Stop the Scroll: Build a "Psychic" Client-Side Recommender 🚀

We've all been there: staring at a dropdown with 50+ options, wishing the app would just know what we want. Usually, we solve this with a heavy backend search API. But what if you could build a lightning-fast, "psychic" recommendation engine entirely on the client side?

Here's how to build a smart template recommender that anticipates user needs with zero latency.


The Core Logic: Context is King 👑

Most search bars look at the whole text. To make recommendations feel intentional, we focus only on the intent. We extract the first line or sentence the user types, strip out the "noise" (stop words like the, a, with), and break it into tokens.

If a user types: "Create a new React component for the login page," our engine sees: ["create", "react", "component", "login"] (tokens are lowercased before matching).

The Brain: The Bitap Algorithm 🧠

To handle typos and partial matches, we use Fuse.js. Under the hood, it relies on the Bitap algorithm, which encodes the search pattern as bitmasks and finds matches within a configurable "fuzziness" threshold. Treating text as bit patterns rather than plain strings makes it fast enough to run on every keystroke, entirely client-side.
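Fuse.js handles all of this internally, but the core bitmasking idea (the Shift-And variant of Bitap) can be sketched in a few lines. This toy version only finds exact matches; the real algorithm adds an extra bit row per allowed error to tolerate typos:

```typescript
// Toy Shift-And (Bitap) exact matcher — a sketch of the bitmask idea only.
// Returns the start index of the first match, or -1 if none.
function bitapExact(text: string, pattern: string): number {
  const m = pattern.length;
  if (m === 0 || m > 31) return -1; // pattern must fit in a 32-bit mask

  // Precompute one bitmask per character: bit i is set if pattern[i] === ch
  const masks = new Map<string, number>();
  for (let i = 0; i < m; i++) {
    masks.set(pattern[i], (masks.get(pattern[i]) ?? 0) | (1 << i));
  }

  let state = 0; // bit i set => pattern[0..i] matches the text ending here
  for (let j = 0; j < text.length; j++) {
    state = ((state << 1) | 1) & (masks.get(text[j]) ?? 0);
    if (state & (1 << (m - 1))) return j - m + 1; // full pattern matched
  }
  return -1;
}
```

Each character of text costs only a shift, an OR, and an AND, which is why Bitap-based matching stays cheap even inside a typing loop.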

Ranking: Hits vs. Quality

A single "perfect match" isn't always the best result. We use a two-tiered scoring system to rank templates:

  1. Hit Count (Quantity): How many search tokens matched the template name?
  2. Average Score (Quality): How "fuzzy" were those matches?

The Scoring Formula

We calculate the relevance of a template using:
Average Score = Sum(Bitap Scores) / Hit Count
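As a quick worked example (with made-up scores — in Fuse.js, lower is better, 0 being a perfect match):

```typescript
// Suppose one template matched two search tokens with Bitap scores 0.1 and 0.3
const bitapScores = [0.1, 0.3];
const hitCount = bitapScores.length;
// Average Score = Sum(Bitap Scores) / Hit Count => 0.4 / 2 = 0.2
const averageScore = bitapScores.reduce((sum, s) => sum + s, 0) / hitCount;
```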

The Ranking Priority:

  1. Higher Hit Count always wins (matching "React" and "Component" is better than just matching "React" perfectly).
  2. Lower Average Score acts as the tie-breaker for quality.
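This two-tier priority maps directly onto a sort comparator. A sketch (the `Ranked` shape and field names here are illustrative, mirroring the accumulator used in the implementation below):

```typescript
// Illustrative shape for a scored template entry
interface Ranked {
  hitCount: number; // how many search tokens matched (higher is better)
  avg: number;      // average Bitap score (lower is better)
}

function compareRanked(a: Ranked, b: Ranked): number {
  // Tier 1: more token hits always outrank fewer, regardless of fuzziness
  if (a.hitCount !== b.hitCount) return b.hitCount - a.hitCount;
  // Tier 2: tie-break on quality — lower average score wins
  return a.avg - b.avg;
}
```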

Clean TypeScript Implementation

import Fuse from 'fuse.js';

interface Template {
  id: string;
  name: string;
}

// "Noise" words to strip from the user's input before matching
const STOP_WORDS = new Set(['the', 'a', 'an', 'for', 'with', 'new', 'to']);

export const getRecommendations = (input: string, list: Template[]): Template[] => {
  // Extract intent: first line/sentence only, minus stop words and short words
  const tokens = input.split(/[\n.!?;]/)[0].toLowerCase()
    .split(/\s+/)
    .filter(w => w.length > 2 && !STOP_WORDS.has(w));

  const fuse = new Fuse(list, {
    keys: ['name'],
    threshold: 0.45, // Bitap fuzziness threshold (0 = exact, 1 = match anything)
    includeScore: true
  });

  // Accumulate per-template hit count and total Bitap score across all tokens
  const scoreMap = new Map<string, { item: Template; totalScore: number; hitCount: number }>();

  tokens.forEach(token => {
    fuse.search(token).forEach(({ item, score }) => {
      const entry = scoreMap.get(item.id) ?? { item, totalScore: 0, hitCount: 0 };
      entry.totalScore += score ?? 1; // lower score = better match
      entry.hitCount += 1;
      scoreMap.set(item.id, entry);
    });
  });

  return Array.from(scoreMap.values())
    .map(m => ({ ...m, avg: m.totalScore / m.hitCount }))
    .filter(m => m.avg < 0.4)                                  // drop low-quality matches
    .sort((a, b) => (b.hitCount - a.hitCount) || (a.avg - b.avg)) // hits first, then quality
    .slice(0, 5)
    .map(m => m.item);
};

Why This Works

  • Zero Latency: No API calls means the UI updates as fast as the user types.
  • Deduplication: Move the top 5 matches to a "Recommended" section and hide them from the main list to keep the UI clean.
  • Privacy: No user data ever leaves the browser.

By combining Bitap-powered fuzzy matching with a "hit-heavy" ranking logic, you create a UX that feels less like a tool and more like an assistant.
