Arjun
I Built a Social Platform Where You Can Post in Hindi and Someone in Tokyo Reads It in Japanese — Here's How

🌐 The Problem That Kept Me Up at Night

Picture this.

You discover a Reddit thread about traditional cooking techniques. The most insightful comment — bar none — is buried at the bottom. You almost miss it.

It's written in Portuguese.

The person clearly knows what they're talking about. They're detailed, nuanced, passionate. But only the roughly 3% of Reddit users who read Portuguese will ever see it. The other 97% scrolled past without even knowing what they missed.

That comment deserved 10,000 upvotes. It got 2.

This happens billions of times a day across every social platform on the internet.

Here's the uncomfortable truth: only about 25.9% of internet users speak English as their primary language, yet English accounts for over 60% of online content. The global conversation we think we're having? It's actually a very exclusive English-speaking club.

I couldn't stop thinking about this. So I built EchoBoard — a social discussion platform where language is no longer a barrier to being heard.

Write in Hindi. Someone in Tokyo reads it in Japanese. A student in Paris sees it in French. A developer in São Paulo reads it in Portuguese.

Real-time. Zero friction. No Google Translate copy-paste.

Here's exactly how I built it — and what you can steal for your own projects.


💡 What You'll Build By Reading This

By the end of this post, you'll understand:

  • How to build a 3-API fallback translation chain that degrades gracefully instead of failing
  • How to detect a user's language from their post content automatically
  • How to build real-time, language-aware feeds using React + Supabase
  • How to layer i18next UI localization on top of dynamic content translation
  • The architectural decisions I made (and the ones I regret)

🛠️ The Tech Stack

Frontend:    React 19 + Vite 7
Styling:     Tailwind CSS 4
Auth:        Clerk (Google + GitHub sign-in)
Database:    Supabase (PostgreSQL + Realtime)
Translation: MyMemory → LibreTranslate → Lingva (fallback chain)
i18n:        i18next + react-i18next
Icons:       Lucide React
Routing:     React Router v7
Deploy:      Vercel (free tier)

The most critical architectural decision was the translation system. Let me explain why I chose three APIs instead of one — and why that turned out to be the best call I made.


🌍 The Translation Engine: Why a 3-API Fallback Chain?

When I started, I naively assumed one translation API would be enough. I was wrong.

Here's the reality of free/low-cost translation APIs:

| API | Free tier | Reliability | Coverage |
| --- | --- | --- | --- |
| MyMemory | 5,000 chars/day | High | 70+ languages |
| LibreTranslate | Self-hostable | Medium | 30+ languages |
| Lingva | Unlimited (Google Translate proxy) | Medium | 100+ languages |

None of them are perfect alone. But chained together? Rock solid.

Here's the hook I built — useTranslationEngine.js:

// src/hooks/useTranslationEngine.js
import { useState, useCallback } from 'react';

const APIS = [
  {
    name: 'MyMemory',
    translate: async (text, sourceLang, targetLang) => {
      const url = `https://api.mymemory.translated.net/get?q=${encodeURIComponent(text)}&langpair=${sourceLang}|${targetLang}`;
      const res = await fetch(url);
      const data = await res.json();

      if (data.responseStatus !== 200) throw new Error('MyMemory failed');
      return data.responseData.translatedText;
    }
  },
  {
    name: 'LibreTranslate',
    translate: async (text, sourceLang, targetLang) => {
      // Heads-up: the public libretranslate.com endpoint may require an API key;
      // self-hosting LibreTranslate avoids that.
      const res = await fetch('https://libretranslate.com/translate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          q: text,
          source: sourceLang,
          target: targetLang,
          format: 'text'
        })
      });
      const data = await res.json();
      if (data.error) throw new Error('LibreTranslate failed');
      return data.translatedText;
    }
  },
  {
    name: 'Lingva',
    translate: async (text, sourceLang, targetLang) => {
      const url = `https://lingva.ml/api/v1/${sourceLang}/${targetLang}/${encodeURIComponent(text)}`;
      const res = await fetch(url);
      const data = await res.json();
      if (!data.translation) throw new Error('Lingva failed');
      return data.translation;
    }
  }
];

export function useTranslationEngine() {
  const [isTranslating, setIsTranslating] = useState(false);
  const [translationSource, setTranslationSource] = useState(null);

  const translate = useCallback(async (text, sourceLang, targetLang) => {
    // No translation needed if same language
    if (sourceLang === targetLang || !text?.trim()) return text;

    setIsTranslating(true);

    for (const api of APIS) {
      try {
        console.log(`[Translation] Trying ${api.name}...`);
        const result = await api.translate(text, sourceLang, targetLang);
        setTranslationSource(api.name);
        setIsTranslating(false);
        return result;
      } catch (err) {
        console.warn(`[Translation] ${api.name} failed, trying next...`, err.message);
      }
    }

    // All APIs failed — fall back to the untranslated original text
    setIsTranslating(false);
    setTranslationSource(null);
    return text;
  }, []);

  return { translate, isTranslating, translationSource };
}

The magic is in the for loop. If MyMemory hits its daily limit, we silently fall through to LibreTranslate. If LibreTranslate is down, we hit Lingva. The user never sees an error. They just see translated content.

💡 Pro Tip: Log translationSource to your analytics. If you see Lingva firing too often, you might be hitting MyMemory's daily limit — a sign you need to upgrade or add a fourth API.
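That telemetry can be as small as a counter. A minimal sketch (the function name and shape are mine, not from the EchoBoard codebase):

```javascript
// Count which API in the chain actually served each translation.
// If Lingva's share starts climbing, MyMemory's quota is likely exhausted.
const sourceCounts = { MyMemory: 0, LibreTranslate: 0, Lingva: 0, failed: 0 };

function recordTranslationSource(source) {
  const key = source in sourceCounts ? source : 'failed';
  sourceCounts[key] += 1;
  return { ...sourceCounts }; // snapshot, ready to flush to your analytics sink
}
```

Call it right after `setTranslationSource(api.name)` succeeds (and once with `null` when the whole chain fails), then periodically ship the snapshot wherever your analytics live.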


🗄️ The Database Schema That Makes It Work

The hardest architectural decision: where do you store translations?

I chose to store the original post and generate translations on-the-fly at read time, rather than pre-translating into every language. Here's why:

  1. Storage efficiency — You don't know which languages your users will need
  2. Accuracy — You always translate from the original, not from a previous translation
  3. Flexibility — Add a new UI language? No re-migration needed

Here's the core schema I ran in Supabase:

-- Posts table — stores original content only
CREATE TABLE posts (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  title_original TEXT,
  body_original TEXT,
  author_id TEXT REFERENCES profiles(id),
  community_id UUID REFERENCES communities(id),
  locale TEXT DEFAULT 'en',       -- The language the post was written in
  upvotes INTEGER DEFAULT 0,
  comment_count INTEGER DEFAULT 0,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

-- Comments table — same pattern
CREATE TABLE comments (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  post_id UUID REFERENCES posts(id),
  author_id TEXT REFERENCES profiles(id),
  body_original TEXT,
  locale TEXT DEFAULT 'en',      -- The language the comment was written in
  created_at TIMESTAMPTZ DEFAULT NOW()
);

The locale column on each post is the key. When user A posts in Hindi (hi) and user B is viewing in Japanese (ja), the feed component knows to call translate(post.body_original, 'hi', 'ja').


⚡ The Feed Component: Real-Time + Multilingual

Here's a simplified version of the feed post card that handles translation, language detection, and the "See Original" toggle:

// src/components/PostCard.jsx
import { useState, useEffect } from 'react';
import { useTranslation } from 'react-i18next';
import { useTranslationEngine } from '../hooks/useTranslationEngine';

export function PostCard({ post }) {
  const { i18n } = useTranslation();
  const { translate, isTranslating } = useTranslationEngine();

  const [displayTitle, setDisplayTitle] = useState(post.title_original);
  const [displayBody, setDisplayBody] = useState(post.body_original);
  const [showOriginal, setShowOriginal] = useState(false);
  const [isTranslated, setIsTranslated] = useState(false);

  const userLocale = i18n.language;       // e.g., 'ja'
  const postLocale = post.locale || 'en'; // e.g., 'hi'

  useEffect(() => {
    const needsTranslation = userLocale !== postLocale;

    if (!needsTranslation || showOriginal) {
      setDisplayTitle(post.title_original);
      setDisplayBody(post.body_original);
      setIsTranslated(false);
      return;
    }

    async function fetchTranslation() {
      const [translatedTitle, translatedBody] = await Promise.all([
        translate(post.title_original, postLocale, userLocale),
        translate(post.body_original, postLocale, userLocale),
      ]);

      setDisplayTitle(translatedTitle);
      setDisplayBody(translatedBody);
      setIsTranslated(true);
    }

    fetchTranslation();
  }, [userLocale, postLocale, showOriginal, post, translate]); // translate is stable via useCallback

  return (
    <div className="bg-white dark:bg-gray-900 rounded-xl p-6 border border-gray-200 dark:border-gray-800">
      {isTranslating && (
        <div className="flex items-center gap-2 text-sm text-blue-500 mb-3">
          <div className="w-3 h-3 border-2 border-blue-500 border-t-transparent rounded-full animate-spin" />
          Translating...
        </div>
      )}

      <h2 className="text-xl font-bold text-gray-900 dark:text-white mb-2">
        {displayTitle}
      </h2>

      <p className="text-gray-600 dark:text-gray-300 leading-relaxed">
        {displayBody}
      </p>

      {isTranslated && (
        <button
          onClick={() => setShowOriginal(prev => !prev)}
          className="mt-3 text-sm text-blue-500 hover:underline flex items-center gap-1"
        >
          🌐 {showOriginal ? 'See Translation' : `See Original (${postLocale.toUpperCase()})`}
        </button>
      )}
    </div>
  );
}

Notice Promise.all for translating title and body simultaneously. This cuts translation wait time roughly in half compared to doing them sequentially.

💡 Pro Tip: Memoize your translation results in a useRef or sessionStorage keyed by ${postId}-${sourceLang}-${targetLang}. On a busy feed, the same post could get translated multiple times as users scroll up and down — caching eliminates redundant API calls.
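Here's what that cache might look like as a plain wrapper — a sketch with made-up names (`translateWithCache` isn't in the codebase), using an in-memory Map; swap in sessionStorage if you want it to survive reloads:

```javascript
// In-memory translation cache keyed by `${postId}-${sourceLang}-${targetLang}`.
const translationCache = new Map();

// Wraps any translate(text, source, target) function with caching,
// so repeat views of the same post skip the API chain entirely.
async function translateWithCache(translate, postId, text, sourceLang, targetLang) {
  const key = `${postId}-${sourceLang}-${targetLang}`;
  if (translationCache.has(key)) return translationCache.get(key);

  const result = await translate(text, sourceLang, targetLang);
  translationCache.set(key, result);
  return result;
}
```

Keying by post ID plus the language pair matters: the same post viewed by a Japanese reader and a French reader are two distinct cache entries.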


🌐 Two-Layer i18n: UI vs. Content

This is where most people get confused. EchoBoard has two completely separate translation systems:

| Layer | What it translates | How |
| --- | --- | --- |
| UI layer | Buttons, menus, labels, navigation | i18next + JSON locale files |
| Content layer | Posts, comments, bios | Translation API chain at runtime |

The UI layer uses standard i18next. Here's the setup:

// src/i18n.js
import i18n from 'i18next';
import { initReactI18next } from 'react-i18next';
import LanguageDetector from 'i18next-browser-languagedetector';

// Import all 9 language files
import en from './locales/en.json';
import es from './locales/es.json';
import fr from './locales/fr.json';
import de from './locales/de.json';
import ja from './locales/ja.json';
import ko from './locales/ko.json';
import zh from './locales/zh.json';
import it from './locales/it.json';
import hi from './locales/hi.json';

i18n
  .use(LanguageDetector)
  .use(initReactI18next)
  .init({
    resources: { en, es, fr, de, ja, ko, zh, it, hi }, // each JSON file must be shaped { "translation": { ... } }
    fallbackLng: 'en',
    interpolation: { escapeValue: false },
    detection: {
      order: ['localStorage', 'navigator'],
      caches: ['localStorage'],
    },
  });

export default i18n;

When a user switches from English to Japanese, i18n.changeLanguage('ja') fires two cascading effects:

  1. All UI strings (nav, buttons, labels) immediately render in Japanese via i18next
  2. All feed posts re-trigger their useEffect and fetch new translations from the API chain

The result feels instant because the UI switches immediately, while content translation loads progressively.
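One detail worth a helper (hypothetical — this function isn't from the codebase): browsers report full BCP-47 tags like `ja-JP` or `pt-BR`, while both the i18next resource keys and the translation APIs above expect bare two-letter codes. A normalizer keeps the two layers in agreement:

```javascript
// Locales the UI ships strings for (matches the i18n setup above).
const SUPPORTED_UI_LOCALES = ['en', 'es', 'fr', 'de', 'ja', 'ko', 'zh', 'it', 'hi'];

// Reduce a BCP-47 tag ('ja-JP', 'pt-BR') to a supported base code,
// falling back to English for languages the UI doesn't cover.
function normalizeLocale(browserLocale, fallback = 'en') {
  if (!browserLocale) return fallback;
  const base = browserLocale.toLowerCase().split('-')[0];
  return SUPPORTED_UI_LOCALES.includes(base) ? base : fallback;
}
```

Run `navigator.language` through this before passing it to `i18n.changeLanguage` or the translate calls, and you never hand `pt-BR` to an API that only knows `pt`.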


🔥 The Trending System

Trending isn't just "most upvotes." Time-decay matters. A post from a week ago with 500 upvotes shouldn't outrank a post from this morning with 50.

Here's the Supabase query — it approximates decay with a time-window filter rather than a true weighted score:

// src/lib/trending.js
export async function fetchTrendingPosts(filter = 'week', supabase) {
  const timeFilters = {
    today: new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString(),
    week:  new Date(Date.now() - 7  * 24 * 60 * 60 * 1000).toISOString(),
    all:   null,
  };

  let query = supabase
    .from('posts')
    .select('*, profiles(username, avatar_url), communities(name)')
    .order('upvotes', { ascending: false })
    .limit(25);

  const since = timeFilters[filter];
  if (since) {
    query = query.gte('created_at', since);
  }

  const { data, error } = await query;
  if (error) throw error;
  return data;
}

The filter tabs (Today / This Week / All Time) just change the filter parameter. Clean, simple, effective.
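If you want actual decay rather than a window, a Reddit-style hot score is a small client-side addition. This is a sketch, not what ships in EchoBoard — `hotScore` and the gravity constant are mine, and the constant is a knob to tune, not a law:

```javascript
// Log-scaled votes divided by an age penalty: fresh posts with modest
// votes can outrank stale posts with huge vote counts.
function hotScore(upvotes, createdAt, gravity = 1.8) {
  const ageHours = (Date.now() - new Date(createdAt).getTime()) / 36e5;
  return Math.log10(Math.max(upvotes, 1)) / Math.pow(ageHours + 2, gravity);
}

// Re-rank the rows returned by fetchTrendingPosts without mutating them.
function rankByHotScore(posts) {
  return [...posts].sort(
    (a, b) => hotScore(b.upvotes, b.created_at) - hotScore(a.upvotes, a.created_at)
  );
}
```

With this, the week-old 500-upvote post from the example above genuinely loses to this morning's 50-upvote post.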


🏘️ Communities with Real-Time Subscriptions

Supabase Realtime is genuinely one of the most underrated features in the modern stack. When someone posts in a community you've joined, you get a notification — in real time — with zero polling.

// src/hooks/useCommunityRealtime.js
import { useEffect } from 'react';
import { supabase } from '../lib/supabase';

export function useCommunityRealtime(communityId, onNewPost) {
  useEffect(() => {
    if (!communityId) return;

    const channel = supabase
      .channel(`community:${communityId}`)
      .on(
        'postgres_changes',
        {
          event: 'INSERT',
          schema: 'public',
          table: 'posts',
          filter: `community_id=eq.${communityId}`
        },
        (payload) => {
          console.log('[Realtime] New post in community:', payload.new);
          onNewPost(payload.new);
        }
      )
      .subscribe();

    return () => {
      supabase.removeChannel(channel);
    };
  }, [communityId, onNewPost]);
}

When this fires, we add a "New posts available — click to refresh" banner at the top of the feed, just like Reddit. No jarring full-page refreshes.
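The banner logic is easy to get wrong if you splice realtime posts straight into the feed (content jumps under the reader's thumb). A tiny buffer, sketched here with assumed names, keeps new posts pending until the user asks for them:

```javascript
// Buffer incoming realtime posts; flush only when the user clicks the banner.
function createNewPostBuffer() {
  let pending = [];
  return {
    add(post) { pending.push(post); },   // call from the onNewPost callback
    get count() { return pending.length; }, // drives the banner label
    flush() {                            // call on banner click
      const out = pending;
      pending = [];
      return out;
    },
  };
}
```

Wire `buffer.add` into `useCommunityRealtime`'s `onNewPost`, render the banner whenever `buffer.count > 0`, and prepend `buffer.flush()` to the feed on click.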


🔔 The Notification System

Notifications track three events: upvotes, comments, and new community posts. Here's the schema:

CREATE TABLE notifications (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  user_id TEXT NOT NULL,
  type TEXT NOT NULL,          -- 'upvote' | 'comment' | 'community_post'
  post_id UUID REFERENCES posts(id),
  actor_id TEXT,               -- who triggered it
  read BOOLEAN DEFAULT FALSE,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

A Supabase database trigger inserts a notification row whenever a vote is added or a comment is created. The frontend subscribes to notifications table changes filtered by user_id, updating the badge count in real time.
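The post doesn't include the trigger itself, so here's a sketch of what the comment branch could look like, reusing the column names from the schemas above (the function and trigger names are my own):

```sql
-- Sketch of a comment-notification trigger (assumed, not the actual one).
-- Notifies the post's author whenever someone else comments on their post.
CREATE OR REPLACE FUNCTION notify_on_comment() RETURNS TRIGGER AS $$
BEGIN
  INSERT INTO notifications (user_id, type, post_id, actor_id)
  SELECT p.author_id, 'comment', NEW.post_id, NEW.author_id
  FROM posts p
  WHERE p.id = NEW.post_id
    AND p.author_id <> NEW.author_id;  -- don't notify yourself
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER on_comment_insert
  AFTER INSERT ON comments
  FOR EACH ROW EXECUTE FUNCTION notify_on_comment();
```

The upvote branch would follow the same pattern against whatever table records votes.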

// Simplified notification subscription
const channel = supabase
  .channel('my-notifications')
  .on(
    'postgres_changes',
    {
      event: 'INSERT',
      schema: 'public',
      table: 'notifications',
      filter: `user_id=eq.${currentUserId}`
    },
    (payload) => {
      setUnreadCount(prev => prev + 1);
      setNotifications(prev => [payload.new, ...prev]);
    }
  )
  .subscribe();

💡 Lessons Learned (The Hard Way)

1. Translate on read, not on write

I briefly considered pre-translating every post into all 9 languages on creation. With 9 target languages and two fields per post (title and body), that's 18 translation calls up front for every post — up to three times that when fallbacks fire — almost all of it for languages nobody may ever read the post in. I backed out of that immediately. On-demand translation at read time is the right call.

2. Always store the original locale with the content

If you don't store locale: 'hi' on the post, you have no idea what language to translate from. You'd have to detect it on every read — expensive and often inaccurate for short text.
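To make the "detect on write" point concrete, here's a toy heuristic (not EchoBoard's actual detector — a real app would reach for a proper detection library): languages with distinctive scripts are easy to spot by Unicode range, which is also exactly why short text is hard — en, es, fr, de, and it are indistinguishable at the script level.

```javascript
// Script-based locale guesser. Order matters: Japanese kana is checked
// before CJK ideographs, since kanji alone would also match the zh range.
const SCRIPT_RANGES = [
  { locale: 'hi', regex: /[\u0900-\u097F]/ }, // Devanagari
  { locale: 'ja', regex: /[\u3040-\u30FF]/ }, // Hiragana + Katakana
  { locale: 'ko', regex: /[\uAC00-\uD7AF]/ }, // Hangul syllables
  { locale: 'zh', regex: /[\u4E00-\u9FFF]/ }, // CJK ideographs
];

function guessLocale(text, fallback = 'en') {
  for (const { locale, regex } of SCRIPT_RANGES) {
    if (regex.test(text)) return locale;
  }
  return fallback; // all Latin-script languages collapse to the fallback
}
```

This is good enough to pre-fill the post's `locale` field for Hindi, Japanese, Korean, and Chinese writers — and it shows why you still want the user to be able to override the guess.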

3. Rate limit translation calls aggressively

During testing I accidentally translated the same posts 400 times in an afternoon. Add a useRef cache. Seriously.

4. Dark mode first, light mode second

Building dark mode as an afterthought is painful. I committed to dark mode as the default from day one and added light mode as an opt-in toggle. Every CSS decision was cleaner because of this.

5. Mobile-first navigation changed everything

The desktop sidebar nav I designed first was beautiful. It was also completely unusable on mobile. The bottom tab bar I ended up building for mobile was so much cleaner that I actually prefer it to the desktop version now.


📁 Project Structure

Echo-Board/
├── src/
│   ├── hooks/
│   │   └── useTranslationEngine.js   ← The 3-API fallback chain
│   ├── lib/
│   │   └── supabase.js
│   ├── locales/                      ← 9 language JSON files
│   │   ├── en.json
│   │   ├── ja.json
│   │   └── ...
│   ├── App.jsx
│   ├── FeedPage.jsx                  ← Main multilingual feed
│   ├── TrendingPage.jsx
│   ├── CommunitiesPage.jsx
│   ├── NotificationsPage.jsx
│   ├── SavedPostsPage.jsx
│   └── i18n.js                      ← i18next config
├── vercel.json                       ← SPA routing fix
└── package.json

🚀 Deploy to Vercel in 5 Minutes

1. Environment Variables

VITE_SUPABASE_URL=your_supabase_project_url
VITE_SUPABASE_ANON_KEY=your_supabase_anon_key
VITE_CLERK_PUBLISHABLE_KEY=your_clerk_publishable_key

2. Fix SPA Routing

Create vercel.json at the root — without this, any route other than / returns 404 on refresh:

{
  "rewrites": [{ "source": "/(.*)", "destination": "/index.html" }]
}

3. After Deploying

Add your Vercel domain (e.g., echo-board.vercel.app) to:

  • Clerk Dashboard → Allowed Origins
  • Supabase Dashboard → Auth → URL Configuration → Redirect URLs

That's it. Total deploy time: under 5 minutes.


📊 What This Unlocks (The Product Case)

Here's the thing nobody talks about when they build multilingual features: it's not just a UX improvement. It's a fundamental market expansion.

The numbers below are my rough projections, not measured results:

| Metric | English-only | Multilingual |
| --- | --- | --- |
| Addressable users | 25.9% of the internet | 99%+ of the internet |
| Average session duration | Baseline | 2–3x longer (users feel at home) |
| Community contribution rate | Baseline | Significantly higher (people post in their native tongue) |
| Return visits | Baseline | Higher (content feels personal) |

A farmer in rural Maharashtra has ideas about sustainable agriculture that agricultural researchers in Amsterdam would genuinely benefit from. A poet in Kyoto has something to say that a student in Buenos Aires would feel deeply. They're just not saying it in English.

EchoBoard gives them a mic. And makes sure everyone can hear it.


🎯 What's Next

Here's what's on the roadmap:

  • Language detection on compose — detect what language you're writing in as you type and show a small flag indicator, so you know EchoBoard understood you
  • Translation quality feedback — a simple thumbs up/down on each translation that feeds back into API selection logic
  • Cached translations in Supabase — store popular post translations so the same content isn't re-translated on every view
  • Audio translation — post a voice note in Hindi, listeners hear it in their language



🏁 Final Thoughts

The internet was supposed to connect everyone. But somewhere along the way, it became a mostly English-speaking club with a few translation buttons bolted on as an afterthought.

The next great idea might come from someone who has never written a tweet in English. The next breakthrough conversation in your industry might happen in Mandarin, or Arabic, or Swahili — and you'd never know, because it was never translated.

EchoBoard is a small bet that borderless conversation is possible. And based on what I've seen while building it, I think it's a bet worth making.

If you're building something multilingual, or you want to steal anything from this codebase — please do. That's what it's here for.

Drop a comment below if you have questions about the translation architecture, the Supabase setup, or anything else. I'll answer every one.


Built with React 19, Supabase, Clerk, Tailwind CSS 4, and a genuine belief that every voice deserves to be heard.

Tags: #react #javascript #webdev #tutorial #opensource #i18n #supabase #buildinpublic
