DEV Community

Rajae

Building C24 Club: The Architecture Behind a Rewards-Powered Video Chat Platform!

Random video chat looks simple from the outside: click a button, meet a stranger. But the moment you add a live rewards economy, gender-aware matchmaking, AI moderation, and a global concurrent user base, the engineering challenge becomes very different from a basic Omegle clone.

In this post, we walk through the high-level architecture powering C24 Club, a browser-based video chat platform where users earn real, redeemable rewards for every minute they spend connecting with strangers.


What Is C24 Club?

C24 Club is a browser-based random video chat platform with one major twist: every minute you talk earns reward minutes that can be redeemed for gift cards, designer products, and PayPal cash.

The platform is built around four pillars:

⚡ Sub-second matchmaking during scheduled call windows
🔒 End-to-end encrypted P2P video (DTLS-SRTP, no server in the media path)
💰 A real-time rewards economy with atomic earning, gifting, and cash-out
🛡️ Client-side AI moderation that protects users without storing video

High-Level Architecture

C24 Club follows a serverless-first, edge-native architecture. Instead of running long-lived Node servers, almost every backend operation is a Supabase Edge Function (Deno) deployed close to the user.

Core Design Principles

  • Serverless edge compute: no idle infrastructure costs
  • Database as the source of truth: Postgres + RLS replaces a custom auth layer
  • Polling over WebSockets: chosen for matchmaking reliability (more on this below)
  • Peer-to-peer (P2P) media: the backend never touches video or audio
  • Atomic database functions: for any value-changing operation (earnings, gifts, cash-out)

Tech Stack Breakdown

1️⃣ Frontend: React + Vite, Built for Speed

The entire user-facing app is a single-page React application optimized for the video call hot path.

  • React 18 + Vite 5 + TypeScript 5: instant HMR in dev, tiny production bundles
  • Tailwind CSS + shadcn/ui: semantic design tokens (HSL-based) drive consistent theming across light/dark
  • TanStack Query: a caching layer that keeps Discover feeds, leaderboards, and minute balances fresh without hammering the database
  • React Router: preserves the active WebRTC connection across navigation to Reward Store, Profile, and Discover (a critical UX requirement; leaving the page would drop the call)

Client-Side Intelligence

We do as much work in the browser as possible to protect privacy and reduce server cost:

  • NSFWJS runs locally on remote video frames with a 0.75 nudity threshold to flag policy violations; no video ever leaves the peer connection.
  • Black-screen detection runs locally to prevent users from gaming the rewards system.
  • A pre-blur shield hides the partner feed for the first second of every match, preventing flash exposures.
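To make the black-screen check concrete, here is a minimal sketch of it as a pure function, assuming the caller samples per-pixel luminance (0–255) from a canvas snapshot of the video. The thresholds are illustrative, not C24 Club's actual tuning:

```typescript
// Decide whether a sampled video frame is "black", i.e. the user has covered
// or disabled their camera while trying to keep earning minutes.
// Thresholds are illustrative assumptions.
function isBlackFrame(
  luminances: number[],
  maxMeanLuma = 10,     // frame must be almost uniformly dark on average
  maxBrightRatio = 0.02 // and contain almost no bright pixels
): boolean {
  if (luminances.length === 0) return true;
  const mean = luminances.reduce((a, b) => a + b, 0) / luminances.length;
  const brightRatio =
    luminances.filter((l) => l > 30).length / luminances.length;
  return mean < maxMeanLuma && brightRatio < maxBrightRatio;
}
```

Because the check runs on the client, the server only ever sees the resulting flag, never the frame.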

2️⃣ Backend: Lightweight Edge Signaling

Our backend never touches a single video frame. Its job is coordination, persistence, and money.

Supabase Edge Functions (Deno)

We have ~70 edge functions handling everything from videocall-match (the matchmaking brain) to gift-minutes (Stripe-backed gifting) to nsfw-ban (automatic enforcement). Each function:

  • Boots cold in <100 ms
  • Reads only the secrets it needs
  • Authenticates via Supabase JWT
  • Returns in well under 1 second for the user-facing hot paths.
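The auth step above boils down to a small guard each function runs before doing any work. A sketch (not the real C24 Club code) of extracting the Supabase JWT from the Authorization header:

```typescript
// Pull the bearer token out of a request's headers; the JWT is then verified
// against Supabase. Returning null means the function rejects the request
// immediately. The function shape is illustrative.
function bearerToken(headers: Record<string, string>): string | null {
  const auth = headers["authorization"] ?? headers["Authorization"];
  if (!auth || !auth.startsWith("Bearer ")) return null;
  const token = auth.slice("Bearer ".length).trim();
  return token.length > 0 ? token : null;
}
```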

PostgreSQL + Row-Level Security

Postgres is the single source of truth. RLS policies enforce per-user access at the database layer, so the frontend can query directly without a custom API server in front.

Sensitive operations (incrementing minutes, claiming rewards, processing wagers) are wrapped in atomic SQL functions (atomic_increment_minutes, redeem_reward_atomic) to prevent race conditions and double-spends.

Storage & Auth

  • Supabase Storage for profile selfies, gift card images, and challenge proof screenshots (all behind RLS)
  • Supabase Auth with Google, Apple, and email/password — JWT sessions shared across the app

3️⃣ Real-Time Layer: Why We Chose Polling Over WebSockets

This is the most counter-intuitive decision in our stack. We originally built matchmaking on Supabase Realtime (WebSockets), but at scale we hit reliability issues: silent disconnects, missed events, ghost matches. Short-polling an edge function instead gives us stateless, retryable requests whose worst failure mode is one wasted poll, not a silently dead connection.
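The polling loop itself is almost trivially simple, which is the point. A minimal sketch, where `check` stands in for a call to the matchmaking edge function (the signature is an assumption, not the real API):

```typescript
// Poll until `check` returns a non-null result (e.g. a partner row) or we run
// out of attempts. Every iteration is an independent, stateless request.
async function pollUntil<T>(
  check: () => Promise<T | null>,
  intervalMs: number,
  maxAttempts: number
): Promise<T | null> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = await check(); // "has a match appeared yet?"
    if (result !== null) return result;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return null; // caller can re-queue or surface an error
}
```

A dropped response here costs one interval of latency; a dropped WebSocket event cost us a ghost match.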

4️⃣ Video: Pure P2P with WebRTC

Once two users are matched, the backend steps out of the way entirely.

  • WebRTC offer/answer exchanged through a single Edge Function (videocall-match)
  • ICE candidates trickled through the same channel
  • Mandatory DTLS-SRTP encryption: neither C24 Club nor any ISP can decrypt the stream
  • Strict role enforcement: in random match the queue-joiner is always the Offerer and the queue-finder is always the Answerer (this single rule eliminated the vast majority of our connection failures)
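The role rule is deterministic enough to fit in one function. A sketch of the idea (the function shape is an illustration, not the actual matchmaking code): the atomic queue lookup either returns a waiting partner or it doesn't, and that single bit decides who offers.

```typescript
type RtcRole = "offerer" | "answerer";

// If the queue lookup found a partner already waiting, we are the
// queue-finder and must answer; if it found nobody, we joined the queue
// ourselves and will create the offer once a finder arrives.
function assignRole(foundWaitingPartner: boolean): RtcRole {
  return foundWaitingPartner ? "answerer" : "offerer";
}
```

Because both peers derive their role from the same database fact, the glare condition (both sides offering at once) can't happen.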

How a Match Happens (End-to-End)

Here is what actually happens when a user clicks START:

1️⃣ Pre-flight (client)

  • Selfie gate verifies the user has uploaded a Discover photo
  • Camera permission is requested
  • Local NSFW model and black-screen detector spin up

2️⃣ Queue join (edge)

  • videocall-match inserts the user into the matchmaking queue with their gender and preferences
  • Atomic DB function returns the partner row if a complementary peer is already waiting

3️⃣ Signaling (edge)

  • Both clients receive each other's user IDs
  • Offerer creates SDP offer → POSTed to Edge Function → delivered to Answerer via short-poll
  • Answer + ICE candidates flow back through the same channel

4️⃣ P2P established

  • WebRTC connection upgrades to direct browser-to-browser
  • Backend has zero involvement in audio/video from this point
  • Local moderation models begin scanning the remote stream

5️⃣ Earning loop

  • Every full minute, client calls atomic_increment_minutes RPC
  • Server-side function validates session integrity and awards minutes (with female bonus rates, VIP multipliers, Power Hour boosts, and freeze-state checks all applied atomically)
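The bonus stack in step 5 can be sketched as a pure function. The rates, names, and multiplier order below are illustrative assumptions; the real values live inside the atomic_increment_minutes SQL function:

```typescript
// Per-minute award with the modifiers the text describes. All applied in one
// place so the server-side function stays the single authority on value.
interface EarningContext {
  frozen: boolean;         // freeze-state check: frozen sessions earn nothing
  femaleBonusRate: number; // e.g. 1.5 for a 50% bonus, 1 otherwise
  vipMultiplier: number;   // e.g. 2 for VIP, 1 otherwise
  powerHourBoost: number;  // e.g. 1.25 during Power Hour, 1 otherwise
}

function minutesAwarded(baseMinutes: number, ctx: EarningContext): number {
  if (ctx.frozen) return 0;
  return baseMinutes * ctx.femaleBonusRate * ctx.vipMultiplier * ctx.powerHourBoost;
}
```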

6️⃣ Disconnect & re-queue

  • User clicks NEXT → P2P torn down → user re-enters queue in <500 ms
  • Skip-penalty rule deducts 2 minutes if a non-VIP user skips inside 5 seconds (prevents farming).
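The skip-penalty rule from step 6, expressed as a pure function. The 2-minute penalty and 5-second window come from the text; the function itself is illustrative:

```typescript
// Non-VIP users who skip within the first 5 seconds of a call lose 2 reward
// minutes, which makes rapid next-next-next farming net-negative.
function skipPenaltyMinutes(isVip: boolean, callDurationSec: number): number {
  return !isVip && callDurationSec < 5 ? 2 : 0;
}
```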

The Rewards Economy

This is the part most random-chat clones don't have, and it's where the architecture gets interesting.

  • Calling minutes vs. gifted minutes: decoupled balances, so receiving a gift can't be cashed out without earning your own minutes too
  • Gift checkout via Stripe: opens in a new tab so the live call stays connected
  • Atomic redemption: redeem_reward_atomic deducts minutes and creates the redemption row in a single transaction; out-of-stock items trigger automatic refunds
  • Anchor program: female users earn cash payouts on top of minutes, settled via PayPal through anchor-earning and cashout-minutes
  • Lucky Spin & Wager micro-games: funded by a configurable daily cap, with prize weighting handled by a Postgres function for verifiable fairness.
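The contract redeem_reward_atomic enforces can be modeled in a few lines. This is a pure in-memory illustration of the transaction's all-or-nothing invariant, not the SQL itself:

```typescript
// Either the balance drops AND the redemption is recorded, or neither happens.
// There is no state where minutes were deducted but no redemption exists.
interface RedeemResult {
  minutesAfter: number;
  redeemed: boolean; // false: nothing changed (insufficient balance or out of stock)
}

function redeemAtomic(minutes: number, cost: number, inStock: boolean): RedeemResult {
  if (!inStock || minutes < cost) {
    return { minutesAfter: minutes, redeemed: false }; // no partial state
  }
  return { minutesAfter: minutes - cost, redeemed: true };
}
```

In Postgres the same guarantee comes for free from running both steps inside one function call, i.e. one transaction.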

Cost Optimization at Edge Scale

Running serverless can get expensive fast if you're not careful. Two architectural choices keep our cloud bill flat as we grow:

  • Polling intervals are tuned per surface, not globally: DM badges every 10 s is fine; matchmaking needs 1 s
  • Edge Functions for hot writes only: read-heavy queries (Discover feed, leaderboard) hit Postgres directly with RLS, bypassing the function layer entirely
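Per-surface tuning is just a small config map. The 1 s and 10 s figures come from the text above; the fallback value and function shape are assumptions for illustration:

```typescript
// One place to reason about total request volume: sum of (users on surface /
// interval) across surfaces. Unknown surfaces fall back to a slow default.
const POLL_INTERVAL_MS: Record<string, number> = {
  matchmaking: 1_000, // hot path: the user is staring at a spinner
  dmBadges: 10_000,   // background freshness is plenty
};

function pollIntervalMs(surface: string, fallbackMs = 30_000): number {
  return POLL_INTERVAL_MS[surface] ?? fallbackMs;
}
```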

Privacy & Safety by Design

  • No video storage. Ever. Streams are P2P-encrypted; we couldn't store them if we tried
  • Client-side moderation: NSFWJS runs in the user's browser, so only flags (not frames) reach the server
  • IP-based ban enforcement at the database level: bans take effect on the next request
  • CSAE compliance: a dedicated reporting flow and automatic underage ban with an appeal path

Final Thoughts

C24 Club's architecture is the product of a deliberate trade-off: boring, reliable infrastructure for the parts that matter (money, matchmaking, moderation) combined with bleeding-edge browser tech for the parts users feel (P2P video, instant matching, on-device AI).

If you're building a real-time platform with money attached (gifts, payouts, in-app economies), the lessons we'd repeat are:

  • Use atomic database functions for every value-changing path
  • Pick polling over realtime sockets when reliability beats latency
  • Never put your backend in the media path
  • Run moderation on the client and trust the database, not the network

👉 Try the platform: c24club

💬 Questions about the stack? Drop them in the comments; we love nerding out about edge-native architecture.
