Erlang, Vue, and client-side encryption — by design, not as an afterthought.
A close friend of mine works as an AIDS counsellor.
One day she told me something I couldn't shake:
"My clients want companionship. They want marriage. They want normalcy.
But they're terrified of being exposed."
She'd seen it over and over. People who were HIV-positive, stable on treatment, undetectable — but isolated. Dating apps didn't feel safe. Disclosure was complicated. A screenshot could cost someone their job. A database breach could shatter a life.
Then she asked me:
"Can you build something for them? But it has to be completely private."
Not "secure enough." Not "we encrypt passwords."
Completely private.
That's where this started — and where HIVPositiveMatches.com came from.
Why Traditional Dating Apps Are Dangerous for This Community
Most dating platforms are built on a silent assumption: the server is a trusted party.
- The server stores your profile
- The server reads your profile
- The server decides who sees your profile
- The server can be breached, subpoenaed, or misused
For most people, that's an acceptable tradeoff. For HIV-positive communities, it isn't. Disclosure can affect employment, housing, family relationships, and mental health. The fear isn't paranoia — it's grounded in real consequences that real people have faced.
Privacy here isn't a feature. It's psychological safety.
So I flipped the premise entirely:
What if the server never sees plaintext user data at all?
Not at rest. Not in logs. Not in admin dashboards. Not even temporarily.
That question led to a zero-knowledge architecture.
The Core Principle: The Server as a Blind Courier
Instead of a traditional CRUD backend that owns your data, the server becomes:
- A cryptographic relay
- A coordination layer
- A match facilitator
- Not a data owner
Everything meaningful happens on the client. The server passes sealed envelopes. It never opens them.
The Stack
| Layer | Technology |
|---|---|
| Backend | Erlang + Yaws |
| Frontend | Vue 3 + Naive UI |
| Auth | Google OAuth |
| Cryptography | TweetNaCl (client-side) |
| Key derivation | Handle + PIN (deterministic) |
| Encryption | Hybrid symmetric + asymmetric |
Each choice was made with one question in mind: does this require the server to see sensitive data? If yes, we found another way.
Step 1: Authentication vs. Identity
Google OAuth handles login and identity verification — nothing more.
Once authenticated:
- The email is encrypted client-side
- Only a hash of the encrypted email is stored for lookup
- The server sees: OAuth provider, user role, membership tier
That's it. No name. No email in plaintext. No profile data of any kind.
If the database leaks tomorrow, the attacker gets a list of meaningless hashes.
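The lookup mechanism can be sketched in a few lines. This is an illustrative stand-in, not the production code: the function name and the choice of SHA-256 are assumptions, but the idea is the same — the server indexes an opaque hash of an already-encrypted value.

```python
# Toy sketch: the server stores only a hash of the (already encrypted)
# email, usable as an opaque, deterministic lookup key.
import hashlib

def lookup_key(encrypted_email: bytes) -> str:
    """Derive a non-reversible lookup key from the encrypted email blob."""
    return hashlib.sha256(encrypted_email).hexdigest()

blob = b"...ciphertext of the user's email..."
print(lookup_key(blob))  # 64 hex chars; reveals nothing about the address
```

The same input always produces the same key, so the server can match a returning user, but a leaked table of these hashes is just noise.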
Step 2: Profiles Are Encrypted Before They Leave the Browser
When a user builds their profile, this is what happens in the browser:
- A symmetric encryption key is generated locally
- The entire profile is encrypted with that key
- Only the encrypted blob — and the wrapped key — are sent to the server
What the database actually stores:
- Encrypted profile vault
- Encrypted card preview
- Wrapped encryption keys
- Public keys
- Search-safe hashed fields
The server never holds readable health data, lifestyle information, or personal details. Not because we deleted it — because we never had it.
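A minimal sketch of what such a record looks like. The field names are hypothetical, and the toy "encryption" below is XOR with a hash-derived keystream — a stand-in for TweetNaCl's secretbox so the example runs on the standard library alone. Never use it for real data.

```python
# Illustrative shape of the record the server stores after the browser
# has done its work. Toy cipher only -- not real cryptography.
import hashlib, secrets, json

def keystream(key: bytes, n: int) -> bytes:
    """Expand a key into n pseudo-random bytes (toy construction)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

profile = json.dumps({"about": "...", "interests": ["..."]}).encode()
vault_key = secrets.token_bytes(32)  # generated in the browser, never uploaded raw
record = {
    "vault": toy_encrypt(vault_key, profile).hex(),   # encrypted profile vault
    "lookup": hashlib.sha256(b"handle").hexdigest(),  # search-safe hashed field
}
# The server can store and index `record`, but cannot read the profile.
```

Everything the server touches is either ciphertext or a one-way hash; decryption requires the vault key, which stays on the client.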
Step 3: No Passwords — Deterministic Key Derivation
Traditional passwords are a liability. Reset flows, storage, hashing — every step is a potential leak.
Instead, users choose a handle and a PIN. Together they are the inputs to a key-derivation function that runs entirely in the browser. The PIN is never transmitted and never stored, so the server holds nothing that could unlock a vault, even under a court order.
The result: a login flow that feels simple, backed by cryptographic guarantees that don't depend on us doing the right thing.
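The derivation step can be sketched with a standard KDF. PBKDF2 from Python's standard library stands in here for whatever KDF the app actually uses; the salt prefix and iteration count are assumptions for illustration.

```python
# Minimal sketch of deterministic key derivation from handle + PIN.
# Runs client-side in the real system; only derived material ever
# leaves the device.
import hashlib

def derive_vault_key(handle: str, pin: str) -> bytes:
    """Deterministically derive a 32-byte key from handle and PIN."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        pin.encode(),                   # the secret
        ("hivpm:" + handle).encode(),   # app-specific salt prefix (assumed)
        200_000,                        # slows offline guessing of short PINs
    )

key = derive_vault_key("river42", "8421")
assert derive_vault_key("river42", "8421") == key  # same inputs, same key
assert derive_vault_key("river42", "8422") != key  # different PIN, different key
```

Because the derivation is deterministic, there is nothing to reset and nothing to store: the same handle and PIN always reproduce the same key, on any device.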
Step 4: Public/Private Keys for Matching
Each user generates a key pair in the browser:
- Public key → sent to the server
- Private key → encrypted locally using the PIN, never leaves the device in plaintext
When two users match, vault keys are exchanged through a controlled wrapping flow:
- User A's vault key is wrapped with their public key
- The server re-wraps it with User B's public key
- User B's frontend unwraps it locally
Think of it like a courier passing a sealed envelope to a new recipient — resealing it without ever reading what's inside.
At no point does the server retain readable profile data from either user.
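A toy model makes the courier analogy concrete. XOR is used here purely so the algebra is visible; the real system would use public-key wrapping (e.g. NaCl box), and this exact re-wrap mechanism is an assumption for illustration, not the production protocol.

```python
# Toy re-wrapping flow: the server transforms a wrapped key from one
# recipient to another without ever learning the key itself.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

vault_key = secrets.token_bytes(32)  # User A's vault key (client-side only)
k_a = secrets.token_bytes(32)        # stands in for A's wrapping key
k_b = secrets.token_bytes(32)        # stands in for B's wrapping key

wrapped_for_a = xor(vault_key, k_a)        # A's client wraps the vault key
rekey = xor(k_a, k_b)                      # re-wrap token; contains no plaintext
wrapped_for_b = xor(wrapped_for_a, rekey)  # server re-wraps blindly
assert xor(wrapped_for_b, k_b) == vault_key  # B's client unwraps locally
```

The server only ever handles `wrapped_for_a` and `rekey` — neither reveals the vault key on its own, which is the envelope-resealing property the flow above describes.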
Step 5: Erlang for Reliability at Scale
Erlang wasn't chosen for novelty. It was chosen because this system demands:
- Massive concurrency — many users, many cryptographic operations
- Fault tolerance — processes fail safely and independently
- Predictable performance — no surprises under load
Yaws serves dynamic HTML with embedded session markers that Vue hydrates on the frontend. The result is a modern UI backed by a backend that is minimal, resilient, and has been battle-tested for decades in telecoms.
Step 6: AI Matching Without Exposing Interests
Compatibility can't be computed server-side if the server can't read profiles. So we moved it to the browser.
Each client:
- Generates embeddings locally from profile content
- Normalizes and binarizes them
- Sends only non-reversible binary hashes to the server
The server compares hashes using Hamming distance. It never sees raw interests, lifestyle traits, or "about me" text. AI-powered matching — without the AI ever knowing who it's matching.
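The comparison itself is simple enough to sketch in a few lines. The hash values and names below are made up for illustration; the point is that ranking candidates needs nothing but XOR and a bit count.

```python
# Sketch of the server-side comparison: each client uploads a binary
# hash of its locally computed embedding; the server ranks candidates
# by Hamming distance without ever seeing the underlying interests.
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary hashes."""
    return bin(a ^ b).count("1")

alice = 0b1011_0110_1100_0011  # binarized embedding, computed in Alice's browser
bob   = 0b1011_0100_1100_1011
carol = 0b0100_1001_0011_1100

candidates = {"bob": bob, "carol": carol}
best = min(candidates, key=lambda name: hamming(alice, candidates[name]))
print(best)  # bob: fewer differing bits means a closer match
```

Lower distance means more shared bits, which the client-side embedding step arranged to correlate with compatibility — so the server can rank matches it cannot interpret.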
The next article in this series goes deep on how that works.
Threat Model: What Happens When Everything Goes Wrong?
| Scenario | What the attacker gets |
|---|---|
| Database leak | Encrypted blobs. Unreadable. |
| Rogue admin | Nothing. No plaintext to access. |
| Server compromise | No sensitive data to steal. |
| Network interception | Ciphertext only (TLS on the wire, client-side encryption underneath). |
| Legal pressure / subpoena | Server literally cannot decrypt anything. |
This isn't security through obscurity. The server cannot betray users — not because we promise it won't, but because it architecturally cannot.
That distinction matters. Promises can be broken. Math can't.
What Building This Taught Me
Most systems are designed around: "How do we store and use data?"
This one is built around: "How do we avoid ever possessing data?"
That shift changes everything — UX, backend design, matching logic, error handling, recovery flows. It forces a kind of humility: assume the server is a liability, not a guardian. Design accordingly.
It also changes what it means to be accountable. I cannot read my users' profiles. That's not a limitation — it's the point.
What's Coming Next
This post is the foundation. The series continues with:
- Part 2: Zero-knowledge filtering using 32-bit bitmasks (next)
- Part 3: AI embeddings + Hamming distance matching
- Part 4: Key-wrapping flows in practice
- Part 5: Account recovery without exposing secrets
- Part 6: UX patterns for client-side encryption
This project started because a counsellor wanted her clients to feel safe finding love.
It became one of the most technically challenging — and meaningful — systems I've ever built.
For sensitive communities, privacy-first design isn't optional. It's the only ethical choice.