Every year, our team visits engineering colleges across India to hire freshers. The first round is always an online coding test — 300+ students, one shot at finding the ones who can actually think.
We tried Coderbyte. A fifty-concurrent-user cap. So we'd split students into batches, stagger timings, and juggle schedules between college coordinators and our engineers.
We tried HackerRank's community edition. Different tool, different headache.
Every vendor had a ceiling — concurrency limits, inflexible problem formats, generic DSA questions that tested memorization over problem-solving. And the pricing? Designed for companies ten times our size.
I was ranting about this to my engineering team. Out loud. In our standup. Trying to find yet another vendor to evaluate.
My engineers — most of them freshers themselves just a couple years ago — went quiet. Said nothing for a few days.
Then they shipped a product. Two engineers. One weekend. AI-assisted development. And two days of intensive testing before it went live.
## What They Built
A full-stack, self-hosted coding exam platform. Not a toy. Not a prototype. A production system we ran 300+ students through this hiring season.
Here's what's under the hood:
**Monaco Editor** — the same engine that powers VS Code. Syntax highlighting, autocomplete, multi-language support. Students write real code, not paste answers into a textarea.
**Judge0 Sandboxed Execution** — every submission runs inside a sandboxed Judge0 instance. Test cases execute in parallel with automatic batching. Students get instant, per-test-case verdicts.
**ICPC-Style Scoring** — not just pass/fail. Penalty points for wrong attempts. Time-based ranking. Race-condition-safe writes to the database. The leaderboard feels like a competitive programming contest, not a homework checker.
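The post doesn't spell out the exact formula, but standard ICPC rules rank by problems solved, with ties broken by total penalty time. A minimal sketch (the interface and names here are hypothetical, not the platform's actual code):

```typescript
// ICPC-style scoring sketch: rank by problems solved, tie-break by penalty.
// Only solved problems accrue penalty; wrong attempts on unsolved problems are free.

interface ProblemResult {
  solved: boolean;
  acceptedAtMinutes: number; // minutes from exam start to the accepted submission
  wrongAttempts: number;     // wrong submissions before the accepted one
}

const PENALTY_PER_WRONG_ATTEMPT = 20; // the classic ICPC value

function penaltyFor(result: ProblemResult): number {
  if (!result.solved) return 0; // unsolved problems contribute no penalty
  return result.acceptedAtMinutes + result.wrongAttempts * PENALTY_PER_WRONG_ATTEMPT;
}

function totalScore(results: ProblemResult[]): { solved: number; penalty: number } {
  return {
    solved: results.filter(r => r.solved).length,
    penalty: results.reduce((sum, r) => sum + penaltyFor(r), 0),
  };
}
```

Lower penalty wins among students with the same solve count, which is what makes the leaderboard feel like a contest rather than a checklist.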
**Live Leaderboard** — backed by a PostgreSQL materialized view that refreshes after every accepted submission. O(1) rank queries. Students watch themselves climb in real-time.
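The post doesn't show the schema, but the general pattern in PostgreSQL looks like this (table and column names are hypothetical):

```sql
-- Sketch: precompute ranks so reads become simple indexed lookups.
CREATE MATERIALIZED VIEW leaderboard AS
SELECT
  student_id,
  solved_count,
  penalty_minutes,
  RANK() OVER (ORDER BY solved_count DESC, penalty_minutes ASC) AS rank
FROM scores;

-- CONCURRENTLY lets readers keep querying during a refresh,
-- at the cost of requiring a unique index on the view.
CREATE UNIQUE INDEX ON leaderboard (student_id);
REFRESH MATERIALIZED VIEW CONCURRENTLY leaderboard;
```

Refreshing after each accepted submission trades a little write-time work for rank reads that never have to compute a window function on the hot path.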
**API-Based Challenges** — beyond traditional stdin/stdout problems, we built support for API-format challenges where students interact with real endpoints. This lets us test how candidates think about integration, not just algorithms.
**Server-Synced Timer** — the countdown runs on server time, not the client clock. No inspect-element tricks. Configurable start/end windows with server-enforced access guards.
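One common way to pin a client countdown to server time is to measure the clock offset once and derive everything from it. A sketch under that assumption (function names are mine, not the project's):

```typescript
// Server-synced countdown sketch: measure the server/client clock offset once,
// then compute remaining time from the offset instead of trusting Date.now() alone.

function clockOffsetMs(serverNowMs: number, clientNowMs: number): number {
  return serverNowMs - clientNowMs;
}

function remainingMs(examEndMs: number, offsetMs: number, clientNowMs: number): number {
  const estimatedServerNow = clientNowMs + offsetMs; // local clock corrected by offset
  return Math.max(0, examEndMs - estimatedServerNow); // clamp: never show negative time
}
```

The display is only a convenience, though; the real enforcement is the server-side access guard that rejects submissions outside the configured window.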
**Autosave** — code drafts are debounce-saved to the server every few seconds. Browser crash? Tab closed? The student picks up right where they left off.
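The debounce pattern behind autosave collapses a burst of keystrokes into a single save request once typing pauses. A generic sketch (the API call in the usage comment is hypothetical):

```typescript
// Debounce sketch: each call cancels the pending timer and restarts it,
// so the wrapped function fires only after `waitMs` of quiet.

function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  waitMs: number,
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    if (timer !== undefined) clearTimeout(timer); // drop the pending save
    timer = setTimeout(() => fn(...args), waitMs); // reschedule from the latest keystroke
  };
}

// e.g. const saveDraft = debounce(code => api.post("/drafts", { code }), 2000);
```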
**White-Label Ready** — app name, logo, brand colors, copyright — all configurable via environment variables. Zero code changes. We use it as our own branded platform; anyone can make it theirs.
## Architecture at a Glance
The platform is a monorepo with two core applications:
- `client/` → Vue 3 SPA (student exam UI + admin panel)
- `server/` → NestJS REST API (auth, exam logic, code execution, scoring)
In production, the server compiles and serves the client's static build directly — no separate web server or CDN needed.
The submission flow works like this:
- Student writes code in the Monaco editor and hits Submit
- The Vue client POSTs to the API with the code and language
- The SubmissionsService fetches all test cases and sends batch requests to Judge0, automatically chunking to stay within limits
- The server polls Judge0 tokens until all results resolve
- The ScoringService applies the ICPC penalty formula and updates the score using a pessimistic database lock
- The LeaderboardService refreshes the materialized view
- Results return to the client with per-test-case verdicts and an updated leaderboard
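The chunking step in the flow above can be sketched generically. The batch-size cap and the usage comment are assumptions, not the project's actual config (self-hosted Judge0 instances enforce a configurable maximum batch size):

```typescript
// Split test cases into fixed-size batches so each Judge0 request
// stays within the instance's submission-batch limit.

function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// e.g. one POST per batch, then poll the returned tokens until all resolve:
// for (const batch of chunk(testCases, BATCH_SIZE)) await judge0.submitBatch(batch);
```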
All of this happens in seconds, even under load.
## The Tech Stack
**Frontend:** Vue 3 (Composition API), Vite 8, TypeScript 5.9, Pinia 3 for state, Monaco Editor 0.55, Brotli compression

**Backend:** NestJS 11, TypeScript 5.7, TypeORM 0.3, Passport JWT, Swagger/OpenAPI docs, rate limiting via @nestjs/throttler

**Database:** PostgreSQL 17 for the application, PostgreSQL 16 + Redis 7.2 for Judge0's internal queue
**Infrastructure:** Docker Compose orchestrates six services — app, app-db, judge0-server, judge0-worker, judge0-db, and judge0-redis. Multi-stage Dockerfile produces a minimal Node 22-alpine image running as a non-root user.
## Features That Matter
Here's what we built because we needed it, not because a product manager spec'd it:
- Multiple concurrent exams — run several exams at once; students pick which to enter
- Mixed formats — MCQs alongside coding problems in the same exam
- Admin panel — create exams, duplicate them, manage problems with visible/hidden test cases, configure weights
- Safe Exam Browser detection — a composable detects whether students are in a locked-down browser
- Built-in API docs — interactive API reference baked right into the student UI for API-format challenges
- QA role opt-in — students can flag interest in QA engineering during registration
- Run mode — execute code against sample inputs without scoring; lets students experiment before committing
## Why Open Source?
We're a fintech startup. Thirty-odd people. We didn't build this to sell it.
We built it because we were tired of bending our hiring process around someone else's product limitations. And once we had it, we realized every small company visiting colleges faces the exact same problem.
Here's the thing that makes this story worth telling: two engineers built this in a weekend, with AI doing the heavy lifting on scaffolding, boilerplate, and iteration. Then two days of intensive testing to harden it for production. That's the power of AI-assisted development — it doesn't replace engineers, it turns two of them into ten.
In the AI era, expensive hiring software shouldn't be a gate that keeps small teams from finding great talent. If two engineers with AI tools can build a platform that handles 300+ concurrent students with ICPC scoring and sandboxed execution in a weekend, there's no reason that capability should be locked behind enterprise pricing.
The whole thing is AGPL-3.0 licensed. Fork it, brand it, run it on your own infrastructure — just keep your modifications open too.
## Getting Started
The fastest path is Docker Compose:
```shell
cp .env.example .env
# Set DB_PASSWORD, JWT_SECRET, ADMIN_SETUP_KEY
docker compose up --build
```
Six services start in dependency order. The app waits for the database health check, runs migrations automatically, and you're live.
For local development without Docker, you'll need PostgreSQL 17 and a Judge0 instance. The README walks through every step — database creation, migrations, environment variables, and running the frontend and backend separately.
## What's Next
We're cleaning up a few things before the public launch:
- Finishing the test suite (Jest is installed and configured, specs are being added)
- Polishing the contributor docs
- Adding a demo mode so people can try it without setting up Judge0
If you're interested, follow me here — I'll drop the GitHub link as soon as the repo goes public.
## The Bigger Lesson
I went looking for a vendor. My team handed me a product.
Two engineers. One weekend of building. Two days of intensive testing. Powered by AI-assisted development. A platform that replaced two commercial tools and produced measurably better candidate quality in round two.
That's what happens when you hire people for intent over resumes — and then get out of their way.
Built with Vue 3, NestJS, PostgreSQL, Judge0, and a healthy disregard for vendor lock-in.
Star the repo when it drops. Or better yet — fork it and run your own hiring season on it.

