SRI VILLIAM SAI
How I Built an Auto-Generating Resume System with Node.js Microservices and Kafka

Have you ever forgotten to update your resume after completing a course or finishing an internship? I have, many times. So I built a system that does it automatically.

Resume Ecosystem is an open source project that auto-generates a verified, living resume from your real achievements. Every internship, course, hackathon, or project you log gets streamed through Kafka, verified, scored, and reflected in your resume without you touching a PDF.

GitHub: https://github.com/srivilliamsai/resume-ecosystem-node

The Problem With Static Resumes

Most developers maintain a Word document or Canva template they update once every few months. The result is always outdated, always missing something, and always a pain to format.

I wanted a system where your resume updates itself as you grow. Log an achievement, get it verified, watch your resume rebuild automatically.

Architecture Overview

The system is built as 8 microservices communicating through Kafka topics:

API Gateway handles all incoming requests and JWT verification on port 4000

Auth Service manages registration, login, and token issuance on port 4010

Activity Service handles CRUD for achievements with Jaccard similarity deduplication on port 4020

Verification Service verifies achievement hashes with an LRU cache on port 4030

Resume Service rebuilds, ranks, and versions your resume on port 4040

Integration Service ingests webhooks from external platforms on port 4050

Notification Service fans out resume events via email and WebSocket on port 4060

File Service renders the final PDF resume using pdfkit on port 4070
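The Jaccard deduplication in the Activity Service can be sketched as a set-overlap check on activity titles. This is a minimal illustration; the function names and the 0.8 threshold are my assumptions, not the project's actual code.

```typescript
// Hypothetical sketch of Jaccard-based deduplication for activities.
// Tokenize a title into a set of lowercase words.
function tokens(title: string): Set<string> {
  return new Set(title.toLowerCase().split(/\W+/).filter(Boolean));
}

// Jaccard similarity: |A ∩ B| / |A ∪ B|.
function jaccard(a: string, b: string): number {
  const ta = tokens(a);
  const tb = tokens(b);
  const intersection = [...ta].filter((t) => tb.has(t)).length;
  const union = new Set([...ta, ...tb]).size;
  return union === 0 ? 0 : intersection / union;
}

// Treat two activities as duplicates above a chosen threshold.
function isDuplicate(a: string, b: string, threshold = 0.8): boolean {
  return jaccard(a, b) >= threshold;
}
```

The appeal of Jaccard here is that it catches near-identical titles ("AWS Cloud Practitioner" vs "AWS Cloud Practitioner Certificate") without any ML dependency.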

The event flow looks like this:

User logs activity → activity.created fires on Kafka → Verification service picks it up → activity.verified fires → Resume service rebuilds → resume.version.published fires → Notification service alerts the user
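That chain can be modeled as a sequence of topic hops. The topic names below come from the article; the handler shapes are a deliberate in-memory simplification (the real services consume and produce these events through KafkaJS):

```typescript
// Simplified model of the event chain; this only shows which topic
// triggers which step, not the actual Kafka plumbing.
type PipelineEvent = { topic: string; payload: Record<string, unknown> };

function verify(e: PipelineEvent): PipelineEvent {
  // Verification Service consumes activity.created, emits activity.verified.
  return { topic: "activity.verified", payload: { ...e.payload, verified: true } };
}

function rebuildResume(e: PipelineEvent): PipelineEvent {
  // Resume Service consumes activity.verified, emits resume.version.published.
  return { topic: "resume.version.published", payload: { ...e.payload, version: 1 } };
}

// Walk one activity through the pipeline.
const created: PipelineEvent = { topic: "activity.created", payload: { title: "Internship" } };
const published = rebuildResume(verify(created));
```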

Key Technical Decisions

I chose Kafka over direct REST calls between services because it keeps everything decoupled. The resume service does not care who verified an activity or when. It just listens for the verified event and rebuilds.

I used Fastify instead of Express because of its schema-based validation and significantly better performance in benchmarks. Every route has a defined request and response schema.
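In Fastify, a route schema is plain JSON Schema attached to the route, and the framework rejects invalid requests before the handler runs. Here is a hypothetical schema for a create-activity route (the field names are my assumptions):

```typescript
// Hypothetical JSON Schema for a create-activity route; Fastify
// validates the request body against this before the handler runs.
const createActivitySchema = {
  body: {
    type: "object",
    required: ["title", "type"],
    properties: {
      title: { type: "string", minLength: 3 },
      type: { type: "string", enum: ["internship", "course", "hackathon", "project"] },
      certificateUrl: { type: "string", format: "uri" },
    },
  },
  response: {
    201: {
      type: "object",
      properties: { id: { type: "string" }, status: { type: "string" } },
    },
  },
};

// In a Fastify service this would be registered roughly as:
// app.post("/activities", { schema: createActivitySchema }, handler);
```

A nice side effect of response schemas is that Fastify uses them for fast serialization, so they pay off in throughput as well as safety.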

Prisma was the right ORM choice here because the monorepo has multiple services each with their own schema but sharing the same PostgreSQL instance. Prisma handles the per-service client generation cleanly.

Redis handles two things: caching verification lookups in the verification service and rate limiting at the API gateway level.
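The gateway-level rate limiting maps naturally onto Redis INCR with an expiry. Here is an in-memory sketch of that fixed-window logic; the Redis version swaps the Map for INCR and EXPIRE on a per-user key:

```typescript
// Fixed-window rate limiter sketch; in Redis this is INCR + EXPIRE
// on a key like `rl:${userId}:${windowStart}`.
const windows = new Map<string, { count: number; resetAt: number }>();

function allowRequest(userId: string, limit = 100, windowMs = 60_000): boolean {
  const now = Date.now();
  const w = windows.get(userId);
  if (!w || now >= w.resetAt) {
    // New window: the first request always passes.
    windows.set(userId, { count: 1, resetAt: now + windowMs });
    return true;
  }
  w.count += 1;
  return w.count <= limit;
}
```

Fixed windows allow short bursts at window boundaries; a sliding-window or token-bucket variant fixes that at the cost of a little more Redis state.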

How Verification Works

When you submit an activity with a certificate URL or credential hash, the verification service checks it against trusted issuers. Results are cached in an LRU cache so repeated lookups are instant. Once verified, the Kafka event fires and your resume rebuilds within seconds.
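An LRU cache like the one in the verification service can be built on a JavaScript Map, which preserves insertion order. This is a minimal sketch; the real service may well use a library such as lru-cache instead:

```typescript
// Minimal LRU cache sketch built on Map's insertion order.
class LRU<K, V> {
  private map = new Map<K, V>();
  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    // Re-insert to mark the entry as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    else if (this.map.size >= this.capacity) {
      // Evict the least recently used entry (first in insertion order).
      this.map.delete(this.map.keys().next().value as K);
    }
    this.map.set(key, value);
  }
}
```

Keyed by credential hash, this makes repeated verification lookups O(1) with bounded memory.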

Resume Scoring Algorithm

Activities are not treated equally. The resume service assigns impact scores based on activity type, issuer reputation, recency, and verification status. Internships at known companies score higher than self-reported projects. Verified credentials score higher than unverified ones. The final resume score is a weighted average that gives recruiters a quick signal.
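To make that concrete, here is a hypothetical version of such a scoring function. The weights, decay horizon, and factor names are my assumptions for illustration, not the project's actual formula:

```typescript
// Hypothetical impact-scoring sketch; weights are illustrative only.
interface Activity {
  type: "internship" | "course" | "hackathon" | "project";
  issuerReputation: number; // 0..1, higher for known companies/issuers
  monthsAgo: number;
  verified: boolean;
}

const TYPE_WEIGHT: Record<Activity["type"], number> = {
  internship: 1.0,
  hackathon: 0.8,
  course: 0.6,
  project: 0.5,
};

function impactScore(a: Activity): number {
  const recency = Math.max(0, 1 - a.monthsAgo / 24); // linear decay over two years
  const verification = a.verified ? 1 : 0.4; // unverified items are discounted
  return TYPE_WEIGHT[a.type] * (0.5 + 0.5 * a.issuerReputation) * recency * verification;
}

// Resume score: average impact across all activities.
function resumeScore(activities: Activity[]): number {
  if (activities.length === 0) return 0;
  return activities.reduce((sum, a) => sum + impactScore(a), 0) / activities.length;
}
```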

Tech Stack

Node.js 20 with TypeScript 5
Fastify for all HTTP services
KafkaJS for event streaming
Prisma ORM with PostgreSQL
Redis 7 for caching
React 18 with Vite and TailwindCSS
Zustand for frontend state
pdfkit for PDF generation
Docker Compose for local dev
Kubernetes manifests for deployment

What I Learned

Building event-driven systems is harder than it looks. The biggest challenge was not Kafka itself but handling failures gracefully. What happens if the verification service is down when an activity.created event fires? You need retry logic, dead letter queues, and idempotent consumers.
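Idempotency is the cheapest of those three defenses. The core idea fits in a few lines: remember which event IDs you have already processed and skip redeliveries. In production the processed-ID set would live in Postgres or Redis rather than in memory; this is only a sketch of the pattern:

```typescript
// Idempotent consumer sketch: track processed event IDs so a redelivered
// Kafka message does not trigger a second resume rebuild.
const processed = new Set<string>();
let rebuilds = 0;

function handleVerifiedEvent(eventId: string): void {
  if (processed.has(eventId)) return; // duplicate delivery, skip
  processed.add(eventId);
  rebuilds += 1; // stand-in for the actual resume rebuild
}

// Kafka's at-least-once delivery can replay the same event:
handleVerifiedEvent("evt-1");
handleVerifiedEvent("evt-1"); // ignored
handleVerifiedEvent("evt-2");
```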

I also learned that monorepo tooling with npm workspaces is surprisingly good. Sharing TypeScript configs and common utilities across 8 services without a build tool like Nx or Turborepo is very manageable for a project this size.

What's Next

OAuth integration with LinkedIn and GitHub to auto-import activities
Swagger UI at /docs for interactive API documentation
Full test coverage with Jest and Supertest
Puppeteer based HTML to PDF for better resume templates

Try It Yourself

Clone the repo and run the demo in 60 seconds:

git clone https://github.com/srivilliamsai/resume-ecosystem-node
cd resume-ecosystem-node
npm install
npm run docker:up
npm run db:push
npm run seed
npm run dev

Then open http://localhost:5173

If you find this useful or have architecture feedback, a star on GitHub goes a long way. Contributions and issues are very welcome!

https://github.com/srivilliamsai/resume-ecosystem-node
