
ryan pedram

I built a forensic ATS scanner in 96 hours using LLMs as my backend team. Here is the stack.

I’m 19. No team. No VC funding. Just me and a laptop.

Last week, reality hit me hard: I applied to 127 jobs and got 0 interviews. I know I'm a solid dev, so I figured something technical was broken.

I dug into the parsing logic of legacy ATS systems like Workday and Taleo. Turns out my "modern" resume was being read as total gibberish because I used columns.
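To see why columns break these parsers, here is a toy simulation (not Workday's or Taleo's actual logic, just the failure mode): a layout-unaware parser reads each visual row straight across, interleaving two logical columns into nonsense.

```javascript
// Toy model: a two-column resume laid out side by side.
// Each array element is one visual row of a column.
const leftColumn = ["EXPERIENCE", "Senior Dev, Acme", "Built the API"];
const rightColumn = ["SKILLS", "Node.js", "Postgres"];

// A naive row-major parser reads straight across each visual row,
// merging the two columns instead of reading them one at a time.
function naiveRowMajorParse(left, right) {
  const lines = [];
  for (let i = 0; i < Math.max(left.length, right.length); i++) {
    lines.push([left[i], right[i]].filter(Boolean).join(" "));
  }
  return lines.join("\n");
}

console.log(naiveRowMajorParse(leftColumn, rightColumn));
// "EXPERIENCE SKILLS"
// "Senior Dev, Acme Node.js"
// "Built the API Postgres"
```

The section headers collide and the skills list gets welded onto job titles, which is exactly the "gibberish" a keyword matcher then scores.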

I wanted to build a tool to fix this for everyone else. Usually building a full SaaS takes months. I gave myself 4 days.

Here is how I built InterviewGhost.us by acting as the Architect and using AI as my engineering team.

The Stack
The goal was speed without sacrificing technical rigor.

Frontend: Next.js + Tailwind. I iterated on the UI with Claude 3.5 Sonnet, using "Linear-style" design tokens.

Backend: Node.js + Puppeteer. Used this for the forensic PDF generation.

Logic: DeepSeek-V3. Used it to optimize the heavy text extraction logic.

Deployment: Vercel.

Database: None. I built a zero-retention architecture for privacy.
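The "no database" choice is simpler than it sounds: the uploaded resume only ever exists as a buffer inside one request's scope. A minimal sketch of the pattern (function name and report shape are my illustration, not the actual InterviewGhost code):

```javascript
// Zero-retention: the resume buffer exists only inside this function's
// scope. Nothing is written to disk or a database; once the derived
// report is returned, the buffer becomes garbage.
function handleScan(resumeBuffer) {
  // Stand-in for the real analysis pipeline.
  const report = {
    bytes: resumeBuffer.length,
    looksLikePdf: resumeBuffer.slice(0, 5).toString() === "%PDF-",
  };
  return report; // only derived data leaves this scope
}

const fakeUpload = Buffer.from("%PDF-1.7 ...rest of file...");
console.log(handleScan(fakeUpload)); // { bytes: 27, looksLikePdf: true }
```

No persistence layer also means no user data to secure, migrate, or leak.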

The Hardest Part: Forensic PDF Generation
Generating a PDF in Node.js usually sucks. You have to mess with pdf-lib or jsPDF and fight with CSS print rules.

I didn't want to write every CSS class by hand. So I fed the LLM a strict design system. I told it to create a forensic report template that looks like a McKinsey audit document mixed with a terminal log.

It generated a grid-based layout that dynamically visualizes the "parsing scramble" effect.
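The Puppeteer route sidesteps pdf-lib entirely: you render the report as plain HTML and let Chromium do the layout. A sketch of the pattern (the template and data shape are my assumptions; the `page.pdf()` call is standard Puppeteer API):

```javascript
// Build the forensic report as an HTML string; Chromium handles the
// print layout, so there is no hand-placing text with pdf-lib primitives.
function renderReportHtml({ score, scrambledText }) {
  return `<!doctype html>
<html><body style="font-family: monospace; padding: 40px;">
  <h1>Parse Score: ${score}/100</h1>
  <h2>Evidence Locker</h2>
  <pre style="background:#111;color:#0f0;padding:16px;">${scrambledText}</pre>
</body></html>`;
}

// With Puppeteer (sketch -- requires `npm i puppeteer`):
//
//   const browser = await puppeteer.launch();
//   const page = await browser.newPage();
//   await page.setContent(renderReportHtml(data));
//   const pdf = await page.pdf({ format: "A4", printBackground: true });
//   await browser.close();

console.log(renderReportHtml({ score: 42, scrambledText: "EXPERIENCE SKILLS" }));
```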

The first version actually looked "too clean" and didn't show the error.

So I forced the parser into "dumb mode," stripping out the layout data. That exposed exactly the scrambled text the robot sees.

The Result: A 3-page forensic audit that shows you exactly what the robot sees when it reads your resume.

Page 1: The Score Gauge.

Page 2: The "Evidence Locker" (Raw, scrambled text).

Page 3: The Fix Plan.

Why "No-Code" wasn't enough
I could have used Bubble or Framer. But I needed raw processing power.

I needed to rip apart a PDF file byte-by-byte to find hidden header layers that break older parsers.

You can't do that with a drag-and-drop builder. You need Node.
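For illustration, here is the kind of byte-level check a drag-and-drop builder can't express: scanning the raw PDF bytes for structural markers. The specific markers and heuristics below are my illustration of the technique, not the actual InterviewGhost checks.

```javascript
// Count occurrences of a raw byte pattern in a PDF buffer.
// Buffer.indexOf accepts a string pattern in Node.js.
function countMarker(buffer, marker) {
  let count = 0;
  let idx = buffer.indexOf(marker);
  while (idx !== -1) {
    count++;
    idx = buffer.indexOf(marker, idx + marker.length);
  }
  return count;
}

// Crude byte-level heuristics: image XObjects with no font objects
// suggest a scanned or flattened resume that older parsers read as empty.
function inspectPdfBytes(buffer) {
  return {
    isPdf: buffer.slice(0, 5).toString() === "%PDF-",
    fonts: countMarker(buffer, "/Font"),
    images: countMarker(buffer, "/Subtype /Image"),
  };
}

const sample = Buffer.from("%PDF-1.4 /Font /F1 /Font /F2 /Subtype /Image");
console.log(inspectPdfBytes(sample)); // { isPdf: true, fonts: 2, images: 1 }
```

A real check would parse the cross-reference table properly, but even this level of raw-byte access is off the table in a no-code builder.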

I used AI to write the boilerplate and handle the edge cases. This let me focus entirely on the parsing logic.

The "Ghost" Protocol
I call it InterviewGhost because it’s about reverse-engineering the silence.

Most candidates think they are being rejected by humans. They aren't. They are being archived by a regex script from 2012.

I built this tool to prove it.

The outcome:

Cost to build: $0 (time + existing API credits).

Time to launch: 4 days.

Value: It catches formatting errors that $50/month tools like Jobscan miss.

If you are a dev getting ghosted, stop tweaking your keywords. Check your parsing.

I put the tool up here: https://interviewghost.us

(P.S. I’m currently running this on a $10/day ad budget. Bootstrapping is alive and well.)
