Peter Grafe, CEO of BlueAlpha (small marketing shop, you wouldn't know them, that's the point), got 200 applications in two days for one role. 95% were disqualified before a human opened a single PDF. The 10 that survived had to vibe code something in five days. You, meanwhile, are polishing your LinkedIn opening line for the third time.
If you can't vibe code, you just became unemployable.
TLDR: The resume is cooked. Recruiters get 200 plausible applications in 48 hours, they stopped pretending to read them, and the filter moved somewhere else. You probably don't know where.
When anybody can generate a plausible application in five minutes, the resume stops being a signal. It becomes noise. And noise gets filtered without a read.
You didn't lose those interviews. You didn't even play.
What follows is what the 10 do differently.
200 Applications in Two Days. The 10 That Survived Were Vibe Coders.
Grafe published the math himself in a Sherwood News piece this month. 200 applications, 48 hours, 95% out before any reading happened. He's not bragging, he's tired. He says he stopped opening the PDFs because the AI-generated cover letters all sound like a LinkedIn HR consultant on his fourth coffee.
The 10 candidates who got past the filter didn't have a better CV. They had something the other 190 didn't. They could ship.
Grafe gave them a brief, five days, and a vague problem. The ones who survived produced a working prototype. The ones who didn't, didn't.
That math is the job market itself.
When Faking a Signal Becomes Free, the Signal Stops Working
A signal that costs nothing to fake stops being a signal. That's why you don't trust 5-star Amazon reviews, why LinkedIn skill badges go unverified, and why a certificate means nothing when the website hands it out for sitting through a 12-minute video.
Resumes were already a weak signal a decade ago. Hiring managers admitted as much. They kept reading them because no cheaper alternative existed. The cost of writing a CV was high enough to discourage random people, low enough that you got real candidates. That equilibrium held for fifty years.
AI didn't kill the resume. The resume was already wounded. AI just dropped the cost of producing one to zero, and the equilibrium snapped.
What AI fabricates for free, the market stops valuing.
The Resume Is Done. The Numbers Already Buried It.
TestGorilla surveyed 2,160 employers and candidates across the US and UK this year. 85% of employers now use skills-based hiring. That number was 81% last year and 73% the year before. The trend isn't subtle.
Same survey: 71% of employers say skills tests predict performance better than resumes. 86% of US hiring managers and 89% of UK hiring managers report problems with CVs. One in three recruiters admits they can't tell if the resume in front of them is accurate. They're not even pretending anymore.
Half the employers in the survey have dropped degree requirements. Two-thirds say their AI-cover-letter detector is busy. (It's not very good. It just runs all the time.)
So what do they trust? They trust what you can build in front of them with constraints, time pressure, and a brief. Grafe put it bluntly in the same Sherwood piece: "the bar has shifted from do you understand technology to can you produce something with it."
The resume was fakeable long before AI. AI just dropped the price to zero, and the filter cracked.
Vibe Coding Is the New Word. It Took 18 Months Instead of 10 Years.
Remember when knowing Word was a silent prerequisite for any office job? Nobody put "Microsoft Word, intermediate" on a resume because it was assumed. The shift took 10 years. From "secretaries use it" to "if you can't, the door is on your left."
Vibe coding is doing the same thing in 18 months. And no, this isn't just a tech industry move.
Harlem Capital, a venture fund, published their interview process. A senior associate candidate had a week to build an AI agent that automates industry research, then brief the partners with the output. Another candidate had to vibe code a portfolio dashboard. Their head of talent Nicole DeTommaso wrote it in plain English: "You are not told which tools to use or how to go about it. You are just expected to figure it out."
Crux Analytics, an analytics firm, embeds a practical AI project in every single hire, technical or not. CEO Jacob Bennett told Sherwood the test is about "where did they use AI, where didn't they, and why." The code itself isn't the point.
BlueAlpha, the marketing agency from the top of this article, makes commercial candidates fire up Claude Code during the interview itself.
Look at that list. A VC fund, an analytics firm, a marketing agency. None of them is hiring developers.
Word took 10 years because companies pushed the tool. Vibe coding takes 18 months because the interview is the push. You don't get hired and then learn it. You learn it before you walk in, or you don't walk in at all.
Call it an audition, because that's what it is. Vague brief, five days, someone watches what comes out.
The Cheating Trick You Spent a Year Learning Just Became the Test.
Resume Genius surveyed 1,000 active US job seekers this year. 22% admit using AI in real time during interviews. 78% use AI somewhere in the job hunt. One YouTube short selling the trick is an outlier hit for its channel, the kind of spike you don't get without real candidate appetite. The title sells the secret. The audience confirms it.
A whole industry sprang up to sell stealth AI overlays. Fake browser tabs, transparent windows, fancy keyboard shortcuts, the works. Candidates spent 18 months learning to hide AI from interviewers.
Then this happened.
Canva published a public engineering blog post titled "Yes, You Can Use AI in Our Interviews. In Fact, We Insist." They expect candidates to use Copilot, Cursor, and Claude during the technical round. Half their frontend and backend engineers are daily AI users, so the interview now matches the job. They evaluate when and how candidates lean on AI, how they break down ambiguous briefs, and whether they can spot bugs in AI-generated code.
Sierra rewrote their entire onsite around AI. Plan, build for two hours with full AI access, then review. Debugging rounds hand you a codebase and PR drafts to improve with coding agents. Harlem Capital, Crux Analytics, BlueAlpha. Same move.
Not everywhere yet. Plenty of companies still proctor with anti-AI software, lock the browser, and watch your tab switches. The stealth trick has a market for now. But the companies that set the bench, the ones whose hiring practices get copied six months later, have already flipped.
And yes, AI doesn't always make you faster. METR ran a 2025 field study on experienced open-source devs. With AI active on real tasks, they ran 19% slower. They thought they'd be faster. They were wrong. The interview is built around judgment, not speed. When to lean on the tool and when to set it aside.
You learned to cheat just in time for cheating to become the test.
Two Catches Before You Sprint.
Today's skill is not tomorrow's leverage. Stefan Stern, visiting professor at Bayes Business School, gave Sherwood the cleanest pushback: "attitude is a more important consideration than today's aptitude." An employer that over-indexes on current vibe coding skill risks missing the candidate who would learn faster and overtake today's experts within six months. Smart hiring managers know this trap. They watch for the candidate who didn't ship the cleanest prototype but explained their reasoning best.
Skill atrophy is real. If you delegate every line of code and every decision frame to the AI, you lose the ability to judge what comes back. And judgment is exactly what the recruiters are testing. The candidate who shipped the prototype but can't say why they used the tool here and not there fails the interview just as hard as the one who didn't ship at all. I wrote up the move from gambling with AI to actually shipping in a method built after enough vibe coding disasters of my own, and the symptoms of skill atrophy are very specific.
What AI can't fabricate for free is judgment. The interview is built around that. That's the whole test.
The New Rules. Start This Week.
1. Pick one tool. This week. Not three. Lovable, Cursor, Claude Code, Replit, whichever. The choice matters less than the wrist time. Sit down with one of them on Saturday and build something tiny. Don't watch tutorials. Don't read comparisons. Build. The 8-step method in Vibe Coding, For Real was written for this exact problem because most non-devs spend three weeks "researching" and ship nothing.
2. Ship one tiny thing. Public. With a URL. A landing page that takes a form. A small tool that does one thing. Anything that has a public URL and survives a real user clicking on it. Recruiters can sniff a side project that lived for two days. They want to see something that survived its own deploy. If you want a quick checklist of what separates a real shipped thing from a demo that explodes the moment a real user touches it, the credit-burn guide on Lovable covers most of the failure modes for non-devs starting out.
3. Expect the cliff. Your first prototype will work. Your second one will explode. The vibe coding learning curve has a classic drop right when you go from "demo on my laptop" to "must stay up for 24 hours and not leak the database." It happens to everybody. The fault isn't you (you're the first little pig finding out straw doesn't hold against an actual wind).
4. Be able to say where you used AI and where you didn't. And why. This is the test Crux Analytics literally runs. Pick a small project, document your decisions, and rehearse a 90-second walkthrough. Where did you trust the first answer? Where did you push back? Pick the moment something broke and explain what you changed. Karen from Accounting is going to ask you that exact question in November, even if she calls it something else.
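If "tiny" in step 2 sounds abstract, here is the scale it means, with the comments doing the work step 4 asks for. This is a sketch, not a prescription: the tool, its name, and the design note are illustrative, not taken from any interview brief in this article.

```python
# slug.py: a one-purpose tool, small enough to ship in a weekend.
import re

def slugify(title: str) -> str:
    """Turn a post title into a URL slug: lowercase, hyphens, nothing else."""
    # A decision worth narrating in an interview: a regex whitelist instead
    # of chained .replace() calls, because it survives punctuation you
    # didn't anticipate. This is the "where and why" question from step 4.
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

if __name__ == "__main__":
    print(slugify("Vibe Coding, For Real!"))  # vibe-coding-for-real
```

Fifteen lines, one job, a reason behind the one non-obvious choice. Wrap it in a form on a public URL and you've done steps 2 and 4 in a single weekend.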
200 Applied Today. 1,000 Will Apply Tomorrow.
The shift isn't hypothetical, it's already running. The 200 applications that hit one role in 48 hours this month aren't a spike, they're the new floor. Peter Grafe threw 190 PDFs in the bin without reading them and hired from the 10 who shipped a prototype in five days.
The only question left is whether you're in the 200, or in the 800 nobody opens.
Open Claude Code this weekend. It's shorter than your cover letter.
Sources
- How AI is turning every job interview into a coding interview, Chris Stokel-Walker, Sherwood News, 6 May 2026: https://sherwood.news/tech/use-ai-interview/
- The State of Skills-Based Hiring 2025 Report, TestGorilla: https://www.testgorilla.com/skills-based-hiring/state-of-skills-based-hiring-2025/
- Measuring the Impact of Early-2025 AI on Experienced Open-Source Developer Productivity, METR, July 2025: https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/