The standard technical interview tests whether you can reverse a linked list on a whiteboard. The actual job requires understanding a 200K-line codebase and making safe changes to systems you didn't build.
The gap between interview skills and job skills has always existed. AI made it wider.
What Changed
Developers with AI tools can solve LeetCode problems trivially. Copilot handles the syntax. Claude handles the algorithm. The "hard" interview question is now a 2-minute prompt.
But AI can't do the hard part of the actual job alone: understanding unfamiliar code, tracing dependencies, making changes without breaking things. That work requires judgment, context, and system-level thinking that no amount of autocomplete provides.
What Interviews Should Test
1. Codebase Navigation
Give the candidate an unfamiliar repo and a bug report. Watch how they find the relevant code. Do they grep intelligently? Do they trace call paths? Do they check git blame for context?
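The "grep intelligently" step can be sketched in a few lines of Python. This is a minimal toy search over a throwaway fixture, not a real repo — the file names and the "invalid token" error string are invented for illustration. The point it demonstrates: start from the user-visible error message, because it anchors you to the code that produced it.

```python
# Minimal sketch of grepping an unfamiliar tree for a bug report's
# error string. The fixture below stands in for a large repo;
# paths and the error message are hypothetical.
import tempfile
from pathlib import Path

def search(root: Path, needle: str) -> list[str]:
    """Return 'path:lineno: line' hits for needle under root."""
    hits = []
    for path in sorted(root.rglob("*.py")):
        for lineno, line in enumerate(path.read_text().splitlines(), 1):
            if needle in line:
                hits.append(f"{path.relative_to(root)}:{lineno}: {line.strip()}")
    return hits

# Tiny fixture standing in for a 200K-line codebase.
root = Path(tempfile.mkdtemp())
(root / "auth").mkdir()
(root / "auth" / "tokens.py").write_text(
    'def verify(t):\n    raise ValueError("invalid token")\n'
)

print(search(root, "invalid token"))
```

In a real session the candidate would reach for `grep -rn`, then `git log` and `git blame` on whichever file the search surfaces — the interview signal is whether they start from the error string at all.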
2. Dependency Reasoning
"If we change this function's return type, what breaks?" This tests whether they think about downstream effects — the skill that prevents production incidents.
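A concrete version of that question, as a hedged Python sketch — the function names (`get_user`, `format_badge`) are invented for illustration. The caller below depends on the dict shape of the return value; switching `get_user` to return a dataclass breaks it at runtime, which is exactly the downstream effect the candidate should enumerate before touching the function.

```python
# Hypothetical example of a return-type change and its blast radius.
from dataclasses import dataclass

# Before: get_user returns a dict.
def get_user(user_id: int) -> dict:
    return {"id": user_id, "name": "Ada"}

# A downstream caller that depends on the dict shape:
def format_badge(user_id: int) -> str:
    user = get_user(user_id)
    return f"{user['name']} (#{user['id']})"

# Proposed change: return a dataclass instead.
@dataclass
class User:
    id: int
    name: str

def get_user_v2(user_id: int) -> User:
    return User(id=user_id, name="Ada")

# Swapping get_user_v2 in for get_user makes format_badge raise
# TypeError (dataclasses don't support user['name'] indexing).
# The candidate's job is to find every caller like format_badge
# first -- a static type checker helps, but only if they think
# to run it.
```

A strong answer also mentions mitigation: migrate callers in the same change, or keep the old interface temporarily while deprecating it.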
3. Context Gathering
Present a ticket with ambiguous requirements. How do they clarify? Do they identify the unstated assumptions? Do they map the blast radius before planning?
4. AI Tool Usage
Let them use AI tools during the interview. The question isn't "can they solve it without AI?" It's "can they use AI effectively to solve a problem that requires codebase understanding?"
The Meta-Skill
The most valuable developer skill in 2026 isn't writing code. It's understanding systems well enough to tell AI what to write. Interview for that.
Originally published on glue.tools. Glue is the pre-code intelligence platform — paste a ticket, get a battle plan.