DEV Community

Carole

The Ceiling Just Moved. Most Engineers Haven't Looked Up Yet.

You're probably using AI to write code faster. So is everyone else. That's not the edge you think it is — and I didn't realize what the actual edge was until I stopped optimizing for speed and built something from scratch.

Starting before I was ready

I came to engineering from journalism. For years I found stories inside noise, figured out what audiences actually needed versus what they said they wanted, and translated complexity into something people would care about. When I moved into tech, I learned quickly that those instincts were considered decorative. The real work was the code. Everything else was soft.

So I led with the technical and tucked the rest away.

When the AI boom hit, my instinct was the same one I've had at every career inflection point: study first, act later. I read comparisons, predictions, hot takes. I got more informed and less certain simultaneously — which, if you've been there, is a reliable sign you're consuming instead of learning.

The shift came when I got impatient enough to just make something. I wrote down the problems that genuinely frustrated me, picked one I cared about, and opened Cursor.

That move — from studying AI as a subject to using it as a collaborator — changed everything.

What I built

Every day I was drowning in content and starving for actual insight. Newsletters were too broad. Algorithmic feeds optimized for engagement, not relevance. Finding articles worth reading felt like a second job. As someone who spent years in newsrooms, I understand what it costs to miss important information. As an engineer, I understand what it costs to waste time. I wanted something that read the internet the way a sharp research assistant would — not "technology" as a category, but your specific context: API design, developer tooling, AI policy, whatever actually shapes your work and decisions.

So I built Clipper News: a personalized curation tool that scrapes articles from RSS, Hacker News, NewsAPI, and Reddit every two hours, ranks them against your preference profile using source reputation, engagement signals, and thematic relevance, generates AI-powered summaries via Claude, and delivers a digest on your schedule. A feedback loop refines the rankings over time.
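The ranking step can be sketched as a weighted blend of the three signals, scaled by a feedback multiplier. This is an illustrative toy, not Clipper's actual implementation; the field names, weights, and the shape of the feedback boost are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Article:
    source_reputation: float  # 0..1, editorial quality of the outlet
    engagement: float         # 0..1, normalized upvotes/comments
    relevance: float          # 0..1, similarity to the preference profile

def rank_score(article: Article, feedback_boost: float = 1.0,
               weights: tuple[float, float, float] = (0.3, 0.2, 0.5)) -> float:
    """Blend the three signals, then scale by the feedback multiplier.

    Weights are illustrative; relevance dominates so the digest tracks
    the reader's stated interests rather than raw popularity.
    """
    w_rep, w_eng, w_rel = weights
    base = (w_rep * article.source_reputation
            + w_eng * article.engagement
            + w_rel * article.relevance)
    return base * feedback_boost

# Rank a small batch, highest score first.
articles = [
    Article(source_reputation=0.9, engagement=0.4, relevance=0.2),
    Article(source_reputation=0.5, engagement=0.3, relevance=0.9),
]
ranked = sorted(articles, key=rank_score, reverse=True)
```

The feedback loop would then nudge `feedback_boost` per source or topic as the reader upvotes or skips items, which is how the rankings refine over time.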

This is what the MVP landing page looks like.

The stack: FastAPI on the backend, PostgreSQL, Redis with ARQ for background job scheduling, Next.js 14 with Tailwind and shadcn/ui on the frontend, Auth0 for authentication, Stripe for billing across three tiers (free trial, Pro, Team), and SendGrid for email delivery.
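The two-hour scrape cadence maps naturally onto ARQ's cron support. Here is a worker-config sketch under the assumption that an async `scrape_all_sources` task exists elsewhere; the task name and Redis settings are illustrative.

```python
# ARQ worker configuration: run the scrape at minute 0 of every even hour.
from arq import cron
from arq.connections import RedisSettings

async def scrape_all_sources(ctx):
    ...  # fetch RSS, Hacker News, NewsAPI, and Reddit, then enqueue ranking

class WorkerSettings:
    redis_settings = RedisSettings(host="localhost", port=6379)
    cron_jobs = [
        # 00:00, 02:00, ..., 22:00
        cron(scrape_all_sources, hour=set(range(0, 24, 2)), minute=0),
    ]
```

Running `arq module.WorkerSettings` would then keep the schedule alive as a long-lived background worker alongside the FastAPI app.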

Two integrations I'd never touched before — Auth0 and Stripe — were both functional within hours. Auth0's documentation was genuinely impressive: clear, well-structured, designed for developers hitting it for the first time. Stripe was the same — their guides walked me through checkout sessions, customer portal setup, and webhook handling in a way that felt like the product wanted me to succeed. It was a reminder that good documentation is itself a product decision, and one that directly affects adoption.
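The webhook handling Stripe's guides walk you through rests on a simple scheme: Stripe signs the string `"<timestamp>.<payload>"` with HMAC-SHA256 using your endpoint secret and sends the result in the `Stripe-Signature` header. The official `stripe` library's `Webhook.construct_event` does this for you (plus timestamp tolerance checks); the stdlib sketch below just shows the mechanism, and handles only a single `v1` entry for brevity.

```python
import hashlib
import hmac

def verify_stripe_signature(payload: bytes, sig_header: str, secret: str) -> bool:
    """Check a Stripe-Signature header of the form 't=<ts>,v1=<hex>'.

    Stripe signs '<ts>.<payload>' with HMAC-SHA256 using the endpoint
    secret; in production, prefer stripe.Webhook.construct_event, which
    also enforces a timestamp tolerance against replay attacks.
    """
    parts = dict(item.split("=", 1) for item in sig_header.split(","))
    signed_payload = f"{parts['t']}.".encode() + payload
    expected = hmac.new(secret.encode(), signed_payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, parts["v1"])
```

Only after this check passes should a handler trust event types like `checkout.session.completed` enough to flip a user's billing tier.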

The MVP works. Production hardening and live deployment are still ahead. But the core does exactly what I set out to build.

Categories of topics you can select on Clipper.

What your curated digest looks like.

But the most interesting part of the build wasn't the stack. It was what happened to the way I think.

The question that changed how I think about AI

Building alone means every decision lands on you. Not just the technical ones — the ones that determine whether something is useful.

Which database makes sense for this product now, and what needs to change when it scales? What belongs behind a paywall and what should stay free to build early trust? How should the ranking algorithm balance recency against thematic relevance when they conflict? What does a first-time user need to see in their first digest — and what will make them close the tab?
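The recency-versus-relevance question above is the kind of judgment call that becomes concrete once you pick a knob to turn. One common shape is an exponential recency decay blended with the relevance score; the half-life and weight here are hypothetical, not Clipper's actual parameters.

```python
import math

def blended_score(relevance: float, age_hours: float,
                  half_life_hours: float = 24.0,
                  recency_weight: float = 0.4) -> float:
    """Blend thematic relevance with an exponential recency decay.

    half_life_hours is the tuning knob: shrink it and fresh-but-mediocre
    articles win; grow it and older-but-relevant ones hold their rank.
    """
    recency = math.exp(-math.log(2) * age_hours / half_life_hours)
    return (1 - recency_weight) * relevance + recency_weight * recency

# A day-old, highly relevant piece vs. a fresh, loosely relevant one.
old_relevant = blended_score(relevance=0.9, age_hours=24)
fresh_loose = blended_score(relevance=0.3, age_hours=0)
```

With these settings the older, highly relevant article still outranks the fresh, loosely relevant one; halving the half-life flips the outcome. That flip point is exactly the product decision, not a technical one.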

A quick note on how this works in practice: Cursor has two distinct modes. Agent mode takes a goal and executes — it writes code, runs commands, edits files, moves fast. Plan mode slows that down on purpose. It reasons through the problem first, asks clarifying questions, maps out what it's about to do and why. I stopped reaching for agent mode by default. Plan mode is where the real thinking happens.

These aren't implementation questions. They're judgment calls. And inside most companies, they get made long before the work reaches an individual engineer.

What Cursor's plan mode gave me wasn't faster code. It was a thinking partner available at every layer simultaneously — architecture, product logic, user experience, pricing, API constraints — so I could hold the entire product in mind at once instead of moving through each domain sequentially. The switching cost between developer, product manager, UX designer, and QA engineer dropped low enough that I could be all of them in the same working session.

Not because AI gave me skills I didn't have. Because it removed the cognitive overhead that normally forces you to think about those things one at a time.

That's when I realized: the real constraint on what most individual engineers can build has never been technical skill alone. It's been cognitive load — the sheer cost of coordinating across domains. The organizational structures we work inside evolved to manage that cost, and in doing so, they created a world where deep specialization was the only way to go deep on anything.

AI is loosening that constraint. Not eliminating it. Depth still matters enormously. But loosening it enough to change what's possible for a single person who's willing to think across the full surface area of a problem.

The shift beneath the productivity conversation

The dominant narrative about AI and developers is about speed. Shipping velocity. Lines of code per hour. Role compression. That conversation is real, but it's focused on the wrong layer.

The more significant shift is what AI does to the ceiling — the upper bound of what one person can seriously engage with. For most of the history of software, that ceiling was set by specialization. You owned your domain, and the coordination cost of touching anything beyond it was high enough to make the boundary rational.

That boundary is moving. Which means the question changes. It stops being "can I build this?" and starts being "do I understand the problem well enough to know what's worth building, and for whom?"

That's a harder question. It requires understanding who uses something and why, not just how it works. It requires making judgment calls when the technical path is clear but the product direction isn't. It requires caring about the architecture and the user experience and the business model, and seeing how each one shapes the others.

The engineers getting the most from this moment aren't the ones with the deepest knowledge of any single tool. They're the ones willing to work at the edges of what they know — to start before they feel ready, and to let the work close gaps that preparation alone never would. They're not abandoning their depth. They're using it as a foundation to move laterally in ways they couldn't before.

The beginner's mind isn't a soft skill. It's a professional one.

What I'm taking away

I built something useful, with a stack I'd never fully worked with, making every product and architectural decision myself — and the thing that made it possible wasn't any single technical capability. It was the combination: the engineering foundation, the product instinct from years of understanding audiences, the ability to ask what is this actually for and let the answer drive the decisions.

That combination isn't rare because it's hard to develop. It's rare because engineering culture hasn't historically rewarded it. AI is changing that faster than most job descriptions are catching up.

The ceiling moved. The only question is whether you've looked up.

If you're building something, questioning something, or just trying to make sense of where this industry is heading — I'd love to connect. Find me on LinkedIn or leave a thought below. Views expressed are my own and do not represent my employer.
