We often talk about the Pareto principle in engineering (80% of the results come from 20% of the effort), but recently we’ve been looking at it from the other angle: the "Setup Tax." By our internal estimate, teams spend roughly 80% of their budget on "undifferentiated heavy lifting" (infrastructure, config, boilerplate, plumbing) and only 20% on unique feature value. We built a model to see what happens to organizational capacity if AI tools actually succeed in inverting that ratio.
The results of the modeling were surprising. It’s not a linear efficiency gain; it looks more like a leverage multiplier.
The Problem: The Setup Tax
We've all felt this. You start a project, and before you write a line of business logic, you're wrestling with Terraform state, CI/CD pipelines, scaffolding, and environment config. This is the Setup Tax.
In larger organizations, this compounds with what I’d call the "Coordination Tax" (basically Brooks’ Law). Adding more developers often increases the overhead rather than the velocity because the complexity of the setup requires more communication to manage.
The Approach: Modeling the Inversion
We ran a "back of the napkin" economic model across three scenarios (Startup: 5 devs, Mid-size: 25 devs, Enterprise: 100 devs).
We assumed a theoretical shift based on AI handling the heavy lifting:
- Traditional: 80% Setup / 20% Features
- AI-Assisted: 30% Setup / 70% Features
Mathematically, this isn’t just a 50% improvement. If you go from spending 20% of your time on features to 70%, that’s a 3.5x increase in feature capacity (or a 250% efficiency gain). You aren't coding faster; you're just wasting less time on the plumbing.
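The ratio math above can be sketched in a few lines. The only inputs are the post's two assumed feature-time splits; nothing here is measured data:

```python
# Feature-time shares from the two scenarios above, as whole percentages.
TRADITIONAL_FEATURE_PCT = 20   # 80% setup / 20% features
AI_ASSISTED_FEATURE_PCT = 70   # 30% setup / 70% features

# Going from 20% to 70% feature time is a 3.5x capacity multiplier...
capacity_multiplier = AI_ASSISTED_FEATURE_PCT / TRADITIONAL_FEATURE_PCT

# ...which is the same change expressed as a 250% efficiency gain.
efficiency_gain_pct = (capacity_multiplier - 1) * 100

print(f"{capacity_multiplier}x feature capacity")   # 3.5x feature capacity
print(f"{efficiency_gain_pct:.0f}% gain")           # 250% gain
```

The point of writing it out is that the gain is a ratio of shares, not a difference: a 50-percentage-point shift reads as "50% better" until you divide.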
When we ran this through the scenarios using standard salary data, the leverage effects got interesting:
- Startup (5 devs): The model showed roughly a 1,000% ROI on the tool cost, largely because small teams drown in setup.
- Enterprise (100 devs): We projected a ~440% efficiency gain.
The enterprise number came in higher than the ~250% gain the raw ratio math predicts, which surprised me. My theory is that in large orgs, standardization is the bottleneck. If AI enforces the "setup" standards (boilerplate, patterns), it reduces the Coordination Tax on top of the Setup Tax. You spend less time debating folder structure or linting rules because the AI just handles it.
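To make the ROI framing concrete, here is a minimal sketch of the kind of back-of-napkin model described above. The function shape is my reconstruction, and the salary and tool-cost figures in the example call are made-up placeholders, not the post's actual inputs, so the output won't match the 1,000% or 440% figures:

```python
def setup_tax_roi_pct(devs, avg_salary, setup_before, setup_after, tool_cost_per_dev):
    """Annual ROI (%) of reclaiming setup time, napkin-model style.

    setup_before / setup_after are fractions of total engineering time
    spent on undifferentiated heavy lifting (e.g. 0.80 -> 0.30).
    """
    # Dollar value of engineering time no longer spent on setup.
    reclaimed_value = devs * avg_salary * (setup_before - setup_after)
    tool_cost = devs * tool_cost_per_dev
    return (reclaimed_value - tool_cost) / tool_cost * 100

# Hypothetical inputs: 5 devs at $150k/year, tooling at $7,500/dev/year.
print(f"{setup_tax_roi_pct(5, 150_000, 0.80, 0.30, 7_500):.0f}% ROI")  # 900% ROI
```

Even with deliberately conservative placeholder inputs, the ROI lands in the high hundreds, which is the model's real claim: when the tool cost is small relative to salaries, reclaiming half of engineering time dominates everything else.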
Trade-offs & Fuzzy Math
I want to be honest about the limitations here. This model assumes the AI foundations are solid. If the AI generates 70% feature code on top of garbage infrastructure, you haven't reduced the Setup Tax; you’ve just converted it into a "Debugging Tax," which is arguably more expensive.

We also tried to factor in "Developer Retention" (assuming devs are happier when they aren't fighting configs), estimating a 31% reduction in turnover. But admittedly, putting a dollar value on happiness is fuzzy math. It intuitively makes sense—nobody likes glue code—but it's hard to prove on a spreadsheet.
Open Questions
We’re treating "setup" as waste, but I’m curious if that’s the right way to look at it.
- Does the 80/20 split (setup vs. features) match your experience in 2024? Or has modern tooling already shifted this?
- Is "setup time" actually valuable learning time? By struggling through the config, do we understand the system better? If we automate that away, do we lose architectural competency?
- How do you measure "undifferentiated heavy lifting" without intrusively tracking every minute?
Mostly, I’d love to hear whether anyone has seen this "inversion" actually play out on a real team, or whether the maintenance overhead of AI-generated code just eats up the savings.