You ask an AI to write a function that adds two numbers. You get back a factory pattern, a dependency injection container, three abstract classes, and a README.
I've been watching this pattern play out for months now, and it's genuinely hilarious — until you realize people are shipping this stuff to production.
The Enterprise Pattern Reflex
Here's what happens: you prompt an AI coding tool with something dead simple. "Write a Python function to read a CSV file." A human would give you five lines. The AI gives you a CSVReaderFactory with configurable parsing strategies, an abstract DataSource interface, custom exception hierarchies, and a config loader that reads from YAML.
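For contrast, here's roughly what those five human lines look like — a minimal sketch using the standard library's csv module, with a hypothetical function name:

```python
import csv

def read_csv(path):
    # Read a CSV file into a list of dicts, one per row.
    # No factories, no interfaces, no YAML config.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```

That's the whole thing. It handles the common case, and if you later need configurable parsing strategies, you can add them when you actually need them.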
It's not wrong, exactly. It's just absurd overkill.
The AI treats every problem like it might need to scale to a million users by Thursday. It doesn't know your context. It doesn't know this is a throwaway script to clean up a spreadsheet for your boss. So it defaults to the most "professional" patterns it absorbed from thousands of enterprise codebases during training.
Why This Actually Matters
Over-engineered code isn't just funny — it's expensive. Every unnecessary abstraction layer is a maintenance burden. Every design pattern applied without justification is a cognitive tax on whoever reads the code next (probably you, six months from now, confused by your own architecture).
Microsoft's internal data shows AI-generated code tends to be 20-30% more verbose than human-written equivalents. That verbosity isn't free. More lines means more surface area for bugs, more time in code review, and more documentation nobody will write.
There's a real skill to knowing when something should be simple. A senior developer's superpower isn't knowing all the patterns — it's knowing which ones not to use. AI hasn't learned that yet.
The Irony
The whole promise of AI coding tools is "move faster." But if you blindly accept the over-engineered output, you're building yourself a maintenance nightmare. You saved 10 minutes writing the code and created 10 hours of future debugging.
I've started treating AI suggestions the way I treat Stack Overflow answers from 2014: useful starting points, but I strip out about 60% before committing anything.
What To Do About It
Be specific in your prompts. "Write a simple function, no classes, no patterns" works surprisingly well. Tell the AI what you don't want.
Ruthlessly simplify. If you can delete an abstraction layer and the code still works, delete it. Every time.
Ask yourself: would I have written this if I had to maintain it alone? If the answer is no, it's probably over-engineered.
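To make "delete the abstraction layer" concrete, here's a toy before/after sketch. The names (GreeterFactory, greet) are invented for illustration, not from any real codebase:

```python
# Before: an abstraction layer nobody asked for.
class Greeter:
    def greet(self, name):
        return f"Hello, {name}!"

class GreeterFactory:
    # A factory with exactly one thing to build.
    def create(self):
        return Greeter()

# After: the same behavior, one function.
def greet(name):
    return f"Hello, {name}!"
```

Both versions produce identical output. Only one of them needs a diagram to explain.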
The best code isn't clever. It's boring, readable, and does exactly one thing well. AI will get better at knowing when to keep things simple. Until then, that's your job.