The longer I write software, the more my sense of “impressive” changes. What actually amazes me these days isn’t modern technology, but older systems — and the people who built them.
Take Windows 95. A full operating system from almost 30 years ago. GUI, drivers, multitasking, multimedia, process and thread management. All of it lived in roughly 50 MB on disk and ran on 4–8 MB of RAM.
Now compare that to today. The browser tab I’m using to type this text currently takes over 1 GB of memory. I’m not compiling anything. I’m not rendering video. I’m just editing text.
That number alone would horrify engineers from the 1980s — people who ran full multi-user Unix environments on machines with 2 MB of RAM and 20 MB hard drives. Entire development workflows — editors, compilers, networking, users — fit inside constraints that feel impossible now.
Even small things today feel heavy. A simple “Hello, World” after activating a virtual environment can easily pull in tens of megabytes of libraries before any real logic runs. Not because the problem is complex, but because the ecosystem around it is.
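One way to see this from inside a running process is to total the on-disk size of every module the interpreter has loaded. The sketch below uses only the Python standard library; `email` stands in for a "heavy" import here, and the exact numbers will differ per interpreter and platform:

```python
import sys
from pathlib import Path

# Ordinary-looking imports that quietly pull in many submodules.
import json
import email

def imported_bytes():
    """Sum the on-disk size of every module currently loaded."""
    total = 0
    for mod in list(sys.modules.values()):
        f = getattr(mod, "__file__", None)  # built-ins have no file
        if f and Path(f).is_file():
            total += Path(f).stat().st_size
    return total

print(f"{len(sys.modules)} modules loaded, "
      f"{imported_bytes() / 1024:.0f} KiB of module files on disk")
```

Run the same script inside a typical project virtual environment and the total jumps by orders of magnitude before a single line of application logic executes.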
The Disappearing Discipline
What surprises me isn’t that hardware became faster. That was inevitable. What surprises me is how abundance changed our behavior. We lost our software manners.
The constraint of scarcity once enforced an unwritten code of conduct:
- Memory was precious — you cleaned up after yourself
- Every cycle counted — you thought before you looped
- Dependencies were earned — you didn’t pull in libraries for trivial tasks
- Abstractions were understood — you knew what happened under the hood
Old systems weren’t magical. They were constrained — and that constraint forced discipline. We’ve traded that discipline for convenience.
The Professional Paradox
Here’s where it gets professionally painful: The system now rewards waste.
If you don’t use the tonne of libraries, cloud SDKs, and abstraction layers that consume resources (and cash) at runtime, you risk being passed over for Developer B who doesn’t care. Developer B is the “deliverer” — they ship fast, consequences be damned!
The metrics are stacked against careful craftsmanship:
- Velocity > Efficiency
- Features shipped > Resources consumed
- Time to market > Technical debt considered
- Framework familiarity > Understanding fundamentals

We've created a world where the most "productive" developer is often the one who piles abstraction upon abstraction, dependency upon dependency, until the entire structure is so bloated that it requires hardware upgrades just to maintain parity, and inflates cloud costs along the way.
The Cost of Bad Manners
This isn’t just about nostalgia. The consequences are real:
- Environmental impact: We’re burning megawatts to run inefficient software that does simple tasks
- Accessibility erosion: Software that requires the latest hardware excludes users with older devices
- Security fragility: Layers of dependencies create attack surfaces we don’t understand
- Innovation stagnation: When all our energy goes into maintaining bloat, we have little left for genuine breakthroughs

The engineers who built C and Unix on PDP-11s with kilobytes of memory weren't just clever. They were considerate. They considered the hardware, the next programmer, the user's resources. That consideration was their professional ethic.
Relearning Our Manners
So yes, we’re losing our manners. But manners can be relearned. It starts with small acts of consideration:
- Question dependencies: “Do I really need this 50 MB library for a simple task?”
- Profile relentlessly: Know what your code actually does, not what you think it does
- Understand one layer down: Know what happens beneath your abstraction
- Advocate for efficiency: Make performance a feature, not an afterthought

The most impressive software isn't what uses the most resources; it's what accomplishes the most with the least. That discipline, that consideration, that professional courtesy toward the machine and the user: that's what we need to reclaim.
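"Profile relentlessly" is easy to act on with Python's standard-library `tracemalloc`. The function names below are illustrative, not from any particular codebase; the pattern is simply: measure peak allocations for two implementations of the same task, then keep the cheaper one.

```python
import tracemalloc

def total_list(n):
    # Builds the entire intermediate list in memory before summing.
    return sum([i * i for i in range(n)])

def total_gen(n):
    # Streams one value at a time; no intermediate list is built.
    return sum(i * i for i in range(n))

def peak_kib(fn, n=200_000):
    """Peak memory (KiB) allocated while running fn(n)."""
    tracemalloc.start()
    fn(n)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak / 1024

print(f"list comprehension peak:   {peak_kib(total_list):.0f} KiB")
print(f"generator expression peak: {peak_kib(total_gen):.0f} KiB")
```

The absolute numbers vary by interpreter and platform, but the gap between the two is what matters: same result, very different appetite. Knowing that gap exists is exactly the "one layer down" understanding the list above asks for.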
Because in the end, software development isn’t just about making computers do things. It’s about how we choose to exist in a world of limited resources. And good manners, it turns out, are just as important in code as they are in life.



Top comments (4)
That is the negative consequence of over-provisioning.
Developers get fast machines and fast internet connections, and then they assume everyone else has access to the same means.
The same is happening with hosting: even websites with a purely local audience are expected to load fast on the other side of the world. And thanks to hyperscalers like AWS, Azure and Google Cloud, it's possible.
Don't get me started on AI, people are expected to have multi gigabyte models on every device they own.
The tech industry needs to do less disrupting and add more value.
Thanks for your post! I mostly agree, but it's funny that you cite Windows 95 as a good example: that unstable pre-NT desktop operating system built on top of MS-DOS that needed 9 floppy disks to install. Software like Windows 95 was one of the reasons the Linux community developed desktop environments; developers were craving stable alternatives to Microsoft, and Steve Jobs hadn't yet decided to bring his NeXTSTEP ideas back to Apple.
I appreciate memory saving and efficiency, now as much as back then, but I'd rather point to 1980s home computer game development or the 1990s demo scene instead. Now you can retort and point out what idiosyncratic chaos Commodore assembly development was ;-)
This really resonates with me. I’ve been building software for a long time, and I recognise that shift you’re describing — from being careful and intentional, to assuming the machine (or the cloud) will just absorb whatever we throw at it.
A big reason I created myapi.rest was actually because of this. I kept seeing teams re-implementing the same small utilities over and over, often in rushed ways, with little thought for performance, limits, or long-term maintenance. It felt… a bit careless.
For me, software manners is about respect — for the system, for the next developer, and for the future version of yourself. Even when things are “cheap” or abstracted away, I still believe there’s value in doing the small things thoughtfully.
I think your post brings a few interesting topics up.
There is probably a balance to be struck between importing the world with little regard to consequences, and optimizing performance down to the last byte.
Different applications in different contexts will land on different points on this spectrum. For example, a prototype webapp built to show a concept to a user would probably sit at the less optimised end, with less polished "manners".
Whereas an application where the concept is more concrete, and the focus is on delivering performance that is good enough for now and the next couple of years, would probably fall closer to the other end, with better manners and more optimisation.
Sometimes even within the same application, different parts sit at different points on the spectrum. Say we're a company whose USP is being an insurance broker: we'd want the insurance-broker core of our application put together very well, and we'd spend most of our time there, while spending less on supporting functions like an admin UI or reporting. We'd still build those to a good-enough quality, but probably with more off-the-shelf tooling, since that's not where the core of our business is.