A submission for Weekend Challenge: Earth Day Edition
Let me be perfectly clear right up front: I am not participating for any prize money or category. I am not building a massive AI tool or spinning up a blockchain node today. I am here purely as a participant to claim the completion badge and make a blunt point about how we build the web.
Every tutorial pushes us to use heavier frameworks, more API calls, and bloated libraries. But data centers run on electricity, and electricity often comes from fossil fuels. So, for Earth Day, my "project" is a practical teardown of web bloat.
What I Built
I didn't build a new application that will sit on a server burning energy for no reason. Instead, I built a Lean Web Architecture Standard for my own future projects.
The goal is simple: reduce the carbon footprint of the websites we build by cutting out the junk. As a developer currently deep into HTML, CSS, JavaScript, and PHP, I realized that relying on vanilla code instead of dragging in 500MB of node_modules is the most eco-friendly thing I can do right now.
My standard focuses on three things:
- Zero-dependency frontends wherever possible.
- Aggressive caching via server configurations (like Apache .htaccess).
- Asset compression as a default, not an afterthought.
Demo
Since this is a methodology rather than a deployed web app, the "demo" happens in the terminal. Look at the difference between a standard boilerplate React app and a lean, vanilla PHP/JS setup for a simple storefront.
# The Bloated Way (High Energy Cost)
$ npx create-react-app bloated-store
$ du -sh bloated-store
245M bloated-store/
# The Lean Way (Low Energy Cost)
$ mkdir lean-store && cd lean-store
$ touch index.php style.css app.js
$ du -sh lean-store
12K lean-store/
The server requires significantly fewer CPU cycles to serve 12 kilobytes of static/lightly-rendered files than it does to compile and hydrate a massive JavaScript bundle. Less CPU = less power = lower emissions.
Code
Here is an example of what this looks like in practice. Instead of using a heavy JavaScript library to handle routing or basic UI states, we can handle it efficiently at the server level.
This is a snippet from a standard .htaccess file I use to clean up routing and enable caching. Caching is arguably the best "Green Tech" because it stops the server from doing the same work twice.
# 🌍 Enable Cache Control for Eco-Friendly Browsing
<IfModule mod_expires.c>
ExpiresActive On
# Cache images for a year (reduce redundant network requests)
ExpiresByType image/jpg "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType image/webp "access plus 1 year"
# Cache CSS and JS for a month
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>
# Clean URLs to avoid complex backend parsing
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^products/?$ products.php [L]
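The same zero-dependency spirit carries over to the client. Where a framework router would pull in tens of kilobytes, a plain lookup table often covers simple cases. A hypothetical sketch (the route names are illustrative, not from a real project):

```javascript
// A zero-dependency hash router: map URL hashes to view names.
// Hypothetical sketch -- route names are illustrative only.
const routes = {
  "#/": "home",
  "#/products": "products",
  "#/contact": "contact",
};

// Resolve a location hash to a view, with a fallback for unknown routes.
function resolveRoute(hash) {
  return routes[hash] || "not-found";
}

// In the browser, wire it to the hashchange event:
// window.addEventListener("hashchange", () =>
//   render(resolveRoute(location.hash)));
console.log(resolveRoute("#/products")); // -> products
```

That is the entire "router": one object and one function, instead of a dependency tree.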
How I Built It
My technical approach is rooted in subtraction.
When you strip away the noise, you are forced to understand the underlying systems better. I started looking into how Apache handles requests and how PHP interacts with it. By setting up strict caching rules and clean routing directly on the server, the client's browser doesn't have to work as hard, and the server doesn't have to repeatedly query the database or re-render identical pages.
I also made it a personal rule for my own projects: serve WebP images wherever browser support allows. WebP typically compresses noticeably smaller than equivalent JPEG or PNG files, and a smaller payload means less data transferred over the wire.
I didn't use any of the prize category technologies. I didn't need Copilot to write this, I didn't need Gemini to generate it, and I definitely didn't need a Solana blockchain to prove it. Sometimes, hard truths and vanilla code are enough.
Prize Categories
None. I am intentionally opting out of all prize categories.