Artificial Intelligence is exciting—the technological breakthrough that took over our lives shows no signs of slowing down.
Acknowledging the problematic copyright issues that stem from AI usage, I cannot deny that it is a powerful tool when placed in the right hands for the right purpose—but with great power comes great responsibility.
Amid the excitement and tech moguls' gold rush to the next model, have we stopped to consider the environmental cost of running these models—with their hundreds of billions of parameters—for hours on end, scaled across multiple instances, serving millions of users simultaneously?
As a long-time advocate of the "minimal web" concept, I've noticed a contradiction: on one hand, all of us are eager to run towards technological advancements (e.g. AI); on the other hand, everyone is concerned about the planet's health. Yet only a few consider the conflict between the two: the massive energy cost of AI. Is it efficient? Is it green? Far from it.
AI Energy Consumption
Can we calculate the energy cost of AI models available as services online? The truth is, we can't, not without access to their datacenters. But we can arrive at a fairly realistic estimate.
On my older GPU (equivalent to a GTX 1060), generating an image with Stable Diffusion took 10 minutes. My online research showed that the Nvidia A100 is the most commonly used GPU for AI/ML in the cloud, and it consumes up to 400W under load.
The Nvidia A100 is a very powerful GPU designed specifically for AI/ML workloads. Based on online benchmarks I've found, it can generate a Stable Diffusion image in under 5 seconds.
Using the formula E = P × T: 400 W × (5 / 3600) h ≈ 0.56 Wh, so running an Nvidia A100 at load for 5 seconds consumes approximately 0.5 watt-hours.
Midjourney and similar AI image generation services typically generate 4 images per prompt. Assuming these services use an array of Nvidia A100s, we can estimate they consume about 2 watt-hours (4 × 0.5 Wh) per prompt, per user. As of January 2025, Midjourney has nearly 20 million daily active users.
2 Wh × 20,000,000 users × 24 prompts/day = 960,000,000 Wh = 960,000 kWh
Based on these calculations, services like Midjourney consume around 960,000 kWh per day, assuming each user executes one prompt per hour on average. In reality, the actual energy consumption could be higher.
This amount of energy alone could power over 25,000 average-sized households per day.
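If you want to play with these assumptions yourself, here is a minimal Python sketch of the same back-of-envelope estimate. Every constant comes from the figures above except the household consumption of 30 kWh per day, which is my own rough assumption.

```python
# Back-of-envelope sketch of the estimate above. All constants are the
# post's assumptions except HOUSEHOLD_KWH_PER_DAY, which is my own rough
# figure for an average household's daily consumption.

A100_POWER_W = 400                # assumed A100 power draw under load
SECONDS_PER_IMAGE = 5             # assumed Stable Diffusion time on an A100
IMAGES_PER_PROMPT = 4             # Midjourney-style grid of 4 images
DAILY_ACTIVE_USERS = 20_000_000
PROMPTS_PER_USER_PER_DAY = 24     # one prompt per hour, on average
HOUSEHOLD_KWH_PER_DAY = 30        # assumption, not from the post

# E = P * T, converted from watt-seconds to watt-hours
wh_per_image_exact = A100_POWER_W * SECONDS_PER_IMAGE / 3600   # ~0.56 Wh
wh_per_image = 0.5                # rounded down, as in the post

wh_per_prompt = wh_per_image * IMAGES_PER_PROMPT               # 2 Wh
daily_kwh = (wh_per_prompt * DAILY_ACTIVE_USERS
             * PROMPTS_PER_USER_PER_DAY) / 1000                # 960,000 kWh

print(f"Energy per prompt:  {wh_per_prompt:.1f} Wh")
print(f"Daily consumption:  {daily_kwh:,.0f} kWh")
print(f"Households powered: {daily_kwh / HOUSEHOLD_KWH_PER_DAY:,.0f}")
```

With these inputs the script reproduces the 960,000 kWh figure and puts the household equivalent at roughly 32,000, consistent with "over 25,000" above.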
Keep in mind that our example only touches image generation services. Let's not forget about other AI services like ChatGPT, Gemini, Claude, Deepseek—and all their versions and variations. The list, and energy waste, continues to grow.
How can we, as developers, researchers, and entrepreneurs, collaborate to address this emerging issue? What steps can we take to promote efficient and responsible energy use when it comes to AI? I'd love to hear your thoughts on this challenge and potential solutions in the comments below.
Top comments (15)
This topic is very close to me; I wrote a couple of articles about it. Sharing the links here:
DarkSIDE of AI : Power Hungry process
Bala Madhusoodhanan ・ Jun 24 '24
Carbon-Aware Software Development
Bala Madhusoodhanan ・ Oct 7 '24
Nice! I didn't even go into the cost of training the models, as you did. Great perspective, I think we should try to raise awareness of this topic in the community.
I think there was some news not that long ago about Big Tech making plans to build nuclear power plants to supply electricity for AI? Am I wrong?
Haven't heard anything about it, but I don't like it already (surely I'll find a solid reason not to like it in the future).
Well, I might be wrong, don't remember where I got that from and I feel too lazy to search...
Anyway, you guys seem interested in the topic, I had this (somewhat related) idea about a post, but I'm not that good at writing... I think you might be better prepared to write about it, so, I'll share it here - in case you might want to take it further (cos I most probably won't :)
Just a bit of a warning - this is what my cynical mind's been telling me for the past year or so... :)
Essentially what we're seeing with GenAI (Nuclear power plants or not) is Wall Street to the fullest.
If you've got some billions and a deep understanding of Wall Street, you can do a couple of things:
In that respect, I strongly believe there are a few very Wall-Street-smart people behind GenAI who've realized the world is ready for AI stock. And they don't have a problem burning billions subsidizing AI startups (including subsidizing the cost of each request), or engaging all the internet tabloids and influencers to proclaim this is the next big thing that will revolutionize everything.
Because they know that whatever goes down the drain of an unprofitable enterprise will be recovered many times over if they can convince Wall Street traders that indeed - if they buy shares now there will be much higher demand for the stock in the future.
Just raise expectations through exposure on all sorts of media, subsidize all costs (so that users don't have to ask "is this really worth it?"), get traction, make bold statements like "we've got this better than others, because we'll invest another 50 billion"... and so on. Everything you can possibly do to show your shares will be selling higher in the future.
A favourite example of mine, the previous 'next big thing': companies like Uber. Last time I checked, some $33 billion had gone down the drain without any signs of a potential break-even. At the same time, its market capitalization on Wall Street is around $150 billion. So, technically, you can waste billions on a company that will never break even, as long as you make roughly five times more on Wall Street.
And if you look at the amount of content about GenAI online at the moment, the amount of 'influencing for' GenAI, it's significantly more than what Uber got back then... So, I think for someone Wall-Street-smart there's no issue spending more, like a hundred billion, for what could potentially be a trillion-dollar IPO.
Bear in mind that in this scenario it doesn't matter how useful the service really is, whether it will ever make a profit, how much it costs, how sustainable it is, or whether it will still be around when the 'next big thing' comes along. (Personally, I'm waiting to see where Uber will go if money starts moving out of it and into GenAI.)
What matters is convincing investors that by investing more you're more likely to win the race. (Unless the Chinese come along and troll you ;)
And going back specifically to nuclear power plants - in the above context - if you announce you've got the money to subsidize a nuclear power plant for your AI 'next big thing' - you do look much more dedicated... :)
I know, it's weird, but I strongly believe that's the way it works these days... :)
It always comes back to this in the end. It doesn't matter whether your product is a quality product, whether your company is profitable, or even if people want your product or not (we were completely fine before LLMs).
The only thing that truly matters is how good your pitch is, how much shareholder money you can circle around, and how big your media coverage is.
OpenAI (or should I say ClosedAI?) is a perfect example of a company that burned $17B to create a product--mediocre at best--without any technological edge. Deepseek is proof that any company with enough GPUs can do what ClosedAI did, for cheaper. That being said, Altman is an expert salesman, but the emperor has no clothes.
This reminds me of the revolving door concept, but instead of government and regulators, it's shareholders and CEOs.
I think writing a full article about this topic is too fringe for the masses, honestly. People have a hard time accepting the truth, even if it's right in front of them. The past few years have proved it, and I have stopped trying to convince people to see the world the way I see it.
I still (sort of) believe some counter-influencing is good, but more and more often don't feel like actually doing it :)
Generally, I agree, but I admit I've exhausted myself trying to do it :)
Why did you write this post then (if you don't mind me asking)?
Good question, I am not sure. I crunched some numbers a while ago and thought it would make an interesting post.
Just to clarify my earlier comment: what I meant was that I've exhausted myself trying to convince un-like-minded people, who are the majority. This mostly happened face to face, with people who I presumed were wise enough to see what I see, but alas... I then realized intelligence is subjective, and having a master's degree does not necessarily mean having critical thinking skills.
Online, it's a different story--if they want to read it and are "my sort of people"--great, if not, they don't have to.
This question also interests me personally, including the copyright question.
Let me share my case: I started working with Midjourney on 2024/12/06, and until now I have created 2,800 tasks, which means 11,200 images. That works out to about 204 images per day. Of course, this is an intensive development process.
But you can read about my project here:
dev.to/pengeszikra/javascript-grea...
I used 8 different AIs in parallel to reach my goal. Midjourney was probably used the most intensively.
After I submitted my game for a hackathon, I tried to summarize what I think about copyright: dev.to/pengeszikra/stolen-content-...
I hope this helps to measure intensive AI usage by one developer.
In a discussion I had recently with a friend of mine, I joked that we should put a service online where people can host their own AI models at true cost: chips, infrastructure, electricity, research and everything. :)
This will surely be a huge flop :)
Not so sure about "huge" - I think it might be doable for something likeee... 20 euros/month... :)

cool article