Almost a year ago, I wrote an article "Rise of Local LLMs?", which was well received. Now feels like the perfect time to revisit this idea, especially after recent events that have shown open-source AI overtaking proprietary models.
Chapter 1: AI, Global Politics, and the Battle for Control
In many ways, AI today is being treated like the atomic project at Los Alamos. If you've read "Supremacy" by Parmy Olson, you'll see a familiar pattern: a group of researchers driven by the belief that they are "saving the world."
The difference? This time, it's not governments racing to build the next atomic bomb; it's corporations. And if history has taught us anything about the internet, it's that money matters more than anything else.
Tech giants are striving for full control. Imagine a world where only the U.S. had nuclear weapons. Would history have played out the same way? Now apply that thought to AI.
The first prototypes of any technology are always expensive and resource-intensive. But eventually, others catch up and make it cheaper and more efficient. We've seen this happen with space travel (SpaceX, ISRO) and now with AI: DeepSeek is proving that open, locally runnable models can challenge industry giants. And this is just the beginning.
DeepSeek's Checkmate
If you haven't been in a coma, you've probably heard of DeepSeek by now. But what wiped out $1 trillion from the U.S. stock market wasn't just that it hit #1 on the Apple App Store.
It was something far bigger:
DeepSeek R1 became the first open-source reasoning model to rival the proprietary frontier.
We've seen the U.S. ban TikTok and label it a "Chinese spy app," but DeepSeek is different. Don't trust it? Run it locally.
Still worried about it being a Chinese spy tool? Read the research paper and train your own model.
The real shocker? Companies like OpenAI, Google, and Microsoft have spent billions on AI development, but DeepSeek's success suggests that state-of-the-art AI can be trained for under $10 million in compute, without NVIDIA's most expensive chips. That changes everything.
What's Next?
OpenAI has already accused DeepSeek of using its models' outputs to train DeepSeek's own AI. That drama is just beginning. Expect the U.S. to try everything to discredit DeepSeek's achievement in the coming weeks.
But here's the bigger picture:
The open-source AI community is thrilled, with notable figures like Yann LeCun from Meta and organizations such as the Linux Foundation praising the shift toward decentralized AI. Many developers and independent researchers have taken to forums and social media, celebrating the new wave of AI democratization and the potential it brings for global collaboration. We're about to see more efficient, cheaper, and accessible models in the coming months; mark my words.
Just recently, Alibaba dropped another powerful open-source LLM:
Qwen 2.5 Max
And Moonshot AI released:
Kimi k1.5
"There will be another DeepSeek by the end of this month, and many more this year in the U.S. and worldwide,"
- Jim Zemlin, Executive Director of the Linux Foundation
This is the new normal. Open-source AI isn't going anywhere. And honestly? That's how it should be.
The Revolution Isn't Coming… It's Here.
The rise of local LLMs is inevitable. Governments and corporations may try to control AI, but the open-source community has already proved that AI belongs to everyone.
Another Side of the Debate
Wars and AI
In "Supremacy" by Parmy Olson, the DeepMind team once feared that Google might use their AI technology for the U.S. military. At the time, there was a company-wide protest, and the project was eventually shut down.
Fast forward to today: amid the Israel-Gaza conflict, reports suggest that Google has provided AI assistance to the Israeli military. Meanwhile, AI is also being used in the Ukraine-Russia conflict, with both sides leveraging AI-driven strategies.
The reality is clear: there is no stopping AI from being used in warfare. We are already living in that future.
Check this video for more on that.
This is yet another reason why technologies like these should never be controlled by a single nation or military power. Open-source AI ensures that knowledge and advancements are distributed, rather than being hoarded for geopolitical dominance.
Chapter 2: Why Open-Source AI Models Running on Your Hardware Are the Future
Privacy is the obvious reason to run AI models locally, but let's be real: billions of users willingly share their data on Instagram, YouTube, and TikTok every day. So it's safe to say most people don't actually care about privacy anymore.
But that's not the only reason local AI is the future. Let's explore some key factors.
1. The Cost Factor
According to this article, here are the most common use cases for ChatGPT:
- Homework assistance
- Personal advice & communication
- Travel & lifestyle
- Sexual content (failed attempts)
- Coding help
Apart from the last one (coding), do you really need a state-of-the-art reasoning model for these tasks? Absolutely not.
Instead of paying for expensive AI subscriptions, you'd be better off running smaller models locally with tools like Jan AI, completely free!
If your goal is just to get horny with AI, for god's sake, at least use a fine-tuned model built for that.
Save Money on AI Coding Assistants
For coding, DeepSeek R1 is now available to run locally, making it a solid alternative to GitHub Copilot or Cursor. That means you can cancel your subscriptions and still get top-tier AI assistance.
Check out this Dev.to article to set it up in Visual Studio Code, or, if you're feeling adventurous, build your own AI plugin in under 5 minutes!
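To make "run it locally" concrete, here's a minimal sketch of querying a local model over HTTP, assuming you use Ollama as the local runtime and have pulled a DeepSeek R1 distill (e.g. `ollama pull deepseek-r1:7b`). The endpoint and model tag are Ollama's defaults; adjust for your own setup:

```python
import json
import urllib.request

# Default Ollama chat endpoint; assumes `ollama serve` is running locally.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete reply instead of a token stream
    }

def ask_local_model(prompt: str, model: str = "deepseek-r1:7b") -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Example (needs a running Ollama server, so it is not executed here):
# print(ask_local_model("Explain recursion in one sentence."))
```

No API key, no rate limits, no data leaving your machine; swap the model tag for any model you've pulled.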
2. The Environmental Impact
By some estimates, a single ChatGPT query consumes 10-100x more energy than a locally run LLM response. That's because cloud-based AI models rely on massive compute clusters running 24/7.
Meanwhile, running a quantized 7B LLM on your laptop consumes just 20-50W, about the same as browsing the web.
To put it in perspective, here's a side-by-side comparison:
Energy Consumption
| Factor | Local LLMs | Cloud-Based LLMs |
|---|---|---|
| Compute Power | Runs on consumer hardware (low power) | Requires massive data centers |
| Efficiency | Optimized for single-user inference | Thousands of power-hungry GPUs |
| Scalability | Less scalable, but efficient | Highly scalable, but wastes idle compute |
| Carbon Footprint | Lower (if optimized) | Very high due to cloud operations |
Which One Is More Sustainable?
| Factor | Local LLMs (Edge AI) | Cloud-Based LLMs |
|---|---|---|
| Energy Efficiency | ✅ Lower power use | ❌ High due to data centers |
| Hardware Sustainability | ✅ Uses existing hardware | ❌ High GPU demand & e-waste |
| Data Privacy & Network | ✅ No internet needed | ❌ Heavy data transfer |
| Carbon Footprint | ✅ Lower | ❌ Higher (always-on servers) |
Read more on AI's environmental impact here.
So next time you use a cloud-based LLM just to role-play as a medieval knight or write a poem about your ex, maybe think twice. š
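That 10-100x figure is easy to sanity-check with back-of-envelope arithmetic. The inputs below are illustrative assumptions (a laptop drawing ~40W for ~30 seconds per response, versus a commonly cited rough estimate of ~3 Wh per cloud query), not measurements:

```python
# Back-of-envelope energy comparison; all inputs are illustrative assumptions.
LOCAL_POWER_W = 40        # laptop running a quantized 7B model
LOCAL_SECONDS = 30        # time to generate one response
CLOUD_WH_PER_QUERY = 3.0  # rough estimate often cited for one ChatGPT query

# Energy in watt-hours = watts * seconds / 3600
local_wh = LOCAL_POWER_W * LOCAL_SECONDS / 3600
ratio = CLOUD_WH_PER_QUERY / local_wh

print(f"Local response: {local_wh:.2f} Wh")        # 0.33 Wh
print(f"Cloud query:    {CLOUD_WH_PER_QUERY} Wh")  # 3.0 Wh
print(f"Cloud uses ~{ratio:.0f}x the energy of local, under these assumptions")
```

Tweak the assumptions and the ratio moves around, but the order-of-magnitude gap is hard to argue away.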
3. Speed & Latency
One of the biggest advantages of local AI models is speed. When you run an LLM on your own hardware, responses start immediately because:
- There's no internet latency: no data needs to travel back and forth to a remote server.
- You bypass API limits: no rate-limiting, no throttling.
- Your AI responds as fast as your hardware allows: no waiting on cloud server congestion.
Building small apps that run on consumer hardware instead of relying on cloud APIs is much faster and cheaper.
Final Thoughts
Between cost savings, privacy, and sustainability, running AI locally on your hardware isn't just a geeky alternative; it's the future.
I know it's exciting, but many will argue that hardware requirements are still a bottleneck, and I won't deny it. Not everyone can afford high-end GPUs to run these models offline. But hardware is evolving fast, and soon this will no longer be a limitation.
Take Apple's new Mac Mini M4: it packs an impressive AI-capable chip at a reasonable price. Meanwhile, NVIDIA is working on making its chips more efficient and accessible, knowing that the future of AI isn't just in data centers but in everyday devices.
Soon, AI won't just be something you access through a cloud API; it'll be embedded in basic laptops, phones, and even IoT devices. On the other side, LLMs are becoming more optimized for today's hardware, making it possible to run powerful AI without needing a supercomputer.
The DeepSeek story proves that the future isn't about a single dominant AI or an AGI (Artificial General Intelligence) that does everything. Instead, it's about specialized models working together, where you can pick and choose the best models for specific tasks and build applications on top of them.
We are entering the era of multi-model AI systems, where applications will combine multiple specialized AI models to create something far more powerful than a single, monolithic AI.
Beyond that, we're seeing:
- On-device AI acceleration: smartphones, gaming consoles, and edge devices are integrating dedicated NPUs (Neural Processing Units) to run AI locally with minimal power consumption.
- Open-source hardware initiatives: projects like RISC-V AI chips are working toward affordable, open AI processing units that break the dependency on proprietary silicon.
- Federated learning: an approach where AI models train across many decentralized devices, improving efficiency while keeping data private.
- Energy-efficient AI: the rise of low-power AI models designed to run on minimal hardware while achieving state-of-the-art performance.
- AI compression & quantization: techniques that significantly reduce model size and computational requirements, making high-quality AI accessible on everyday devices.
- Community-driven innovation: open-source developers constantly improving and sharing models, ensuring that no single corporation can monopolize AI.
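To see why compression and quantization matter so much for everyday devices, a quick size calculation helps. Model weights dominate the memory footprint, so roughly bytes = parameters × bits-per-weight / 8 (this sketch ignores the KV cache, activations, and other runtime overhead):

```python
# Approximate weight size: bytes = parameters * bits_per_weight / 8.
# Ignores runtime overhead (KV cache, activations, embeddings).

def model_size_gb(params: int, bits_per_weight: int) -> float:
    """Approximate in-memory size of model weights in gigabytes."""
    return params * bits_per_weight / 8 / 1e9

PARAMS_7B = 7_000_000_000

fp16_gb = model_size_gb(PARAMS_7B, 16)  # full half-precision weights
q4_gb = model_size_gb(PARAMS_7B, 4)     # 4-bit quantized weights

print(f"7B model @ fp16:  {fp16_gb:.1f} GB")  # 14.0 GB
print(f"7B model @ 4-bit: {q4_gb:.1f} GB")    # 3.5 GB
```

That's the difference between needing a dedicated GPU and fitting comfortably in an ordinary laptop's RAM, at a modest cost in output quality.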
Upcoming
In the next article, we'll explore:
- Where to get started with local AI models
- Ideas for your own open-source AI project using offline models
Isn't this exciting? What's your take on local AI? Do you see yourself experimenting with local models, or have any cool ideas in mind? Drop a comment below; I'm all ears!
If you want to discuss a project idea, need guidance, or just want to chat about AI, hit me up on Twitter @sarthology. Let's build something awesome together!
Top comments (24)
This is a great breakdown of how open-source AI is changing the game! The comparison to the atomic race really hits home: big tech is fighting for control, but DeepSeek and others are proving that innovation can happen outside of billion-dollar labs.
I'm especially interested in the local AI angle. The cost, speed, and energy savings make a strong case for running models on personal hardware instead of relying on cloud-based systems. Do you think we'll see a real shift toward decentralized AI, or will corporations find a way to keep control?
Well, that comes down to people like you and me in the end. If we start building apps around local LLM ecosystems, others will follow.
But here is why I think offline LLMs will win. Thinking about the future: do you think robots will work well if they rely too much on cloud computing? Or say one day you want to wander around a forest with an AI helping you identify local plant species; you'd want it to work offline, right?
The author has covered the ins and outs of open and private LLMs really well... much appreciated!
Special mention to Fireship - what a great video content channel about tech
Now coming to this DeepSeek and AI discussion... this is a great, positive change for the fear around the job sector. Why am I saying this? For the last 3 years AI was the top discussion like nothing else, and people started setting their own limitations; big businesses were making money from this fear. I'm not saying DeepSeek will never do such a thing in the future. Who knows; GitHub was bought by Microsoft, and someone will definitely come along in the future to take this private too...
But overall this is a good change... making people believe there is no end to their ideas. There will definitely be changes and we will need to adapt, not in fear, but by taking them as an upgrade to become better and make the world better.
Good wishes... :)
It is free if you go for Local models (No hidden cost as of now)
Give it a try.
Okay let me try
good luck
Now look, I don't really believe that DeepSeek cost less than $10 million USD to produce. But that's not important. The Chinese have tripped America at its own game by putting out a truly accessible model for everyone, thereby dropping a bomb on the stock market. I have a feeling it's only a matter of time before the bigger players open up their sources. Right now, Open(closed)AI looks more suspicious than the Chinese (espionage) DeepSeek ;)
True, what Elon Musk couldn't do, China will: making OpenAI truly open.
The cope from the USA is so real.
My favorite tweet on the levels of cope we're seeing.
Love the democratization of new tech!
I think, whether you like Chinese tech or not, it gives me options: when ChatGPT or Claude isn't available, now I can just use DeepSeek, Qwen, or Hailuo. Thanks, my Chinese spy friend; I've already been sending my data to China since 1969, coz I use Xiaomi lol, and my CCTV is also a V380 Pro, which is a Chinese brand 🤣
🤣🤣🤣
Trust me, most tech is one way or another made in China.
You're not even wrong, even a little bit 🤣
Yup, and we are doing just fine.
So there will be stealing either way, whether it's China, the USA, or your own government.
Exactly my thoughts 🤣
This article is 🔥! I really enjoyed the breakdown of DeepSeek's impact and the broader discussion of open-source AI vs proprietary models. The way you connected this to past tech revolutions, like the space race, really adds depth to the discussion, and DeepSeek's approach truly feels like a groundbreaking shift in AI.
Excited to see where this revolution leads! Thanks for sharing such an insightful piece.
Thanks, Alok.
Interesting article!
It is indeed cheaper for the world to host your own LLM.
Cool to read this, because today I wrote an article about self hosting a deepseek-r1 distill at home
Thanks
And great article mate, cheers!
@sarthology thanks for writing and sharing your thoughts.
What do you think happens to all $$ investment by hyperscalers and VCs in AI startups? I see the Dot-com bubble pattern here.
Well, I'm no financial expert, but I can tell you one thing: big tech may have thought that big money could act as a business barrier around their AI. That gamble isn't working anymore. If it's all actually a bubble, DeepSeek may have popped it.
Thanks! DeepSeek definitely challenges the myth of needing the best and most expensive compute for effective AI models.
Exactly. Work smart.
Damn! 🔥 Each and every thing was beautifully described, be it AI or geopolitics. Great work. I'll definitely give it a try. Thank you!
🙏