
DeepSeek-R1 has been creating quite a buzz in the AI community. Developed by the Chinese AI company DeepSeek, this model is being compared to OpenAI'...
What are the minimum hardware requirements to run this?
As you can see on the Ollama website, you can run DeepSeek-R1 at different parameter sizes. You can find the details of the requirements here: ollama.com/library/deepseek-r1
You can run 1.5b, 7b, 8b, 14b, 32b, 70b, or 671b, and the hardware requirements obviously increase as you choose a bigger model. I used the 7b one in the tutorial above.
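If it helps, here is a minimal sketch of how you could talk to whichever size you pulled from Python. It assumes the official ollama Python client (pip install ollama) and that the tag was already pulled with Ollama; the model tag and prompt are only placeholders.

```python
# Minimal sketch: chat with a locally pulled DeepSeek-R1 tag via the Ollama Python client.
# Assumes `pip install ollama` and that the tag below was pulled first
# (e.g. `ollama pull deepseek-r1:7b`).
import ollama

# Swap the tag for 1.5b, 8b, 14b, 32b, or 70b depending on your hardware.
MODEL = "deepseek-r1:7b"

response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "In one sentence, what changes with the parameter count?"}],
)

# The reply text lives under message -> content.
print(response["message"]["content"])
```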
You literally need a tank for running 671b.
70b and smaller ones are possible at home. You can run them on CPU or on GPU, which is a lot faster. A good option would be an NVIDIA 3090; you can buy one second-hand for around 700 euros at the moment.
Maybe be clearer about that in the article. It is misleading not to say specifically which model you are running. "Running R1 locally" sounds like you are running their best, OpenAI-o1-equivalent model locally, which is not the case.
It is the same model, just with fewer parameters. Done. I updated the article to mention 7b.
Good one, it helped me a lot.
Thanks for the summary.
Well done. Simple and straight to the point. Thank you for contributing and helping those of us who really need guidance. I'm running an Intel(R) Core(TM) i7-8565U CPU @ 1.80GHz with 16.0 GB of RAM (15.8 GB usable), and I downloaded deepseek-r1:32b.
Wish me luck.
Is there a GUI for the local version?
Yeah, there is one! You can pair Ollama with Open WebUI, a graphical user interface (GUI) tool.
Just go to docs.openwebui.com
No idea, need to check.
MSTY.app is a great one and my favorite. LMStudio is nice as well.
What happened at Tiananmen Square?
youtube.com/watch?v=2Oq2k066A1w
The local one can actually answer this question; I just tested it.
To be fair... ask the other AIs what they think of Donald Trump.
How do I download it to a specific drive and not the C drive?
You can put it wherever you want: set the OLLAMA_MODELS environment variable to a folder on another drive before starting Ollama, and the models will be stored there.
Hi
Hello
My name is Eugénio Correia
Hello, Eugénio Correia
I asked who the current US president is, and the answer was Joe Biden.
Maybe that is why it is free.
Is there a way to get the API key when running it locally?
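As far as I know, there is no API key for a local install: the Ollama server just exposes an unauthenticated HTTP API on localhost:11434. Here is a minimal sketch of calling it from Python with requests, assuming the server is running and the tag has been pulled; the model tag and prompt are placeholders.

```python
# Minimal sketch: the local Ollama server needs no API key; it listens on localhost:11434.
# Assumes `ollama serve` (or the desktop app) is running and deepseek-r1:7b has been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:7b",   # any tag you have pulled locally
        "prompt": "Say hello in one short sentence.",
        "stream": False,             # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()

# The generated text is returned in the "response" field.
print(resp.json()["response"])
```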