
Vijay Kumar Kodam

Originally published at vijay.eu

Running DeepSeek R1 locally

Here is a video of DeepSeek R1 running locally on my MacBook.
Now that everyone is amazed by DeepSeek R1's low resource utilization and open-weight release, I installed the 8B model locally.

With the help of Ollama, it is as simple as running a single command:

`ollama run deepseek-r1:8b`

You get your own private, local LLM running securely on your computer.
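
If the weights are not already on disk, `ollama run` downloads them on first use; you can also pull them ahead of time. A minimal sketch of the workflow using standard Ollama CLI subcommands:

```sh
# Download the 8B DeepSeek R1 weights ahead of time
# (optional; `ollama run` also pulls them on demand)
ollama pull deepseek-r1:8b

# Verify the model is available locally
ollama list

# Start an interactive chat session with the model
ollama run deepseek-r1:8b
```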

My prompt is "Generate a five line fairy tale about AI?". In the video, you can see how it thinks before generating the response. The thinking text appears between `<think>` and `</think>` tags.
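
Beyond the interactive CLI, Ollama also exposes a local HTTP API (on port 11434 by default), so you can send the same prompt programmatically. A minimal sketch, assuming `jq` is installed to extract the response field:

```sh
# Send the prompt to the local Ollama API and print only the response text.
# The reply still contains the <think>...</think> reasoning block inline.
curl -s http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:8b",
  "prompt": "Generate a five line fairy tale about AI?",
  "stream": false
}' | jq -r '.response'
```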

Watch the video to read the five-line fairy tale about AI.

Do you have any experience running DeepSeek R1?

Top comments (1)

James Ferguson

Thanks for sharing this!