
Docker Can Run LLMs Locally. Wait, What!?

Pradumna Saraf on April 07, 2025

Using Docker to run Large Language Models (LLMs) locally? Yes, you heard that right. Docker is now much more than just running a container image. W...
Abbas Khan

How is it better than using LM Studio?

Pradumna Saraf

Performance-wise, they both wrap llama.cpp (so we should expect similar performance), but LM Studio is more mature, with support for multiple engines and more operating systems so far. The Model Runner will get there, but it will take some time.

Integration with the rest of the Docker tooling (Compose, but also the Docker Engine for pipelines) is coming soon, and it will give the Model Runner an advantage for developers by fitting neatly into their development lifecycle.
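For anyone curious what that looks like in practice, here's a minimal sketch of talking to a locally running model through the Model Runner's OpenAI-compatible API. The port, path, and model name are assumptions (they depend on how TCP access was enabled and which model you pulled), so check the current docs for your setup:

```python
# Minimal sketch, assuming Docker Model Runner's OpenAI-compatible endpoint
# has been exposed on the host (e.g. TCP access enabled on port 12434) and
# a model such as ai/smollm2 has already been pulled with `docker model pull`.
# The URL, path, and model name below are assumptions, not the only option.
import requests

resp = requests.post(
    "http://localhost:12434/engines/v1/chat/completions",
    json={
        "model": "ai/smollm2",
        "messages": [
            {"role": "user", "content": "Say hello from a local LLM"}
        ],
    },
    timeout=60,
)
resp.raise_for_status()

# The response follows the usual OpenAI chat-completions shape.
print(resp.json()["choices"][0]["message"]["content"])
```

Because the API is OpenAI-compatible, any existing client or SDK that lets you override the base URL should work against it the same way.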

rebase media

🚀

Pradumna Saraf

Let's go!

 
Christian Melgarejo

I would say this is true for a lightly managed machine, or a box that you will just use to expose LLMs, where installing Ollama or LM Studio and keeping them on the latest versions is not practical.

Ezo Saleh

Back in the day, Docker did the same thing with Kubernetes. You could just run K8s right there in Docker Desktop - it worked. Very handy for beginners.

I guess it's the same in this case too. 😄

Pradumna Saraf

100%. Thank you.

Mike Coleman

I'm not sure Ollama will give better perf since both Ollama and Docker Model Runner are running natively on the hardware.

Pradumna Saraf

Performance is very similar (of course). One of our Captains compared the two in a blog post; you can check it out: connect.hyland.com/t5/alfresco-blo....

Thanks for the comment!