
0xkoji


How to Run Large Language Models Locally on a Windows Machine Using WSL and Ollama

Prerequisites

Install WSL

https://learn.microsoft.com/en-us/windows/wsl/install
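
If WSL is not installed yet, the guide above boils down to a single command on recent Windows 10/11 builds. A minimal sketch, run from an elevated PowerShell prompt (it installs WSL with Ubuntu as the default distribution):

wsl --install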

Install curl

sudo apt install curl
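If apt cannot find the package on a fresh WSL Ubuntu install, the package index is probably stale; refreshing it first usually fixes that:

sudo apt update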

Install Ollama via curl

curl https://ollama.ai/install.sh | sh
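To confirm the install script finished correctly, you can print the CLI version (the exact version string will differ on your machine):

ollama --version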

Run Ollama

In this case, we will try to run Mistral-7B.
If you want to try another model, you can pick one from the Ollama library:
https://ollama.ai/library

ollama serve
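As a quick sanity check, the server listens on port 11434 by default, so from another tab you can verify it is up:

curl http://localhost:11434
# should answer with "Ollama is running"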

Open another terminal tab and run the following command. It will pull the model (if it is not downloaded yet) and start an interactive prompt.

ollama run mistral
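Besides the interactive prompt, the server also exposes a REST API on the same port, which is handy for scripting. A minimal sketch, assuming ollama serve is still running in the other tab and the mistral model has finished downloading:

curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is the sky blue?",
  "stream": false
}'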

If everything works properly, you will be dropped into an interactive prompt.
My machine has an RTX 3070 GPU, so Ollama uses it automatically.
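
If you want to confirm the GPU is being used, nvidia-smi works inside WSL2 as long as the Windows NVIDIA driver is installed; run it while the model is generating and the ollama process should show up with VRAM allocated:

nvidia-smi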

Terminate Ollama

If you want to exit Ollama, you need to type the following.

/bye

Then press Ctrl + C in the terminal where you ran ollama serve.
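
If you also want to reclaim the disk space taken by a downloaded model, the CLI can list and remove models (the name here assumes the Mistral example above):

ollama list
ollama rm mistral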


Top comments (3)

Jason TC Chuang

No need for WSL; Ollama runs natively on Windows beginning with v0.1.27.
jasonchuang.substack.com/p/ollama-...

0xkoji

Yeah, but I think using WSL makes everything easier for devs.

Ivan Zakutnii

Good job! Thanks for the article. But I just can't control myself. Sorry...


