ugurgunes95

How Can We Use DeepSeek R1 LLM On Our Local Machine?

  • In this article, I'm going to cover how to run the DeepSeek R1 large language model on your local machine.
  • We will use the Ollama command-line tool to download and interact with the model.
  • We will also use Anything LLM for a better user experience.

Prerequisites

  • Make sure that you have Ollama installed on your local machine.
  • Make sure that you have Anything LLM installed on your local machine.

Installation

Once you've downloaded both tools from the links listed above:

  • Install Ollama.
  • Install Anything LLM.
  • These steps are pretty straightforward and should be done in a few minutes.
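To confirm that the Ollama install actually left its background service running, you can probe its default local endpoint (http://localhost:11434, which answers requests with a short status message). Here is a minimal Python sketch; the helper name is my own, not part of Ollama:

```python
import urllib.request
import urllib.error

def ollama_is_running(url="http://localhost:11434", timeout=2.0):
    """Return True if an HTTP server answers at `url` (Ollama's default port).

    Helper name is hypothetical; Ollama's root endpoint simply replies
    "Ollama is running" when the service is up.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: the service is not reachable.
        return False
```

If this returns False, start the service manually with `ollama serve` before moving on.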

Downloading The Model

  • If you have installed Ollama correctly, you just need to open a terminal and run the following command:
  • ollama run deepseek-r1:7b
  • This will download the model and start it on your local machine.
  • Once the model is downloaded and started, you can chat with it directly in the terminal.
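The terminal chat is not the only way in: while the model is loaded, Ollama also serves a REST API on localhost:11434, so any program can send prompts to it. Below is a standard-library-only Python sketch; the `build_request` and `ask` helper names are mine, and it assumes deepseek-r1:7b has already been pulled as shown above:

```python
import json
import urllib.request

def build_request(prompt, model="deepseek-r1:7b"):
    """Encode a prompt as the JSON body Ollama's /api/generate endpoint expects."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")

def ask(prompt, host="http://localhost:11434"):
    """Send a prompt to a locally running Ollama server and return its reply."""
    req = urllib.request.Request(
        host + "/api/generate",
        data=build_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose
        # "response" field holds the complete generated answer.
        return json.loads(resp.read())["response"]
```

With the model running, `ask("Why is the sky blue?")` returns the whole answer as a single string because `stream` is set to False; set it to True if you want the answer token by token instead.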

Adding The Model To Anything LLM

  • As you can see, you can already use the model from the Ollama command-line tool.
  • But if you want a better user experience, you can add the model to Anything LLM. In this section I'll explain how.
  • Open Anything LLM and click the New Workspace button.
  • This will create a new workspace that you can customize.
  • Then click the little gear icon to the right of the workspace name.
  • From that screen, open Chat Settings; at the end of this section you will see Workspace LLM Provider.
  • Once you have selected Ollama from that list, you will see your downloaded model listed there.
  • Select it as the default model for the workspace and click Update Workspace at the bottom of the page.
  • Since this is not a how-to guide for Anything LLM, I will leave you with this.
  • Now you can interact with your model in a more user-friendly way, as you can see below.

(Screenshot: chatting with DeepSeek R1 inside Anything LLM)

Conclusion

In this article, we learned how to run DeepSeek R1 locally with Ollama and how to add the downloaded model to Anything LLM for a better user experience. We also saw how to customize a workspace and select its default model.

