DEV Community

Sam Der

Getting Started With Local LLMs Using AnythingLLM

In this tutorial, we'll use AnythingLLM to load a local model and ask it questions. AnythingLLM provides a desktop interface for sending queries to a variety of different models.

Note: At the time of writing, AnythingLLM recommends that user machines have at least 2 GB of RAM, a 2-core CPU, and 5 GB of storage.

Navigate to the homepage, download the desktop application for your device, and follow the installation instructions.

Selecting an LLM. This should appear upon first startup.

At this point, you can choose any supported model. The first option, AnythingLLM, bundles a collection of open-source models, including Llama 3.2 and Gemma, that are free to use and require no additional setup. The remaining options connect AnythingLLM to external, third-party providers such as OpenAI and HuggingFace through API keys or self-hosted endpoints. For this tutorial, we'll choose AnythingLLM's Gemma 3 1 billion parameter model.

After proceeding, the model will begin downloading in the background. In the meantime, you can adjust your settings and create a workspace. Workspaces are how AnythingLLM organizes your chats: each one can have its own chat threads, model, and custom settings. For example, it is entirely possible (and even encouraged!) to create a new workspace that uses Llama 3.2 with an LLM temperature of 0.1.
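To build intuition for what the temperature setting does: it controls how the model turns its raw token scores into probabilities before sampling. A low value like 0.1 makes output nearly deterministic, while a high value spreads probability across more tokens. The snippet below is a standalone illustration of the standard temperature-scaled softmax, not AnythingLLM's internals:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw model scores (logits) into probabilities.

    Lower temperature sharpens the distribution (more deterministic);
    higher temperature flattens it (more varied output).
    """
    scaled = [score / temperature for score in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # toy scores for three candidate tokens
print(softmax_with_temperature(logits, 0.1))  # top token dominates
print(softmax_with_temperature(logits, 2.0))  # probabilities spread out
```

At temperature 0.1 the highest-scoring token takes nearly all the probability mass, which is why low-temperature workspaces give consistent, repeatable answers.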

AnythingLLM homepage. Lots of customizations are available!

Once your model is downloaded, you can begin asking it questions:

Asking a basic question to Gemma

You can also attach files to the workspace and prompt the model about them for RAG-based capabilities. For example, I uploaded a Python file containing the code below and asked Gemma what it does:

def hello_world():
    return "Hello world"

if __name__ == "__main__":
    print(hello_world())

Uploading files to the workspace

Prompting Gemma about the above Python script

As expected, it is able to parse the code and correctly explain what it does.
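Under the hood, RAG works by splitting uploaded files into chunks, embedding them, and retrieving the chunks most similar to your question to include alongside the prompt. AnythingLLM handles all of this internally; the sketch below is only a toy illustration of the retrieval step, using bag-of-words vectors in place of real learned embeddings (the `embed` and `cosine` helpers are hypothetical, not part of AnythingLLM):

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": a bag-of-words count vector.
    # Real systems use dense vectors from an embedding model instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[word] * b[word] for word in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Chunks as might be extracted from an uploaded document
chunks = [
    "hello_world returns the string Hello world",
    "the script imports os for filesystem utilities",
    "a list comprehension builds squares of numbers",
]

query = "what does hello_world return"
best = max(chunks, key=lambda c: cosine(embed(query), embed(c)))
print(best)  # the chunk most relevant to the query
```

The retrieved chunk is then prepended to the model's context, which is how a small local model like Gemma can answer accurately about files it was never trained on.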
