Sushan
Posted on • Edited on

How to connect a local AI model(with Ollama) to VS Code. (Updated)

You can try out the latest Ollama models in VS Code for free.

We are using Ollama, a free, open-source application for running AI models locally on your own machine.

Installing and using Ollama

You can download Ollama from its website, ollama.com.

Once installed, you'll be able to access Ollama from your terminal.

  1. Open your terminal.
  2. Type ollama to verify that it's installed.

  3. Run a model of your choice (depending on your hardware) using the command:
    ollama run qwen3:4b
    This command pulls the model (if it isn't already downloaded) and runs it.

  4. Change the model name in the command to install your preferred model.
    To view all available models, go to ollama.com/search.

  5. If you want to run a high-end AI model, you can use Ollama Cloud for free.
    Pull cloud models like this: ollama pull qwen3-coder:480b-cloud.
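The install-and-run steps above can be sketched as a short shell session. This is just a sketch: qwen3:4b is the example tag from step 3, and the CLI guard is there so the snippet is safe to paste even before Ollama is installed.

```shell
# Example model tag from the steps above; swap in any tag from ollama.com/search
MODEL="qwen3:4b"

# Only invoke the CLI if Ollama is actually installed
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"   # download the model weights
  ollama list            # show every model installed locally
  ollama run "$MODEL"    # start an interactive chat (Ctrl+D to exit)
fi
```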

Integrating with VS Code

Make sure the Ollama server is running in the background.
Verification: open http://localhost:11434 in your browser; the page should say "Ollama is running".
If not: start it with the command ollama serve.
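The same check works from the terminal; this minimal sketch assumes the default port 11434, where the server's root endpoint answers with a short plain-text status:

```shell
# Query the default Ollama port; fall back to a message if nothing answers.
STATUS=$(curl -s http://localhost:11434/ 2>/dev/null || echo "not running")
echo "$STATUS"   # shows "Ollama is running" when the server is up
```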


  1. Open VS Code -> Copilot Chat sidebar.
  2. Open the model dropdown -> Manage Models -> Add Models (top right) -> select Ollama -> press Enter ("Ollama" is prefilled) -> type http://localhost:11434/ and press Enter -> select the models you want.


  3. Ollama will be added to the list like this.
  4. Make the models visible by clicking the eye 👁️ icon.
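To confirm which models VS Code will be able to see, you can also ask the Ollama HTTP API directly. A sketch assuming the default port; Ollama's /api/tags endpoint returns a JSON list of locally installed models:

```shell
# Fetch the list of installed models from the local Ollama server;
# fall back to an empty list if the server isn't reachable.
TAGS=$(curl -s http://localhost:11434/api/tags 2>/dev/null || echo '{"models":[]}')
echo "$TAGS"
```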


Demo:

  1. If you still don't see your models in the model dropdown in the chat bar, make sure you're in Ask mode.


The Ollama models will disappear from the dropdown once you turn the Ollama server off.

And if it still doesn't work:

Get a free Gemini API key from https://aistudio.google.com/api-keys.
Select the Google provider, add your API key, and you're done.
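As a quick sanity check that a Gemini key is valid, you can list the available models over the REST API. A sketch only: YOUR_KEY stands in for the key you created on aistudio.google.com.

```shell
# Base endpoint of the Gemini REST API's model listing.
GEMINI_URL="https://generativelanguage.googleapis.com/v1beta/models"

# Uncomment and substitute your real key to try it:
# curl -s "$GEMINI_URL?key=YOUR_KEY"
echo "$GEMINI_URL"
```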

Top comments (4)

Francisco

Looks super easy, but in Version 1.108.1 (Universal) these steps are not working for me (although Ollama is running on my PC).

Sushan

I've updated the article.

elias zerano

The models are installed and ollama serve is running, but I still see nothing in the dropdown?

Sushan

Check out the updated article.