DEV Community

DEV-AI

Posted on • Edited on
Unlock Local AI Coding Power: Run Deepseek-Coder in VSCode in 60 Seconds

As developers, we’re always chasing tools that boost productivity without compromising security. Enter Deepseek-Coder, a cutting-edge AI model designed to supercharge your coding workflow—locally. No cloud costs, no data leaks, and no network latency. Here’s how to set it up in VSCode in under a minute.


Why Deepseek-Coder?

Before diving into the steps, let’s address why this setup is a game-changer:

  • Data Privacy: Your code stays on your machine—no third-party servers.
  • Blazing Speed: Autocomplete suggestions run locally, eliminating lag.
  • Zero Costs: Free, open-source, and no subscription fees.
  • Simplicity: No complex infrastructure—just your IDE and a lightweight model.

If you’re not leveraging local AI models yet, you’re leaving productivity gains on the table. Let’s fix that.


Step 1: Install the Model with Ollama

Ollama is your go-to tool for running open-source AI models locally. It’s lightweight, developer-friendly, and works seamlessly across platforms.

  1. Install Ollama (if you haven’t already):
    • Download it from ollama.com for your OS (Windows, macOS, or Linux).
  2. Pull the Deepseek-Coder Model: Open your terminal and run:

```shell
ollama pull deepseek-coder:base
```

This command downloads the Deepseek-Coder base model, which is optimized for code completion and developer productivity. Deepseek-Coder ships in several sizes (roughly 1.3B, 6.7B, and 33B parameters), so the download may take a few minutes depending on the variant and your connection.
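If you want to confirm the pull worked before wiring up the editor, a quick guarded check does the trick. This sketch assumes the `ollama` CLI is on your PATH; the fallback messages are just hints:

```shell
# Sanity-check the download: confirm deepseek-coder now appears in the local
# model list. Assumes the "ollama" CLI is on PATH; otherwise print a hint.
if command -v ollama >/dev/null 2>&1; then
  ollama list | grep deepseek-coder || echo "deepseek-coder not found - rerun the pull"
else
  echo "ollama CLI not found - install it from ollama.com first"
fi
```

If the model shows up in `ollama list`, you're ready for the editor integration.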


Step 2: Integrate with VSCode via CodeGPT

CodeGPT is a powerful VSCode extension that connects local AI models directly to your editor.

  1. Install the CodeGPT Extension:

    • Open VSCode and navigate to Extensions (Ctrl+Shift+X).
    • Search for “CodeGPT” and install the Code GPT extension by codegpt.co; once installed, it can run deepseek-coder as a local LLM.
  2. Select Deepseek-Coder in CodeGPT: Open the CodeGPT panel, choose Ollama as the provider, pick deepseek-coder, and start chatting.
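If you prefer the terminal, you can sketch a quick check that the extension actually landed. This assumes the `code` CLI that ships with VSCode is on your PATH, and simply greps the installed-extension list for anything named CodeGPT:

```shell
# Optional: confirm the extension installed, from the terminal. Assumes the
# "code" CLI that ships with VSCode is on PATH; if not, print a hint instead.
if command -v code >/dev/null 2>&1; then
  code --list-extensions | grep -i codegpt || echo "CodeGPT not listed - install it from the Extensions view"
else
  echo "'code' CLI not on PATH - check the Extensions view inside VSCode instead"
fi
```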


Step 3: Enable Autocompletion Magic

Once everything is set up, CodeGPT will start using Deepseek-Coder for context-aware inline suggestions.

  1. Start Typing: Begin writing your code, and the model will generate relevant completions based on the context.
  2. Accept Suggestions: Press Tab to accept a suggestion instantly—no waiting for cloud-based responses.

Pro Tip: Explore the extension’s settings to adjust suggestion frequency, control hotkeys, or customize completion behavior.


Why This Setup Rocks

  1. Privacy First: Your proprietary code never leaves your machine.
  2. Instant Results: Local inference means zero network latency.
  3. Offline-Friendly: Code effectively even without an internet connection.
  4. Cost-Effective: Avoid expensive API costs for code completion and related tasks.

Troubleshooting Tips

  • Model Not Found? Ensure Ollama is running in the background. Start the server with:

```shell
ollama serve
```
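A quick way to tell whether the server is the problem is to probe its default port (11434) before restarting anything. This sketch assumes a stock Ollama setup and leans on `curl`'s exit code:

```shell
# Probe Ollama's default port (11434) to see whether the server is already
# running. curl exits non-zero when nothing answers, triggering the hint.
if curl -fsS http://localhost:11434/ >/dev/null 2>&1; then
  echo "Ollama server is up"
else
  echo "Ollama server not reachable - run: ollama serve"
fi
```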
  • Slow Performance?

    If your machine struggles with the base model, try a smaller variant (e.g., deepseek-coder:1.3b) for better efficiency on lighter hardware.

  • Customization:

    Fine-tune suggestions by modifying settings like temperature and max_tokens in the CodeGPT extension.
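The same tuning knobs are exposed on Ollama's own HTTP API if you want to experiment outside the editor: `temperature` and `num_predict` (Ollama's name for the max-token limit) go in the `options` object of an `/api/generate` request. This sketch only writes the request body to a file; the prompt is an arbitrary example:

```shell
# Sketch: the same knobs exist on Ollama's HTTP API. "temperature" and
# "num_predict" (Ollama's max-token setting) go in the "options" object of
# an /api/generate request. We only write the request body to a file here.
cat > /tmp/deepseek_req.json <<'EOF'
{
  "model": "deepseek-coder:base",
  "prompt": "Write a Python function that reverses a string",
  "stream": false,
  "options": { "temperature": 0.2, "num_predict": 128 }
}
EOF
echo "request body written to /tmp/deepseek_req.json"
```

Once the server is running, send it with `curl http://localhost:11434/api/generate -d @/tmp/deepseek_req.json`.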


Final Thoughts

Local AI models like Deepseek-Coder are redefining how developers approach productivity. By prioritizing privacy, speed, and cost-efficiency, this setup eliminates the trade-offs of cloud-based tools. With just a few steps, you can transform your IDE into a powerhouse of intelligent coding assistance.

Ready to 2X Your Productivity?

Tag your workflow with: #AI #Coding #DeveloperTools #DeepseekCoder #VSCodeHacks

Code smarter, stay secure, and let the machines handle the grunt work. 🚀



Top comments (2)

DEV-AI

You need to use the Code GPT plugin by codegpt.co, where you will find deepseek-coder installed as a local LLM.

RichUncleKhalid

This didn't work because:

  1. The extension doesn't have the 'CodeGPT: Set Model' option in the Command Palette.
  2. In the VSCode extension settings, the list of LLMs to choose from is limited to what is shown in the image below.

[Image: the available LLM options in the CodeGPT settings]

