Jijun

Make GitHub Copilot work with any LLM model

copilot-proxy is a proxy server that forwards Copilot requests to any OpenAI API compatible LLM endpoint. You can find the proxy server and instructions here: https://github.com/jjleng/copilot-proxy. It has only been briefly tested, so bugs might exist.
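To make the idea concrete, here is a minimal sketch of what such a proxy does, not the actual copilot-proxy implementation: it accepts Copilot's OpenAI-style chat completion request, optionally swaps the model name, and forwards the request to whatever OpenAI API compatible endpoint you configure. The environment variable names are my own assumptions, and streaming responses are omitted for brevity.

```python
# Minimal sketch of a Copilot-to-OpenAI-compatible proxy (illustrative only).
import os
import requests
from flask import Flask, request

app = Flask(__name__)

# Assumed configuration; the real copilot-proxy settings may differ.
UPSTREAM_BASE = os.environ.get("UPSTREAM_BASE", "https://api.openai.com/v1")
UPSTREAM_KEY = os.environ.get("UPSTREAM_API_KEY", "")
MODEL_OVERRIDE = os.environ.get("MODEL_OVERRIDE")  # e.g. "llama3" or "deepseek-coder"

@app.route("/v1/chat/completions", methods=["POST"])
def chat_completions():
    # Copilot sends an OpenAI-style chat completion payload.
    payload = request.get_json(force=True)
    if MODEL_OVERRIDE:
        # Swap Copilot's requested model for the one you actually want to use.
        payload["model"] = MODEL_OVERRIDE
    upstream = requests.post(
        f"{UPSTREAM_BASE}/chat/completions",
        json=payload,
        headers={"Authorization": f"Bearer {UPSTREAM_KEY}"},
        timeout=120,
    )
    # Relay the upstream response back to the Copilot extension unchanged.
    return upstream.content, upstream.status_code, {"Content-Type": "application/json"}

if __name__ == "__main__":
    app.run(port=8080)
```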

My motivations for building the tool

  • I'm already familiar with and enjoy using the GitHub Copilot extension (yes, I know there are other awesome extensions, such as Continue).
  • Copilot does not always use the latest GPT models. It currently uses models like gpt-4-0125-preview, gpt-3.5-turbo, and others.
  • Transferring code from the editor to ChatGPT to use GPT-4o is inconvenient.
  • I'm interested in using alternative models such as Llama3, DeepSeek-Coder, StarCoder, and Sonnet 3.5.
  • I have subscriptions to both ChatGPT and Copilot but would like to cancel my Copilot subscription.