Chris Noring for Microsoft Azure

Get started with GitHub Copilot SDK

Did you know GitHub Copilot now has an SDK, and that you can leverage your existing license to build AI integrations into your app? No? Well, I hope I have your attention now.

Install

You need two pieces here to get started:

  • GitHub Copilot CLI
  • A supported runtime, which at present means either Node.js, .NET, Python or Go

Then you need to install the SDK for your chosen runtime. For Python, that looks like so:

pip install github-copilot-sdk

The parts

So what do you need to know to get started? There are three concepts:

  • Client. You need to create an instance of it, start it before use, and stop it when you're done with it.
  • Session. The session takes an object where you can set things like model, system prompt and more. The session is also what you talk to when you want to carry out a request.
  • Response. The response contains the LLM's answer.

Below is an example program using these three concepts. As you can see, we choose "gpt-4.1" as the model, but this can be changed. Note also how we pass the prompt to the function send_and_wait.

import asyncio
from copilot import CopilotClient

async def main():
    # Create the client and start it before sending any requests
    client = CopilotClient()
    await client.start()

    # A session is configured with the model (and optionally more)
    session = await client.create_session({"model": "gpt-4.1"})
    response = await session.send_and_wait({"prompt": "What is 2 + 2?"})

    # The response object carries the model's answer
    print(response.data.content)

    # Stop the client when you're done with it
    await client.stop()

asyncio.run(main())

Ok, now that we know what a simple program looks like, let's make something interesting, an FAQ responder.

Your first app

An FAQ on a web page is often a pretty boring read. A way to make it more interesting for the end user is to let them chat with the FAQ instead. Let's make that happen.

Here's the plan:

  • Define a static FAQ.
  • Add the FAQ as part of the prompt.
  • Make a request to the LLM and print out the response.

Let's build out the code little by little. First, let's define the FAQ information.

-1- FAQ information

# faq.py

faq = {
  "warranty": "Our products come with a 1-year warranty covering manufacturing defects. Please contact our support team for assistance.",
  "return_policy": "We offer a 30-day return policy for unused products in their original packaging. To initiate a return, please visit our returns page and follow the instructions.",     
  "shipping": "We offer free standard shipping on all orders over $50. Expedited shipping options are available at checkout for an additional fee.",
}

Next, let's add the call to the Copilot SDK.

-2- Adding the LLM call


import asyncio
from copilot import CopilotClient

# Note: the faq dict from step 1 lives at the top of this same file (faq.py)

def faq_to_string(faq: dict) -> str:
    # Turn the FAQ dict into "key: value" lines we can embed in the prompt
    return "\n".join([f"{key}: {value}" for key, value in faq.items()])

async def main(user_prompt: str = "Tell me about shipping"):
    client = CopilotClient()
    await client.start()

    # Ground the model by prepending the FAQ to the user's question
    prompt = f"Here's the FAQ, {faq_to_string(faq)}\n\nUser question: {user_prompt}\nAnswer:"

    session = await client.create_session({"model": "gpt-4.1"})
    response = await session.send_and_wait({"prompt": prompt})

    print(response.data.content)

    await client.stop()

if __name__ == "__main__":
    print("My first app using the GitHub Copilot SDK!")
    print("[LOG] Asking the model about shipping information...")
    asyncio.run(main("Tell me about shipping"))


Note how we concatenate the FAQ data with the user's prompt:

 prompt = f"Here's the FAQ, {faq_to_string(faq)}\n\nUser question: {user_prompt}\nAnswer:"   
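To make the string manipulation concrete, here's what the assembled prompt actually looks like for a trimmed-down FAQ. This is a standalone illustration, no SDK needed:

```python
# Standalone snippet: inspect the prompt before it goes to the model
faq = {
    "shipping": "We offer free standard shipping on all orders over $50.",
}

def faq_to_string(faq: dict) -> str:
    return "\n".join([f"{key}: {value}" for key, value in faq.items()])

user_prompt = "Tell me about shipping"
prompt = f"Here's the FAQ, {faq_to_string(faq)}\n\nUser question: {user_prompt}\nAnswer:"
print(prompt)
# Here's the FAQ, shipping: We offer free standard shipping on all orders over $50.
#
# User question: Tell me about shipping
# Answer:
```

The model sees the FAQ, the user's question, and a trailing "Answer:" cue, which nudges it to respond using the FAQ material it was just given.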

-3- Let's run it

Now run it:

uv run faq.py

You should see output like so:

My first app using the GitHub Copilot SDK!
[LOG] Asking the model about shipping information...
We offer free standard shipping on all orders over $50. Expedited shipping options are available at checkout for an additional fee.

What's next

Check out the official docs.
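A natural next step is turning the one-shot script into an interactive chat loop. Here's a minimal sketch; the names `chat_loop`, `build_prompt`, and `ask` are my own, not part of the SDK. The `ask` parameter stands in for whatever async function you write around session.send_and_wait and response.data.content from the example above:

```python
import asyncio

def build_prompt(faq_text: str, user_prompt: str) -> str:
    # Same prompt shape as in faq.py
    return f"Here's the FAQ, {faq_text}\n\nUser question: {user_prompt}\nAnswer:"

async def chat_loop(faq_text, ask, read_input=input, write=print):
    # Keep answering questions until the user types "quit".
    # `ask` is an async callable: prompt string in, answer text out.
    while True:
        question = read_input("You: ")
        if question.strip().lower() == "quit":
            break
        answer = await ask(build_prompt(faq_text, question))
        write(f"FAQ bot: {answer}")
```

Keeping the loop decoupled from the SDK like this also makes it easy to test with a stubbed `ask` before wiring in the real client.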
