Build Your Own Copilot: Adding Local Context into Your Conversation

Adding Local Context to your Copilot.

Pieces OS Client is a flexible database with a long list of APIs that you can use to build your own copilot, one that understands local context and combines it with on-device local LLMs to answer questions and assist developers in their coding activities. In previous articles, we covered how to create your own Pieces Copilot and use the features of the TypeScript SDK on NPM to add this functionality to your own applications and projects.

We have also provided SDKs in Python and Kotlin, and are working hard to bring you a Dart SDK, providing support across a multitude of languages and environments. Each of these articles contains examples of how you can create a Pieces Copilot that fits your needs.

Code Along

Following the previous article and the addition of the local Llama 2 (GPU/CPU) models, we want to show the power of using these models for the tasks where we previously used GPT-4 (or GPT-3.5). We also want to demonstrate what we can accomplish by adding context to get specific results back from our copilot—even in an offline environment.

Each article in this series corresponds to this repository, which contains a starter project for building your own copilot application in a Vanilla JavaScript environment. Use the repo to follow along with this article, or as a starting point for your own project.

Prerequisites

In order to understand this article, it’s best to first read the other two articles on building your own copilot, which cover working with local LLMs and sending conversation messages.

If you missed the two preceding articles, you can catch up here:

To follow along in your own project, be sure to download Pieces OS and the appropriate SDK for your language or environment.

Getting Started With Context

In our previous work, we created a few simple HTML elements to display information such as the copilot message that you attach as a query when you send a conversation message. Here we need a button to add files to our context, a textarea for user-supplied context (with a label for targeting it), and a container that will display the files selected as context. You will see these again later as we populate them:

<button id="add-files-as-context">Add file(s) to context</button>
<textarea id="context-input"></textarea>
<label for="context-input"></label>

<div id="context-files-added-container"></div>

We shouldn't need to revisit this file again; instead, we can move over to index.ts and start creating our RelevanceRequest. First, we’ll collect the file paths of the files selected as context, then we’ll send them over to CopilotStreamController.ts to craft the request for our conversation message with the copilot.

Capturing Relevant Files as Context

When the add-files-as-context button is pressed, we can use a Pieces OS Client API to open a file picker native to your operating system. This is a simple call on OSApi; note the empty filePickerInput: {} object that is passed in, which is required when calling this endpoint through the SDK.

This call returns a Promise that resolves once the user has selected their files, giving back the selection as an array of strings (the file paths), which we will use in the next step via .then():

// Open the native file picker; resolves with the selected file paths when it closes.
new OSApi().pickFiles({ filePickerInput: {} });

We can attach this action to the button above once we use document.getElementById("add-files-as-context") and ensure the button is present in the DOM:

const addFilesAsContext = document.getElementById("add-files-as-context");
if (!addFilesAsContext) throw new Error('expected id add-files-as-context');

// Add the onclick function to our add-context button, which will open up the file picker
// and then return an array of file paths that we can iterate over.
addFilesAsContext.onclick = () => {
    new Pieces.OSApi().pickFiles({ filePickerInput: {} }).then((files) => {
        files.forEach((file) => {

            ...
        })
    })
}

The file picker returns the selection as an array of strings, where each entry is the file path of one selected file. We go through each one, create a new child element inside the list of selected files displayed in the UI, and update the selectedContextFiles global variable that is read elsewhere:

// Retrieve the element to add files as context and validate its existence.
const addFilesAsContext = document.getElementById("add-files-as-context");
if (!addFilesAsContext) throw new Error('Element with ID "add-files-as-context" not found.');

// Set an onclick handler to open the file picker and process selected files.
addFilesAsContext.onclick = async () => {
    try {
        // Use OSApi to open a file picker and handle the selection.
        const files: string[] = await new OSApi().pickFiles({ filePickerInput: {} });

        files.forEach((file: string) => {
            // Find the container for file entries and validate its existence.
            const contextContainer = document.getElementById('context-files-added-container');
            if (!contextContainer) throw new Error('Element with ID "context-files-added-container" not found.');

            // Create a new paragraph for each file and append it to the container.
            const newFileEntry = document.createElement("p");
            newFileEntry.innerText = file;
            contextContainer.appendChild(newFileEntry);

            // Update the global list of selected files in CopilotStreamController.
            CopilotStreamController.selectedContextFiles.push(file);
        });
    } catch (error) {
        console.error('Error occurred:', error);
    }
};
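
Note that selectedContextFiles is accessed here as a static property on CopilotStreamController. If your controller from the earlier articles does not already declare it, a minimal sketch that matches the usage above (everything beyond the property itself is assumed) looks like this:

// CopilotStreamController.ts (sketch): a static array that holds the file paths
// the user has picked, to be sent as context with the next conversation message.
export class CopilotStreamController {
    public static selectedContextFiles: string[] = [];

    // ...the rest of the controller from the earlier articles...
}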

Creating Relevance and Adding User Context

Now that our selectedContextFiles global value has been updated, we can head over to CopilotStreamController.ts to use those values and formulate our relevance request. The Pieces.RelevanceRequest object is the first step in a process to gather, seed, and then attach relevance to the conversation message that you send.

Creating the RelevanceRequest

When we created the file picker functionality earlier using the OSApi.pickFiles() endpoint, we captured each of the file paths (as strings) for the files the user wants to add as context and stored them in the CopilotStreamController.selectedContextFiles variable, which we can now pass in alongside our query:

const relevanceInput: Pieces.RelevanceRequest = {
    qGPTRelevanceInput: {
        query,
        paths: CopilotStreamController.selectedContextFiles,
    }
}

// For the following steps we can also add in this error handling.
if (!(relevanceInput.qGPTRelevanceInput.query ?? ''))
    throw new Error('Your query is empty or it was not provided');

Remember that query is just the string value that comes from our userInput text area from the earlier articles.
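
If you need a refresher, a minimal sketch of capturing that value looks something like the following (the 'user-input' id is a placeholder; use whichever id your input element has from the earlier articles):

// Sketch only: read the user's question from the input element built in the earlier articles.
// The 'user-input' id is a placeholder and may differ in your project.
const userInput = document.getElementById('user-input') as HTMLTextAreaElement | null;
if (!userInput) throw new Error('expected a user input element');
const query = userInput.value;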

Connecting Your Application

In order to populate the qGPTRelevanceInput.seeds needed by the QGPTApi().relevance endpoint, you will need to connect your application to Pieces OS and effectively authenticate as a registered application.

Below the relevanceInput error handling from the previous step, we can add these two lines:

const application = await getApplication();
if (!application) throw new Error('You must have a registered application to use this. Is Pieces OS running?');

Now over in the index.ts file—or in your corresponding entrypoint.ts file—we need to define a new function for creating and getting our application value after communicating with Pieces OS. There are some notes on the different values below, but this is a copy-and-paste example of getting a generic application value for an open source project:

let application: Application;
export async function getApplication() {
    if (application) return application;

    // PlatformEnum corresponds to the current operating system this is being run on.
    // A one-line conditional selects the proper platform enum from the user agent.
    const platform: PlatformEnum = window.navigator.userAgent.toLowerCase().includes('linux')
        ? PlatformEnum.Linux
        : window.navigator.userAgent.toLowerCase().includes('win')
            ? PlatformEnum.Windows
            : PlatformEnum.Macos;

    // Creating the Application here and setting up the three primary parameters:
    //
    // name: uses the ApplicationNameEnum; there are other useful values like .Unknown
    // version: any string; it does not affect anything but can be useful
    // platform: the PlatformEnum value selected above
    let context: Context = await new ConnectorApi().connect({
        seededConnectorConnection: {
            application: {
                name: ApplicationNameEnum.OpenSource,
                version: '0.0.0',
                platform,
            }
        }
    });

    // Set our application equal to context.application, which is returned from the connect endpoint.
    application = context.application;

    return application;
}

The application has been created, and now we can add it to relevanceInput.qGPTRelevanceInput.seeds to get back our relevance.

Building the relevanceInput.qGPTRelevanceInput.seeds

When a relevance call is made via the API, it needs a pre-seeded object that contains the application, a type parameter that represents the kind of seed being used, and the userContextInput taken from our context input. If there is no userContextInput, we want to skip this step entirely, as it would add extra processing with no relevance supplied. Here is the object in full:

// Ensure that userContextInput is present before seeding.
if (userContextInput) {

    // Define the seeds here on the relevanceInput.
    relevanceInput.qGPTRelevanceInput.seeds = {
        iterable: [
            {
                // The type of relevance input that is being used.
                type: SeedTypeEnum.Asset,
                asset: {
                    // The application we created and registered.
                    application,
                    format: {
                        fragment: {
                            string: {
                                // The user-supplied context, passed in as a raw string value.
                                raw: userContextInput
                            }
                        }
                    }
                }
            }
        ]
    }
}
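
The userContextInput value used above is assumed to be whatever the user typed into the context-input textarea we added at the start of this article. A minimal sketch of reading it (the exact wiring into CopilotStreamController depends on your project) might look like this:

// Sketch only: read the raw text from the context textarea added earlier in this article.
// How this value reaches CopilotStreamController (argument, property, etc.) is up to you.
const contextInputElement = document.getElementById('context-input') as HTMLTextAreaElement | null;
const userContextInput: string | undefined = contextInputElement?.value || undefined;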

Beneath that, we can call the QGPTApi().relevance() endpoint, passing in the relevanceInput variable, and store its output:

const relevanceOutput = await new QGPTApi().relevance(relevanceInput);

Finally, we pass relevanceOutput.relevant into the relevant parameter on the Pieces.QGPTStreamInput object. You may recognize it from when we initially created our question in the first article of this series. With all of the framing configured throughout the rest of the project, the final call to ask the question becomes quite simple:

const input: Pieces.QGPTStreamInput = {
    question: {
        // pass in your query here as normal.
        query,

        // replace this empty iterable with the new relevance.
        // relevant: {iterable: []},
        relevant: relevanceOutput.relevant,
        model: CopilotStreamController.selectedModelId
    },
};

Seeing Relevance/Context in Action

With all the parts added, the context functionality gives you the ability to ask questions based on the information provided to the copilot. If you are following along, have the repo cloned, and are up and running with the copilot project, you can run your application and test this out.

Back in the browser with your copilot, you can attach any file and ask a specific question about information in that file to see the difference between requests. Try adding a JSON file to your context and asking the copilot questions about it, or try it out with the copilot example project.

Ready to Build Your Own Copilot?

Now you have a complete guide to building your own copilot, downloading and using local LLMs, and adding context to your copilot conversation messages. With these tools, you can build your own copilot application or add the functionality of Pieces OS Client into your own applications.

If you would like to get more involved, you can check out our open source repo or this project on GitHub to download the complete code.

Resources:

More articles are coming soon on how to use the SDKs and the other projects we are working on!
