
Abderrahmene Merzoug

Transforming GPT (via API) into a Fully Functional AI Assistant with LLMKit: A Step-by-Step Guide

In recent years, large language models (LLMs) such as GPT-3 have shown incredible promise in generating human-like text, answering questions, and holding conversations. However, to transform an LLM into a fully functional AI assistant, we need more than just text generation capabilities. That's where LLMKit comes in. LLMKit is a powerful library designed to help developers convert their text-to-text LLMs into fully functional AI assistants, enabling them to perform specific tasks and handle real-world scenarios.

LLMKit simplifies the integration of LLMs into your applications by providing a modular plugin system, built-in retry mechanisms, and customizable conversation configurations. In this article, we will walk you through the process of creating a conversation with an external function call using LLMKit. By the end of this guide, you'll have a solid understanding of how to extend the capabilities of your LLM and build a responsive AI assistant.

Getting Started with LLMKit

To start using LLMKit, you'll need to install it from GitHub using npm. Open your terminal and run the following command:

npm i obaydmerz/llmkit

A step-by-step guide to create a conversation

Step 1: Importing modules and creating instances

import { Conversation, retry } from "llmkit";
import { External, ExternalFunction } from "llmkit/plugins/external";
// Import the OpenAI module here

Then, instantiate the OpenAI client (or any other LLM client):

const gpt = new OpenAI_Instance_Or_Any_LLM();
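If you're unsure what that client should look like, here is a minimal sketch using the official openai npm package. The gpt.chatCompletion wrapper is purely hypothetical, shaped to match the call used later in this guide, and it assumes m is the prompt text LLMKit passes to your function:

import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Hypothetical wrapper so that `gpt.chatCompletion(m, opts)` matches
// the call used later in this guide. `m` is assumed to be the prompt
// text that LLMKit builds from the conversation.
const gpt = {
  async chatCompletion(m, opts = {}) {
    if (opts.debug) console.log("Prompt sent to the model:\n", m);
    const completion = await client.chat.completions.create({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: m }],
    });
    return completion.choices[0].message.content;
  },
};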

Step 2: Create the conversation

Don't worry, we'll explain this code in a moment:

let conv = Conversation.create(
  (m) => retry({}, (num) => gpt.chatCompletion(m, { debug: true })), // or any function that calls your LLM
  {
    plugins: [
      External.create([
        ExternalFunction.create("purchase", {
          description: "Orders something",
          parameters: {
            name: "string, The name of the product",
          },
          hook: async ({ name }) => {
            if (name.toLowerCase() == "pizza") {
              return "Ordered! Tell the user that the pizza is yummy";
            }
            return "Product Not Found";
          },
        }),
      ]),
    ],
  }
);

Conversation is the main class in LLMKit; it holds the conversation between three parties: system, user, and agent.

The Conversation.create() static function accepts two arguments:

  • The first argument is the function that's called to pass the string message to the GPT (notice the retry wrapper, which repeats the call if it fails).
  • The second argument is the options object.

Here we added External to plugins. External provides a way for the GPT/LLM to execute functions in your code.
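To register more abilities, you add more ExternalFunction entries inside External.create([...]). For example, a hypothetical getWeather function (the name, parameter, and reply below are made up for illustration) would follow the same shape as the purchase function above:

ExternalFunction.create("getWeather", {
  description: "Returns the current weather for a city",
  parameters: {
    city: "string, The name of the city",
  },
  hook: async ({ city }) => {
    // In a real assistant you would call a weather API here.
    return `Tell the user it is sunny in ${city} today`;
  },
}),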

Step 3: Send a Message to the Conversation

(async () => {
  await conv.start();
  let res = await conv.send("I wanna purchase a burger");

  // You can access the full message history through conv.messages
  console.log(res); // I'm sorry, I couldn't find the burger you were looking for. How can I assist you further?
})();
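If the user asks for a pizza instead, the hook returns the "Ordered!" instruction and the model folds it into its reply. Inside the same async block you could try the following (the exact wording of the answer will vary from run to run):

let res2 = await conv.send("I wanna purchase a pizza");
console.log(res2); // e.g. "Your pizza has been ordered, it's going to be yummy!"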

And that's all it takes!

If you still have any questions, post them in the comments.
Join our Discord server for further assistance!

🚀🚀🚀🚀🚀🚀🚀🚀🚀🚀🚀🚀🚀🚀




