Jeongho Nam
I made OpenAPI and LLM schema definitions

Preface

OpenAPI conversion diagram

https://github.com/samchon/openapi

OpenAPI and LLM schema definitions for TypeScript.

@samchon/openapi is a collection of OpenAPI types for every version, and converters between them. Among the OpenAPI types, there is an "emended" OpenAPI v3.1 specification, which removes ambiguous and duplicated expressions for clarity. Every conversion is based on this emended OpenAPI v3.1 specification.

  1. Swagger v2.0
  2. OpenAPI v3.0
  3. OpenAPI v3.1
  4. OpenAPI v3.1 emended
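
Each of these versions has its own TypeScript namespace. As a minimal sketch (the document types below are the same ones used in the examples later in this article):

import { OpenApi, OpenApiV3, OpenApiV3_1, SwaggerV2 } from "@samchon/openapi";

// each namespace models one version of the specification
type SupportedDocument =
  | SwaggerV2.IDocument // Swagger v2.0
  | OpenApiV3.IDocument // OpenAPI v3.0
  | OpenApiV3_1.IDocument // OpenAPI v3.1
  | OpenApi.IDocument; // OpenAPI v3.1 emended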

@samchon/openapi also provides an LLM (Large Language Model) function calling application composer from the OpenAPI document, with multiple strategies. With the HttpLlm module, you can perform LLM function calling extremely easily, just by delivering the OpenAPI (Swagger) document.

LLM Function Calling and Structured Output

The LLM selects a proper function and fills its arguments.

Nowadays, most LLMs (Large Language Models) like OpenAI's support the "function calling" feature. "LLM function calling" means that the LLM automatically selects a proper function and fills its parameter values from the conversation with the user (usually chat text).

"Structured output" is another LLM feature: the LLM transforms its output into a structured data format like JSON, following a schema you supply.
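
To make the idea concrete, here is a minimal sketch of structured output with the OpenAI SDK, using a hand-written JSON schema; the schema, prompt, and field names are illustrative assumptions, not part of @samchon/openapi.

import OpenAI from "openai";

const main = async (): Promise<void> => {
  const client: OpenAI = new OpenAI({ apiKey: "<YOUR_OPENAI_API_KEY>" });

  // force the response to match a JSON schema (illustrative schema)
  const completion: OpenAI.ChatCompletion =
    await client.chat.completions.create({
      model: "gpt-4o",
      messages: [
        { role: "user", content: "Summarize today's weather in Seoul." },
      ],
      response_format: {
        type: "json_schema",
        json_schema: {
          name: "weather_summary",
          schema: {
            type: "object",
            properties: {
              city: { type: "string" },
              summary: { type: "string" },
            },
            required: ["city", "summary"],
            additionalProperties: false,
          },
          strict: true,
        },
      },
    });

  // the message content is JSON matching the supplied schema
  console.log(JSON.parse(completion.choices[0].message.content!));
};
main().catch(console.error);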

OpenAPI Definitions

OpenAPI versions

@samchon/openapi supports every version of the OpenAPI specification with detailed TypeScript types.

Also, @samchon/openapi provides an "emended OpenAPI v3.1 definition" which removes ambiguous and duplicated expressions for clarity. It emends the original OpenAPI v3.1 specification as listed below (see the sketch after the list for an illustration). You can compose the "emended OpenAPI v3.1 document" by calling the OpenApi.convert() function.

  • Operation
    • Merge OpenApiV3_1.IPathItem.parameters to OpenApi.IOperation.parameters
    • Resolve references of OpenApiV3_1.IOperation members
    • Escape references of OpenApiV3_1.IComponents.examples
  • JSON Schema
    • Decompose mixed type: OpenApiV3_1.IJsonSchema.IMixed
    • Resolve nullable property: OpenApiV3_1.IJsonSchema.__ISignificant.nullable
    • Array type utilizes only a single OpenApi.IJsonSchema.IArray.items
    • Tuple type utilizes only OpenApi.IJsonSchema.ITuple.prefixItems
    • Merge OpenApiV3_1.IJsonSchema.IAnyOf to OpenApi.IJsonSchema.IOneOf
    • Merge OpenApiV3_1.IJsonSchema.IRecursiveReference to OpenApi.IJsonSchema.IReference
    • Merge OpenApiV3_1.IJsonSchema.IAllOf to OpenApi.IJsonSchema.IObject
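
For a rough illustration of what such emendments mean in practice, here is a hand-written sketch of the "resolve nullable property" rule; the exact output shape is an assumption based on the rule above, not output copied from the library.

// OpenAPI v3.0 / v3.1 style: nullability as a flag on the schema itself
const before = {
  type: "string",
  nullable: true,
};

// emended OpenAPI v3.1 style: nullability expressed as a oneOf union
// (assumed shape, following the "resolve nullable property" rule)
const after = {
  oneOf: [{ type: "string" }, { type: "null" }],
};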

Conversion to another version's OpenAPI document is also based on the "emended OpenAPI v3.1 specification", as shown in the diagram above. You can do it through the OpenApi.downgrade() function. Therefore, if you want to convert a Swagger v2.0 document to an OpenAPI v3.0 document, you have to call two functions: OpenApi.convert() and then OpenApi.downgrade().
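
As a minimal sketch of that two-step conversion (assuming OpenApi.downgrade() accepts the target version as a string literal):

import { OpenApi, OpenApiV3, SwaggerV2 } from "@samchon/openapi";

// Swagger v2.0 -> emended OpenAPI v3.1 -> OpenAPI v3.0
const downgradeToV30 = (swagger: SwaggerV2.IDocument): OpenApiV3.IDocument => {
  const emended: OpenApi.IDocument = OpenApi.convert(swagger); // up to emended v3.1
  return OpenApi.downgrade(emended, "3.0"); // down to v3.0
};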

Finally, if you utilize the typia library with @samchon/openapi types, you can validate whether your OpenAPI document follows the standard specification or not. Just visit one of the playground links below and paste your OpenAPI document's URL address. This validation strategy would be superior to any other OpenAPI validator library.

import { OpenApi, OpenApiV3, OpenApiV3_1, SwaggerV2 } from "@samchon/openapi";
import typia from "typia";

const main = async (): Promise<void> => {
  // GET YOUR OPENAPI DOCUMENT
  const response: Response = await fetch(
    "https://raw.githubusercontent.com/samchon/openapi/master/examples/v3.0/openai.json"
  );
  const document: any = await response.json();

  // TYPE VALIDATION
  const result = typia.validate<
    | OpenApiV3_1.IDocument
    | OpenApiV3.IDocument
    | SwaggerV2.IDocument
  >(document);
  if (result.success === false) {
    console.error(result.errors);
    return;
  }

  // CONVERT TO EMENDED
  const emended: OpenApi.IDocument = OpenApi.convert(document);
  console.info(emended);
};
main().catch(console.error);

LLM Function Calling

OpenAPI conversion diagram

LLM function calling application from OpenAPI document.

@samchon/openapi provides an LLM (Large Language Model) function calling application composed from the "emended OpenAPI v3.1 document". Therefore, if you have an HTTP backend server and have succeeded in building an OpenAPI document, you can easily make an A.I. chatbot application.

In the A.I. chatbot, the LLM selects a proper function to call remotely from the conversation with the user, and fills the function's arguments automatically. Actually executing that function call through the HttpLlm.execute() function is the "LLM function calling".

Here is example code executing the LLM function call with @samchon/openapi.

import {
  HttpLlm,
  IChatGptSchema,
  IHttpLlmApplication,
  IHttpLlmFunction,
  OpenApi,
  OpenApiV3,
  OpenApiV3_1,
  SwaggerV2,
} from "@samchon/openapi";
import OpenAI from "openai";
import typia from "typia";

const main = async (): Promise<void> => {
  // Read swagger document and validate it
  const swagger:
    | SwaggerV2.IDocument
    | OpenApiV3.IDocument
    | OpenApiV3_1.IDocument = JSON.parse(
    await fetch(
      "https://github.com/samchon/shopping-backend/blob/master/packages/api/swagger.json",
    ).then((r) => r.text()), // fetch the raw text, then JSON.parse() it above
  );
  typia.assert(swagger); // recommended

  // convert to emended OpenAPI document,
  // and compose LLM function calling application
  const document: OpenApi.IDocument = OpenApi.convert(swagger);
  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
    model: "chatgpt",
    document,
  });

  // Let's imagine that LLM has selected a function to call
  const func: IHttpLlmFunction<"chatgpt"> | undefined =
    application.functions.find(
      // (f) => f.name === "llm_selected_function_name"
      (f) => f.path === "/shoppings/sellers/sale" && f.method === "post",
    );
  if (func === undefined) throw new Error("No matched function exists.");

  // Get arguments by ChatGPT function calling
  const client: OpenAI = new OpenAI({
    apiKey: "<YOUR_OPENAI_API_KEY>",
  });
  const completion: OpenAI.ChatCompletion =
    await client.chat.completions.create({
      model: "gpt-4o",
      messages: [
        {
          role: "system",
          content:
            "You are a helpful customer support assistant. Use the supplied tools to assist the user.",
        },
        {
          role: "user",
          content: "<DESCRIPTION ABOUT THE SALE>",
          // https://github.com/samchon/openapi/blob/master/examples/function-calling/prompts/microsoft-surface-pro-9.md
        },
      ],
      tools: [
        {
          type: "function",
          function: {
            name: func.name,
            description: func.description,
            parameters: func.parameters as Record<string, any>,
          },
        },
      ],
    });
  const toolCall: OpenAI.ChatCompletionMessageToolCall =
    completion.choices[0].message.tool_calls![0];

  // Actual execution by yourself
  const article = await HttpLlm.execute({
    connection: {
      host: "http://localhost:37001",
    },
    application,
    function: func,
    input: JSON.parse(toolCall.function.arguments),
  });
  console.log("article", article);
};
main().catch(console.error);

Next Episode: from TypeScript Type

import { ILlmApplication, ILlmFunction, ILlmSchema } from "@samchon/openapi";
import typia from "typia";
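// NOTE: BbsArticleController and IBbsArticle below are example types
// assumed to be defined in your own application code.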

// FUNCTION CALLING APPLICATION SCHEMA
const app: ILlmApplication<"chatgpt"> = typia.llm.application<
  BbsArticleController,
  "chatgpt"
>();
const func: ILlmFunction<"chatgpt"> | undefined = app.functions.find(
  (f) => f.name === "create",
);

console.log(app);
console.log(func);

// STRUCTURED OUTPUT
const params: ILlmSchema.IParameters<"chatgpt"> = typia.llm.parameters<
  IBbsArticle.ICreate,
  "chatgpt"
>();
console.log(params);

💻 Playground Link

Just by the TypeScript type.

You can also compose an LLM function calling application schema from a TypeScript class or interface type, and you can create a structured output schema from a native TypeScript type, too.

In the next article, I'll show you how to utilize typia's LLM function calling schema composer and how to integrate it with an A.I. chatbot application.
