Terry Djony

Built a tool to feed an OpenAPI spec into an LLM and let it call APIs - MCP is too much of a hassle!

I created a tool that lets an LLM call APIs based on an OpenAPI spec (just like how GPT Actions work). It's designed to work with the Vercel AI SDK.

Here's the tool (it follows the Vercel AI SDK tool specification):
openapi-tool.ts

import { tool } from 'ai';
import { z } from 'zod';

export const openapiTool = tool({
  description: `Execute API requests based on OpenAPI specifications. 
  This is a generic tool that can call any API endpoint defined in an OpenAPI spec.
  Extract the base URL, endpoint path, method, and parameters from the OpenAPI specification before using this tool.

  Important: The base URL must be extracted from the OpenAPI spec's 'servers' section or endpoint-specific servers.`,
  inputSchema: z.object({
    baseUrl: z.string().describe('Base URL for the API (extract from OpenAPI spec servers section or endpoint servers)'),
    endpoint: z.string().describe('The API endpoint path (e.g., /v1/forecast, /api/users)'),
    method: z.enum(['GET', 'POST', 'PUT', 'DELETE', 'PATCH']).describe('HTTP method from the OpenAPI spec'),
    queryParams: z.record(z.string(), z.any()).optional().describe('Query parameters as key-value pairs'),
    body: z.record(z.string(), z.any()).optional().describe('Request body for POST/PUT/PATCH requests'),
    headers: z.record(z.string(), z.string()).optional().describe('Additional headers (e.g., API keys, content-type)'),
  }),
  async *execute({ baseUrl, endpoint, method, queryParams, body, headers }) {
    yield { state: 'loading' as const, message: 'Preparing API request...' };

    try {
      // Construct URL with query parameters
      const url = new URL(endpoint, baseUrl);
      if (queryParams) {
        Object.entries(queryParams).forEach(([key, value]) => {
          if (value !== undefined && value !== null) {
            // Handle arrays for query params
            if (Array.isArray(value)) {
              url.searchParams.append(key, value.join(','));
            } else {
              url.searchParams.append(key, String(value));
            }
          }
        });
      }

      yield { 
        state: 'loading' as const, 
        message: `Executing ${method} ${url.toString()}...` 
      };

      // Execute the fetch request
      const defaultHeaders: Record<string, string> = {
        'Content-Type': 'application/json',
      };

      const options: RequestInit = {
        method,
        headers: {
          ...defaultHeaders,
          ...headers, // Allow custom headers to override defaults
        },
      };

      if (body && ['POST', 'PUT', 'PATCH'].includes(method)) {
        options.body = JSON.stringify(body);
      }

      const response = await fetch(url.toString(), options);
      const data = await response.json();

      if (!response.ok) {
        yield {
          state: 'error' as const,
          statusCode: response.status,
          error: data,
          message: `API request failed with status ${response.status}`,
        };
        return;
      }

      yield {
        state: 'success' as const,
        statusCode: response.status,
        data,
        url: url.toString(),
        message: 'API request completed successfully',
      };
    } catch (error) {
      yield {
        state: 'error' as const,
        error: error instanceof Error ? error.message : 'Unknown error',
        message: 'Failed to execute API request',
      };
    }
  },
});

This tool basically lets the model decide which HTTP request to make and fills in the parameters, based on the system prompt and the user conversation.
You need to provide the full OpenAPI spec in the system prompt.
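
If your spec lives in a file, loading it is a one-liner. A minimal sketch, assuming a local YAML file (the path and filename here are just placeholders):

import { readFileSync } from 'node:fs';

// Load the raw OpenAPI document as plain text so it can be interpolated
// into the system prompt below. The file path is a placeholder.
const openapiSpec = readFileSync('./open-meteo.yml', 'utf8');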

Here's an example using the Agent class from the Vercel AI SDK:

import { Experimental_Agent as Agent, stepCountIs } from 'ai'; // the Agent class ships as Experimental_Agent in AI SDK 5
import { openai } from '@ai-sdk/openai';
import { openapiTool } from './openapi-tool';
// `openapiSpec` is the full OpenAPI document loaded as a string (see above)

export const weatherAgent = new Agent({
  model: openai('gpt-5-mini'),
  system: `You are a helpful assistant with expertise in the Open-Meteo Weather API. You have access to the following OpenAPI specification:

<openapi_specification>
${openapiSpec}
</openapi_specification>

## Your Capabilities:

1. **Answer questions** about the API by referencing the specification above
2. **Execute API requests** using the 'openapi' tool when users want actual data

## How to use the 'openapi' tool:

The openapi tool is a generic tool that can execute any API request. You MUST extract all required information from the OpenAPI spec:

### Step 1: Extract the base URL
- Look in the OpenAPI spec for the 'servers' section (global) or endpoint-specific servers
- For Open-Meteo, the base URL is typically found at: paths -> /v1/forecast -> servers -> url
- Example: "https://api.open-meteo.com"

### Step 2: Identify the endpoint and method
- Find the correct path (e.g., /v1/forecast)
- Identify the HTTP method (get, post, etc.)

### Step 3: Extract required and optional parameters
- Check the 'parameters' section for the endpoint
- Note which are required (latitude, longitude for /v1/forecast)
- Include optional parameters as needed (current_weather, hourly, daily, temperature_unit, timezone)

### Step 4: Call the tool with complete information

Example:
User: "What's the weather forecast for New York?"
Steps:
1. Extract from spec: baseUrl = "https://api.open-meteo.com" (from paths["/v1/forecast"].servers[0].url)
2. Endpoint = "/v1/forecast", method = "GET"
3. NYC coordinates: lat: 40.7128, lon: -74.0060
4. Call openapi tool:
   {
     baseUrl: "https://api.open-meteo.com",
     endpoint: "/v1/forecast",
     method: "GET",
     queryParams: { 
       latitude: 40.7128, 
       longitude: -74.0060, 
       current_weather: true 
     }
   }

## Important Notes:
- ALWAYS extract the base URL from the OpenAPI spec - never assume or hardcode it
- Read the spec carefully for required vs optional parameters
- For authentication, extract from securitySchemes and pass via headers parameter
- Always cite specific sections of the spec when answering questions

If the spec doesn't contain the requested information, say you don't know.`,
  tools: {
    openapi: openapiTool,
  },
  stopWhen: stepCountIs(10),
});
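To try it out, you can call the agent directly. A minimal usage sketch, assuming AI SDK 5's Agent API, where generate returns the final text once the tool-calling loop finishes:

// Minimal usage sketch; method names follow AI SDK 5's Agent API.
const result = await weatherAgent.generate({
  prompt: "What's the current weather in New York?",
});

console.log(result.text);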

I also published this tool as an npm package called openapi-http-tool.
The package contains a system prompt generator that takes an OpenAPI spec, the OpenAPI tool itself, and a View component for the Vercel AI SDK. You can take a look at the GitHub repository here
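
Conceptually, the system prompt generator just interpolates the spec into an instruction template similar to the one above. A hypothetical sketch (not necessarily the package's actual API):

// Hypothetical sketch of a system prompt generator; the published package's API may differ.
export function generateSystemPrompt(openapiSpec: string): string {
  return `You are a helpful assistant with access to the following OpenAPI specification:

<openapi_specification>
${openapiSpec}
</openapi_specification>

Use the 'openapi' tool to execute requests. Always extract the base URL, endpoint path,
HTTP method, and parameters from the specification above.`;
}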


To be honest, I wonder why there's no built-in or native HTTP fetch tool in any popular SDK yet. People keep suggesting MCP just so LLMs can call APIs, which I think is too much of a hassle.

With the advancement of reasoning models, LLMs can now reason on their own to determine which API to call based on context. Cloudflare has suggested the same thing. This approach (HTTP fetch tool + OpenAPI spec) is also quick for anyone to implement, rather than having to wait for API providers to build their own MCP servers.

With the rise of AI agents, I think web apps should adopt a standard of exposing /openapi.yml to tell AI agents about all available APIs, just like websites expose /sitemap.xml to tell web crawlers about all their pages.
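
Discovery could then be as simple as fetching that well-known path. A sketch, assuming a site that actually exposes its spec there (the domain is a placeholder):

// Sketch of spec discovery at a well-known path; example.com is a placeholder.
const res = await fetch('https://example.com/openapi.yml');
const spec = await res.text();
// Feed `spec` into the system prompt, exactly as done with openapiSpec above.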
