
openapi

OpenAPI definitions, converters and LLM function calling schema composer.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio samchon-openapi npx -y @samchon/openapi

How to use

The @samchon/openapi MCP server dynamically transforms OpenAPI/Swagger documents into LLM function calling schemas that can be executed against your HTTP backend. It normalizes the different OpenAPI versions into an emended OpenAPI v3.1 representation, then exposes a consistent interface for generating type-safe function-calling schemas that can be invoked by OpenAI, Claude, Qwen, Llama, and other providers. Use this server to quickly expose your existing APIs to AI agents: it generates the function definitions, parameters, and validation scaffolding the LLM needs to construct correct requests to your backend. The tooling emphasizes type safety, automatic validation, and seamless MCP integration, so your backend can become AI-callable without writing bespoke adapters.

To use it, supply your OpenAPI document to the server, which will produce an application consisting of functions that mirror your API endpoints. You can then integrate these functions into your chosen LLM workflow by attaching them as function-call tools to the model's chat or generation interface. The server is designed to work across OpenAI, Claude, Qwen, and other providers, enabling consistent behavior for function calling regardless of provider.

Example workflows include loading a Swagger/OpenAPI document, converting it to the emended OpenAPI format, generating an LLM function calling schema, and then selecting a suitable function to invoke based on the LLM’s outputs. This enables AI-driven orchestration of HTTP backends with validation and type safety baked in.
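The mapping at the heart of this workflow can be sketched in simplified form. The snippet below is a toy illustration, not the library's actual implementation: the `toLlmFunction` helper, the `Operation` type, and the inline `createArticle` endpoint are all hypothetical, showing only how an operation from an (emended) OpenAPI document corresponds to a function-calling schema.

```typescript
// Simplified illustration (NOT the @samchon/openapi implementation):
// map one OpenAPI-style operation onto an LLM function-calling schema.

interface Operation {
  operationId: string;
  description?: string;
  requestBody?: { content: { "application/json": { schema: object } } };
}

interface LlmFunction {
  name: string;            // what the LLM calls
  description?: string;    // shown to the LLM for function selection
  parameters: object;      // JSON Schema the LLM fills in when calling
}

// Hypothetical helper: derive the function name and parameter schema
// from the operation, as a schema composer would.
function toLlmFunction(op: Operation): LlmFunction {
  return {
    name: op.operationId,
    description: op.description,
    parameters:
      op.requestBody?.content["application/json"].schema ??
      { type: "object", properties: {} },
  };
}

// A toy endpoint: POST /bbs/articles with a JSON request body.
const createArticle: Operation = {
  operationId: "createArticle",
  description: "Create a new BBS article",
  requestBody: {
    content: {
      "application/json": {
        schema: {
          type: "object",
          properties: { title: { type: "string" }, body: { type: "string" } },
          required: ["title", "body"],
        },
      },
    },
  },
};

console.log(JSON.stringify(toLlmFunction(createArticle), null, 2));
```

When the LLM later selects `createArticle` and emits arguments matching `parameters`, those arguments can be validated against the schema and turned into an HTTP request to the backend.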

How to install

Prerequisites:

  • Node.js (LTS version) and npm installed on your machine
  • Internet access to download the package

Install the MCP server (as a local package via NPX):

  1. Install Node.js and npm if not already installed.

  2. Run the MCP server using NPX (this will fetch and run @samchon/openapi):

     # Start the MCP server for openapi using npx
     npx -y @samchon/openapi

  3. If you want to pin a specific version, specify it in the command:

     npx -y @samchon/openapi@1.x.y

  4. Alternatively, install the package locally and run your own script that initializes the server as needed for your workflow:

     npm install @samchon/openapi
     # Then use your own node script to load and transform OpenAPI documents via the library

  5. Ensure your OpenAPI document is accessible (local file or URL) and ready for conversion.

Additional notes

Tips and common considerations:

  • The tool supports Swagger v2.0, OpenAPI v3.0, and OpenAPI v3.1, and emits an emended v3.1 format for consistent downstream usage.
  • The MCP integration enables you to treat the API as an AI-callable service within your multi-provider LLM setup.
  • If you encounter validation errors, verify that your OpenAPI document is well-formed and valid against the OpenAPI specifications; the emended format aims to reduce ambiguity but still relies on correct source semantics.
  • When integrating with an MCP workflow, you can reuse the generated function calling schemas across different LLM providers, simplifying cross-provider AI integration.
  • Environment variables and configuration options can be extended as needed in your deployment context; the default MCP config shown uses the package directly via NPX.
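For reference, the `claude mcp add` command above corresponds to a standard MCP stdio server entry. A minimal sketch of that configuration (the `samchon-openapi` server name mirrors the one used in the install command; rename it to suit your setup):

```json
{
  "mcpServers": {
    "samchon-openapi": {
      "command": "npx",
      "args": ["-y", "@samchon/openapi"]
    }
  }
}
```

Any environment variables your deployment needs can be added under an `env` key on that server entry.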
