openapi-swagger
Solve AI context window limits for API docs | Convert any Swagger/OpenAPI to searchable MCP server | AI-powered endpoint discovery & code generation | Works with Cursor, Claude, VS Code
claude mcp add --transport stdio salacoste-openapi-mcp-swagger python -m openapi_mcp_swagger
How to use
This MCP server converts a Swagger/OpenAPI document into a searchable, context-aware MCP server. Your AI agents and MCP-enabled tools can query API endpoints, schemas, and examples without loading the entire Swagger file into memory or into your chat context. Once the server is running, you can instruct MCP-enabled clients to search for endpoints, retrieve schema details, and generate example requests or client code from the API specification. Because answers are derived from the OpenAPI document itself, the server provides accurate type information, parameter details, and response schemas in a concise, AI-friendly format.
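To illustrate the idea of targeted endpoint discovery, here is a minimal sketch of keyword search over an OpenAPI document. This is a hypothetical illustration only; the function name and the tiny inline spec are assumptions, not the project's actual API or internals.

```python
import json

# A tiny inline OpenAPI 3.0 spec used purely for illustration.
SPEC = json.loads("""
{
  "openapi": "3.0.0",
  "paths": {
    "/users": {"get": {"summary": "List users"}},
    "/users/{id}": {"get": {"summary": "Get a user by id"}},
    "/orders": {"post": {"summary": "Create an order"}}
  }
}
""")

def search_endpoints(spec, keyword):
    """Return (METHOD, path, summary) tuples whose path or summary matches keyword."""
    keyword = keyword.lower()
    hits = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            summary = op.get("summary", "")
            if keyword in path.lower() or keyword in summary.lower():
                hits.append((method.upper(), path, summary))
    return hits

print(search_endpoints(SPEC, "user"))
```

Returning only the matching endpoints, rather than the whole spec, is what keeps the AI agent's context usage small.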
How to install
Prerequisites:
- Python 3.11 or newer
- Git
Step-by-step:
1. Clone the repository:
   git clone https://github.com/salacoste/openapi-mcp-swagger.git
   cd openapi-mcp-swagger
2. Create and activate a Python virtual environment (optional but recommended):
   python -m venv venv
   source venv/bin/activate   # Unix/macOS
   venv\Scripts\activate      # Windows
3. Install dependencies:
   pip install -r requirements.txt
4. Run the MCP server:
   python -m openapi_mcp_swagger
5. Provide your Swagger/OpenAPI JSON file as input to the server according to its documentation (the project typically accepts a swagger.json path or URL to load the OpenAPI spec).
6. Connect your MCP-compatible client (Cursor, Claude, VS Code, etc.) to the running server endpoint.
Tip: If the project uses a specific CLI flag or environment variable to pass the OpenAPI file location, consult the README or help output (python -m openapi_mcp_swagger --help) for exact usage.
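For clients that are configured through an mcpServers JSON file (such as Claude Desktop or Cursor), an entry along these lines registers the server. This is a sketch: the server name, working directory, and any spec-path argument are assumptions you should adjust per the project's README.

```json
{
  "mcpServers": {
    "openapi-swagger": {
      "command": "python",
      "args": ["-m", "openapi_mcp_swagger"],
      "cwd": "/path/to/openapi-mcp-swagger"
    }
  }
}
```

If the server takes the OpenAPI file location as a CLI flag or environment variable, add it under "args" or "env" in the same entry.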
Additional notes
- Ensure your OpenAPI document is valid JSON/YAML and reachable by the server at startup.
- If your API defines authentication schemes in the OpenAPI document, make sure your MCP client includes the appropriate auth headers when it calls the actual API, as declared by the spec.
- Large OpenAPI specs (10 MB+) are where this approach pays off most: AI agents issue targeted queries instead of loading the whole document, avoiding context-window overflows.
- If you modify the OpenAPI file, restart the MCP server to reload the specification.
- Check for MCP compatibility updates or plugin integrations in the repository for enhanced AI tool support.
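The first note above, validating the spec before startup, can be done with a quick pre-flight check. The helper below is a minimal sketch for JSON specs, not part of openapi-mcp-swagger itself (YAML specs would additionally need a YAML parser such as PyYAML).

```python
import json

def check_openapi_json(text):
    """Parse an OpenAPI/Swagger JSON document and confirm it declares a version.

    Hypothetical pre-flight helper; returns the declared version string,
    or raises ValueError if the document is invalid.
    """
    try:
        doc = json.loads(text)
    except json.JSONDecodeError as exc:
        raise ValueError(f"not valid JSON: {exc}") from exc
    # OpenAPI 3.x uses the "openapi" key; Swagger 2.0 uses "swagger".
    version = doc.get("openapi") or doc.get("swagger")
    if not version:
        raise ValueError("missing 'openapi' (3.x) or 'swagger' (2.0) version field")
    return version

print(check_openapi_json('{"openapi": "3.1.0", "paths": {}}'))
```

Running a check like this before launching the server gives a clearer error than a startup failure inside the MCP client.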
Related MCP Servers
Pare
Dev tools, optimized for agents. Structured, token-efficient MCP servers for git, test runners, npm, Docker, and more.
godot
GoPeak — The most comprehensive MCP server for Godot Engine. 95+ tools: scene management, GDScript LSP, DAP debugger, screenshot capture, input injection, ClassDB introspection, CC0 asset library. npx gopeak
kratos
🏛️ Memory System for AI Coding Tools - Never explain your codebase again. MCP server with perfect project isolation, 95.8% context accuracy, and the Four Pillars Framework.
spec-kit
MCP server enabling AI assistants to use GitHub's spec-kit methodology
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management
work-memory
Never lose context again - persistent memory management system for AI-powered workflows across multiple tools