openapi-mcp-proxy
An MCP server that provides tools for exploring large OpenAPI schemas
claude mcp add --transport stdio nyudenkov-openapi-mcp-proxy uvx nyudenkov-openapi-mcp-proxy
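If your client is configured through a JSON file rather than the CLI (as Claude Desktop is), the equivalent entry uses the standard mcpServers layout; a minimal sketch, assuming uvx is on your PATH:

{
  "mcpServers": {
    "nyudenkov-openapi-mcp-proxy": {
      "command": "uvx",
      "args": ["nyudenkov-openapi-mcp-proxy"]
    }
  }
}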
How to use
This MCP server provides tools to explore and manage large OpenAPI schemas without loading entire schemas into your LLM context. It lets you save multiple API configurations (with optional authentication headers), automatically caches downloaded schemas, and offers tools for listing, searching, and detailing endpoints and data models. Use the API management tools to add or remove API configurations, then use the exploration tools to inspect endpoints and data models, paginate through large result sets, and filter them to focus on relevant operations or model types. The server is designed to work with MCP-compatible clients, enabling efficient, structured interaction with OpenAPI definitions.
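To make the workflow concrete, here is what an exploration call looks like at the MCP protocol level. The tools/call envelope is standard MCP JSON-RPC; the tool name (search_endpoints) and the argument names (api, query, method, page, page_size) are illustrative assumptions, not this server's documented interface:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_endpoints",
    "arguments": {
      "api": "petstore",
      "query": "order",
      "method": "GET",
      "page": 1,
      "page_size": 50
    }
  }
}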
How to install
Prerequisites:
- Python 3.13+ installed on your system
- uv installed (see installation below)
- MCP-compatible client for testing (e.g., Claude Desktop, Claude Code CLI, or Cursor)
Installation steps:
- Clone the repository:
git clone https://github.com/nyudenkov/openapi-mcp-proxy.git
cd openapi-mcp-proxy
- Install uv (if not already installed):
# macOS/Linux (install script)
curl -LsSf https://astral.sh/uv/install.sh | sh
# or via pip
pip install uv
- Install dependencies and set up the server:
uv sync
- Verify installation by starting the server:
uv run python main.py
The server should start without errors, and you can connect using an MCP-compatible client.
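To run from the source checkout rather than the published package, point your MCP client at the repository; a minimal sketch, assuming your client uses the common mcpServers JSON layout and that the path passed to uv's --directory flag is your clone:

{
  "mcpServers": {
    "openapi-mcp-proxy": {
      "command": "uv",
      "args": ["--directory", "/path/to/openapi-mcp-proxy", "run", "python", "main.py"]
    }
  }
}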
Additional notes
Notes and tips:
- API configurations are saved in api_configs.json in the working directory; it stores each saved API's URL, optional description, and any authentication headers (an illustrative example appears after these notes).
- The server supports pagination for endpoints and models with a default page size of 50 items. Use filtering to narrow results by HTTP method, tags, authentication requirements, deprecation status, or model properties.
- If you encounter authentication issues, ensure headers are correctly configured in the API configuration (e.g., {"Authorization": "Bearer token"}).
- When testing with large OpenAPI schemas, rely on the schema caching to reduce repeated downloads and speed up subsequent queries.
- The server runs over stdio and is designed to work with MCP clients; ensure your client is configured to launch the server as a stdio process.
- The recommended runtime is Python 3.13+ with uv; if you switch environments, adjust the command accordingly in your MCP client configuration (mcp_config).
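For reference, here is a plausible api_configs.json entry combining the fields mentioned above (URL, optional description, optional authentication headers). This is a hedged sketch; the exact key names are an assumption, so check a file the server has actually written for the real layout:

{
  "petstore": {
    "url": "https://petstore3.swagger.io/api/v3/openapi.json",
    "description": "Swagger Petstore sample API",
    "headers": {
      "Authorization": "Bearer token"
    }
  }
}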
Related MCP Servers
mcpo
A simple, secure MCP-to-OpenAPI proxy server
roampal-core
Outcome-based memory for Claude Code and OpenCode
openapi-mcp-generator
A Python tool that automatically converts OpenAPI (Swagger, ETAPI) compatible specifications into fully functional Model Context Protocol (MCP) servers. Generates Docker-ready implementations with support for SSE/IO communication protocols, authentication, and comprehensive error handling. https://pypi.org/project/openapi-mcp-generator/
mcp-cyberbro
Using MCP is fun with Cyberbro!
the-mcp-company
TheMCPCompany: Creating General-purpose Agents with Task-specific Tools
txtai-assistant
Model Context Protocol (MCP) server implementation for semantic vector search and memory management using TxtAI. This server provides a robust API for storing, retrieving, and managing text-based memories with semantic vector search capabilities. Works with Claude and Cline as well.