go-openapi-exploration
MCP server from SagenKoder/go-openapi-exploration-mcp-server
claude mcp add --transport stdio sagenkoder-go-openapi-exploration-mcp-server -- docker run -i --rm -e OPENAPI_SPEC_URL=https://api.example.com/openapi.json ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
(Replace the OPENAPI_SPEC_URL value with the URL or path to your OpenAPI spec.)
How to use
This MCP server provides tools that help large language models explore and understand OpenAPI specifications. It supports multiple execution modes (stdio for Claude Desktop, an HTTP server, or an interactive CLI) and caches remote specs to speed up repeated access.
The server exposes a set of MCP tools for structured exploration of an API:
- list_categories: categories derived from path segments
- list_endpoints: endpoints, with optional category filtering
- show_endpoint: detailed endpoint information, including parameters and schemas
- get_spec_info: general API information
- show_schema: inspect specific schema components
To use it from Claude or another MCP client, configure the server command and, if necessary, environment variables such as OPENAPI_SPEC_URL pointing to your OpenAPI specification. These tools let you navigate an API programmatically without manually reading the raw spec.
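As a rough intuition for how category grouping by path segment can work, here is a minimal sketch (not the server's actual implementation): take the first segment of every path key in the spec and deduplicate. The mini spec below is a made-up fixture.

```shell
# Write a tiny fixture spec with three paths across two top-level segments.
cat > /tmp/mini-spec.json <<'EOF'
{"paths": {"/pet/findByStatus": {}, "/pet/{petId}": {}, "/store/order": {}}}
EOF
# Pull out each quoted path, keep its first segment, and deduplicate.
categories=$(grep -o '"/[^"]*"' /tmp/mini-spec.json | cut -d/ -f2 | sort -u | xargs)
echo "$categories"   # pet store
```

A real implementation would parse the JSON properly rather than grep it; this only illustrates the idea behind path-segment categories.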
How to install
Prerequisites:
- Docker (recommended) or a Go-enabled environment if you want to build or run binaries directly.
- Access to an OpenAPI spec URL or a local OpenAPI file.
Installation options:
- Using Docker (recommended):
- Pull and run the image in the desired mode (stdio for MCP clients, http for HTTP server, or interactive for CLI):
docker pull ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
docker run -i --rm \
  -e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
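A local spec can be used instead of a URL by bind-mounting it into the container and pointing OPENAPI_SPEC_URL at the mounted path. The in-container path /spec/openapi.json below is an arbitrary choice for this example, not something the image requires:

```shell
# Mount a local spec read-only and reference it via OPENAPI_SPEC_URL.
docker run -i --rm \
  -v "$PWD/openapi.json:/spec/openapi.json:ro" \
  -e OPENAPI_SPEC_URL=/spec/openapi.json \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
```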
- Download binaries:
- Open the releases page and download the appropriate binary for your platform:
- openapi-mcp-stdio-linux-amd64 (stdio mode)
- openapi-mcp-http-linux-amd64 (HTTP server mode)
- openapi-mcp-interactive-linux-amd64 (interactive CLI mode)
- Run the downloaded binary (renamed here to openapi-mcp-stdio for brevity) with the required environment variable: OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-stdio
- Build from source (Go):
- Prerequisites: Go installed
- Clone the repository:
  git clone https://github.com/SagenKoder/go-openapi-exploration-mcp-server.git
  cd go-openapi-exploration-mcp-server
- Build all modes: ./build.sh
- Or build a specific mode (stdio): go build -o openapi-mcp-stdio ./cmd/openapi-mcp-stdio
- Run a mode (example stdio): OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json ./openapi-mcp-stdio
Notes:
- If using Docker, you can also mount a local OpenAPI file and point OPENAPI_SPEC_URL to the mounted path.
- For caching, you can configure OPENAPI_CACHE_DIR to control where specs are cached.
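To persist the cache across container runs, the OPENAPI_CACHE_DIR variable mentioned above can be combined with a volume mount. The in-container path /cache and the host directory below are arbitrary example choices:

```shell
# Cache specs in a host-mounted directory so repeated runs reuse them.
docker run -i --rm \
  -e OPENAPI_SPEC_URL=https://petstore3.swagger.io/api/v3/openapi.json \
  -e OPENAPI_CACHE_DIR=/cache \
  -v "$HOME/.openapi-mcp-cache:/cache" \
  ghcr.io/sagenkoder/go-openapi-exploration-mcp-server:latest
```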
Additional notes
Tips and caveats:
- The server caches remote OpenAPI specs to speed up subsequent requests; ensure the container or environment has write access to the cache directory if you customize OPENAPI_CACHE_DIR.
- Supported execution modes (stdio, HTTP, interactive) are designed for different client types; choose the mode that best fits your workflow (e.g., stdio for Claude Desktop, HTTP for a local API you query, or interactive for a CLI session).
- The OPENAPI_SPEC_URL environment variable is required in all modes; it can be a URL or a path to a local OpenAPI file.
- Multi-architecture builds are supported; ensure you pull the correct image or binary for your platform (amd64 or arm64).
- When using Docker, you can pass the spec URL via environment variables or bind-mount a local spec file and point OPENAPI_SPEC_URL to the mounted path.
Related MCP Servers
trpc-agent-go
trpc-agent-go is a powerful Go framework for building intelligent agent systems using large language models (LLMs) and tools.
station
Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.
tiger-cli
Tiger CLI is the command-line interface for Tiger Cloud. It includes an MCP server for helping coding agents write production-level Postgres code.
gopls
MCP server for Go project development: expands an AI code agent's ability boundary with semantic understanding and deterministic information about Go projects.
kubernetes
A Model Context Protocol (MCP) server for the Kubernetes API.
gcp-cost
💰 An MCP server that enables AI assistants to estimate Google Cloud costs, powered by Cloud Billing Catalog API and built with Genkit for Go