cloud-native
High-Performance Kubernetes MCP Server | 210+ tools | AI LLM integrations | Prometheus, Helm, Grafana
claude mcp add --transport stdio mahmut-abi-cloud-native-mcp-server \
  docker run -d --name cloud-native-mcp-server -p 8080:8080 \
  -v ~/.kube:/root/.kube:ro \
  mahmutabi/cloud-native-mcp-server:latest --mode=sse --addr=0.0.0.0:8080
How to use
This MCP server provides a high-performance Model Context Protocol interface for Kubernetes and cloud-native infrastructure management. It exposes a comprehensive set of services (Kubernetes, Grafana, Prometheus, Kibana, Elasticsearch, Helm, Alertmanager, Jaeger, OpenTelemetry, Utilities) and a large collection of MCP tools (210+). The server supports multiple communication modes, including Server-Sent Events (SSE) and a streamable HTTP variant, and includes smart caching, authentication options, auditing, and input sanitization.

To start quickly, deploy the Docker image and expose port 8080; clients can then connect to the SSE endpoints, either aggregated or per service, such as /api/kubernetes/sse or /api/grafana/sse. Use the aggregate endpoint to fetch a union of all services, or call individual endpoints for fine-grained access.
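As a sketch of how a client might consume one of these SSE endpoints: the snippet below parses raw Server-Sent Events frames into records. The endpoint paths come from the description above, but the event names and payload contents in the sample are illustrative assumptions, not the server's documented event schema.

```python
# Minimal SSE frame parser a client could apply to the stream returned by
# /api/aggregate/sse or a per-service endpoint like /api/kubernetes/sse.
# The sample payload below is an assumption for illustration only.

def parse_sse_events(raw: str) -> list[dict]:
    """Split a raw SSE stream into {event, data} records (frames end on a blank line)."""
    events = []
    for block in raw.split("\n\n"):
        record = {"event": "message", "data": []}
        for line in block.splitlines():
            if line.startswith("event:"):
                record["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                record["data"].append(line[len("data:"):].strip())
        if record["data"]:
            # Multi-line data fields are joined with newlines, per the SSE spec.
            record["data"] = "\n".join(record["data"])
            events.append(record)
    return events

sample = (
    "event: tools\ndata: {\"service\": \"kubernetes\"}\n\n"
    "event: tools\ndata: {\"service\": \"grafana\"}\n\n"
)
for ev in parse_sse_events(sample):
    print(ev["event"], ev["data"])
```

A real client would read these frames incrementally from the HTTP response body rather than from a string.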
How to install
Prerequisites:
- Docker installed and running
- Optional: git and Go if you prefer building from source
Installation steps:
- Pull and run the Docker image:
  docker pull mahmutabi/cloud-native-mcp-server:latest
  docker run -d --name cloud-native-mcp-server -p 8080:8080 -v ~/.kube:/root/.kube:ro mahmutabi/cloud-native-mcp-server:latest --mode=sse --addr=0.0.0.0:8080
- Build from source (optional):
  git clone https://github.com/mahmut-Abi/cloud-native-mcp-server.git
  cd cloud-native-mcp-server
  make build
  ./cloud-native-mcp-server --mode=sse --addr=0.0.0.0:8080
- Verify startup: access http://localhost:8080/api/aggregate/sse or run a smoke test against the SSE endpoint.
Note: The server starts in SSE mode by default. To use the streamable HTTP transport instead, pass --mode=streamable-http.
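One way to script the startup verification above: probe the aggregate SSE endpoint and check that it answers with an event-stream content type. The URL matches the docker run example; everything else here is an assumption about how you might probe the server, not a documented health-check API.

```python
# Smoke-test sketch for verifying the server is up after deployment.
# Assumes the docker run example above (port 8080, SSE mode).
import urllib.request

def is_event_stream(content_type: str) -> bool:
    """SSE responses advertise text/event-stream (optional parameters may follow)."""
    return content_type.split(";")[0].strip().lower() == "text/event-stream"

def smoke_test(url: str = "http://localhost:8080/api/aggregate/sse") -> bool:
    """Return True if the endpoint responds 200 with an SSE content type."""
    req = urllib.request.Request(url, headers={"Accept": "text/event-stream"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status == 200 and is_event_stream(
            resp.headers.get("Content-Type", "")
        )

# Against a running server:
#   smoke_test()
```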
Additional notes
Tips and common considerations:
- The server exposes a rich set of endpoints under /api/* with SSE by default. Use /api/aggregate/sse for a combined view of all services.
- Authentication options include API Keys, Bearer tokens (JWT), and Basic Auth. Ensure you configure API keys or tokens in your client requests.
- If you run in Docker, map the Kubernetes config directory (~/.kube) into the container so the server can reach your cluster.
- For production, monitor performance, tune the LRU cache TTL, and consider enabling audit logging for tool calls.
- If you encounter port binding issues, ensure the host port (e.g., 8080) is free and that the container has necessary network access.
- The build and deployment guides in the repo provide detailed configuration recommendations for deployment scenarios and security hardening.
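For the authentication options listed above, a client needs to attach the right HTTP header per scheme. The sketch below builds headers for all three; the API-key header name (X-API-Key) is an assumption, so check the repo's configuration docs for the exact name your deployment expects.

```python
# Sketch: building request headers for the three auth schemes the server
# accepts (API key, Bearer/JWT, Basic). The X-API-Key header name is an
# ASSUMPTION, not confirmed by this server's docs.
import base64

def auth_headers(api_key=None, bearer_token=None, basic=None):
    """Build HTTP headers for API-key, Bearer (JWT), or Basic auth."""
    headers = {}
    if api_key:
        headers["X-API-Key"] = api_key  # assumed header name
    if bearer_token:
        headers["Authorization"] = f"Bearer {bearer_token}"
    if basic:  # basic = (user, password)
        creds = base64.b64encode(f"{basic[0]}:{basic[1]}".encode()).decode()
        headers["Authorization"] = f"Basic {creds}"
    return headers

print(auth_headers(bearer_token="<your-jwt-here>"))
```

Merge the returned dict into the headers of each request your MCP client sends to the server.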
Related MCP Servers
trpc-agent-go
trpc-agent-go is a powerful Go framework for building intelligent agent systems using large language models (LLMs) and tools.
station
Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.
tiger-cli
Tiger CLI is the command-line interface for Tiger Cloud. It includes an MCP server for helping coding agents write production-level Postgres code.
gopls
MCP server for golang project development: expands an AI code agent's ability boundary with semantic understanding and deterministic information for golang projects.
kubernetes
A Model Context Protocol (MCP) server for the Kubernetes API.
gcp-cost
💰 An MCP server that enables AI assistants to estimate Google Cloud costs, powered by Cloud Billing Catalog API and built with Genkit for Go