k8m
A lightweight, cross-platform Mini Kubernetes AI Dashboard with support for large language models, agents, and MCP (including configurable operation permissions). It integrates multi-cluster management, intelligent analysis, and real-time anomaly detection, supports multiple architectures, and deploys as a single file, helping streamline cluster management and operations.
claude mcp add --transport stdio weibaohui-k8m ./k8m
How to use
k8m is an AI-driven, lightweight Kubernetes dashboard designed to simplify cluster management. It combines a multi-cluster MCP toolset with an embedded AI assistant, letting you interact with Kubernetes resources, run commands, view logs, and manage access across clusters. Built-in MCP-enabled capabilities include multi-cluster management, API gateway and OpenKruise support, CRD discovery, Helm integration, and access control wired into large-model workflows. Open the web UI on the default port to manage clusters, run kubectl-like operations, inspect resources, and trigger AI-assisted actions and explanations.
How to install
Prerequisites:
- A machine or VM with a supported OS (Linux, macOS, or Windows) and internet access
- A downloaded release of k8m or access to its binary build
- Optional: Docker for container-based deployment via docker-compose
Installation steps (binary release):
- Download the latest release from GitHub: https://github.com/weibaohui/k8m/releases
- Make the binary executable: chmod +x k8m
- Run the server: ./k8m
- Open your browser and navigate to http://127.0.0.1:3618 (default credentials: username: k8m, password: k8m)
Alternative: Docker compose (recommended for quick local trials):
- Use the provided docker-compose snippet:
  services:
    k8m:
      container_name: k8m
      image: registry.cn-hangzhou.aliyuncs.com/minik8m/k8m
      restart: always
      ports:
        - "3618:3618"
      environment:
        TZ: Asia/Shanghai
      volumes:
        - ./data:/app/data
- Start the service: docker-compose up -d
- Access the UI at http://localhost:3618 with the default credentials mentioned above.
Additional notes
- Default port: 3618. If you run behind a reverse proxy or in a different environment, adjust the port mapping accordingly.
- The server ships with MCP-enabled tooling to perform a wide range of Kubernetes operations through model-assisted workflows. You can log actions and leverage AI to translate YAML attributes, explain resource details, and run recommended commands.
- If you need debugging information about the AI integration, start the server with higher verbosity, e.g. ./k8m -v 6, to capture more detailed logs.
- For multi-cluster deployments, ensure kubeconfig is accessible and that the server has permissions to read the clusters you intend to manage. The UI supports auto-registration of clusters and heartbeat-based reconnection.
- If using Docker, ensure the volume mappings provide persistence for data under ./data to avoid losing configuration on container restarts.