lumino
AI/ML-powered diagnostic engine for SRE observability on Konflux and OpenShift. It exposes a suite of specialized tools over the Model Context Protocol (MCP) to analyze logs, metrics, and traces, enabling automated root-cause analysis (RCA) and predictive analysis.
claude mcp add --transport stdio spre-sre-lumino-mcp-server --env PYTHONUNBUFFERED=1 -- python main.py
How to use
Lumino is an MCP server that exposes a rich set of AI-assisted tooling for SREs and DevOps teams to observe, analyze, and optimize Kubernetes, OpenShift, and Tekton environments. Once running, Lumino registers a suite of 37 specialized MCP tools that can monitor cluster health, query resources, analyze logs, perform root-cause analysis on failed pipelines, and run predictive simulations to forecast bottlenecks and potential issues. Users interact with Lumino through natural language prompts or MCP client integrations to invoke targeted capabilities such as RCA reports for failed runs, cross-cluster pipeline tracing, log anomaly detection, and what-if simulations to assess the impact of configuration changes before deploying them. Lumino can operate in local stdio mode for direct MCP client usage or adapt to Kubernetes deployments with HTTP streaming transport for scalable environments.
To use Lumino, install the MCP client of your choice, clone the Lumino MCP Server repository, and run the server locally or within your cluster. Then connect via your MCP client and begin issuing prompts like: “List all namespaces in my Kubernetes cluster” or “Generate an RCA for the latest pipeline failure in namespace ci-cd.” Lumino will route requests to the appropriate internal tools, perform analysis, and return structured results and actionable insights.
How to install
Prerequisites:
- Python 3.10+ installed on the host
- An MCP client (e.g., Claude Desktop, Claude Code CLI, Gemini CLI, or Cursor IDE)
- Optional: uv for dependency management and fast iteration
Installation steps:
- Clone the repository:
  git clone https://github.com/spre-sre/lumino-mcp-server.git
  cd lumino-mcp-server
- Install dependencies (uv recommended):
  uv sync
  If you prefer a traditional Python approach:
  python -m venv venv
  source venv/bin/activate   # On Windows: venv\Scripts\activate
  pip install -e .
- Run the server:
  uv run python main.py
  or simply:
  python main.py   (local stdio mode by default)
- Verify access with an MCP prompt or a simple query such as:
  "List all namespaces in my Kubernetes cluster"
Additional notes
Environment and transport options:
- For Kubernetes deployments, Lumino can auto-detect the environment and use HTTP streaming transport when KUBERNETES_NAMESPACE is set.
- In local development, Lumino runs in stdio mode suitable for direct MCP client integration.
- You can force unbuffered Python output with PYTHONUNBUFFERED=1 so log lines appear immediately instead of being held in the output buffer.
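The transport auto-detection described above can be sketched as a small helper. This is illustrative only, not Lumino's actual code; the function name and the transport identifiers (modeled on the common FastMCP transport names) are assumptions.

```python
import os

def choose_transport(env: dict[str, str]) -> str:
    """Pick an MCP transport: HTTP streaming inside Kubernetes, stdio locally.

    A Kubernetes deployment is detected by the presence of the
    KUBERNETES_NAMESPACE environment variable, as described above.
    """
    return "streamable-http" if env.get("KUBERNETES_NAMESPACE") else "stdio"

# Example: decide based on the current process environment.
transport = choose_transport(dict(os.environ))
```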
Common issues and tips:
- Ensure kubeconfig is accessible and has read permissions for namespaces, pods, and resources you intend to query.
- If running in a virtual environment, remember to activate it before starting the server.
- When integrating with Claude or other MCP clients, ensure the client is configured to connect to the Lumino MCP server and that the .mcp.json configuration (if used) points to the correct Python entrypoint.
- If you modify code or dependencies, reinstall with pip install -e . or rerun uv sync to refresh the environment.
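If you use a project-level .mcp.json with your MCP client, a minimal entry might look like the following. This is a sketch based on the installation steps above; the server name, command, and entrypoint path are assumptions you should adjust to your checkout.

```json
{
  "mcpServers": {
    "lumino": {
      "command": "python",
      "args": ["main.py"],
      "env": { "PYTHONUNBUFFERED": "1" }
    }
  }
}
```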