multi-cloud-finops
Cross-cloud FinOps server that brings visibility and control to AWS, GCP, and Azure in one place
claude mcp add --transport stdio eazy-ops-multi-cloud-finops-mcp-server python -m mcp.server.fastmcp
How to use
This MCP server, built on FastMCP, acts as a multi-cloud FinOps assistant. It connects Gemini-powered assistants with cost insights across AWS, GCP, and Azure, letting natural-language queries generate cost breakdowns, run FinOps audits, and produce usage summaries, all from your local environment. The server supports a CLI- and FastAPI-compatible architecture, so it can be integrated with MCP clients and automation pipelines for budgeting, reporting, and optimization prompts.
You can ask how much was spent on a given cloud, request a cross-cloud audit, or obtain a consolidated budget status across all providers. To use it, start the MCP server and connect an MCP client (e.g., Claude Desktop, Amazon Q, or any other compatible client), then pose prompts and receive structured responses with actionable recommendations.
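To illustrate the kind of consolidated budget status such prompts produce, here is a minimal sketch in plain Python. The `BudgetStatus` shape, field names, and figures are assumptions for illustration, not the server's actual response schema:

```python
from dataclasses import dataclass

@dataclass
class BudgetStatus:
    """Hypothetical per-provider budget snapshot (not the server's real schema)."""
    provider: str
    spend_usd: float
    budget_usd: float

    @property
    def utilization(self) -> float:
        return self.spend_usd / self.budget_usd

def consolidate(statuses: list[BudgetStatus]) -> dict:
    """Roll per-provider figures up into a single cross-cloud summary."""
    total_spend = sum(s.spend_usd for s in statuses)
    total_budget = sum(s.budget_usd for s in statuses)
    return {
        "total_spend_usd": round(total_spend, 2),
        "total_budget_usd": round(total_budget, 2),
        "utilization": round(total_spend / total_budget, 3),
        "over_budget": [s.provider for s in statuses if s.utilization > 1.0],
    }

summary = consolidate([
    BudgetStatus("aws", 1250.0, 1000.0),
    BudgetStatus("gcp", 300.0, 500.0),
    BudgetStatus("azure", 450.0, 500.0),
])
print(summary)  # e.g. flags "aws" as over budget
```

The real server assembles figures from each provider's billing APIs; the roll-up step, however, is conceptually this simple merge.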
How to install
Prerequisites:
- Python 3.11+ installed on your machine
- Poetry for dependency management
- Access to AWS CLI, Google Cloud SDK (gcloud), and Azure CLI (az) if you intend to use real credentials
Installation steps:
- Clone the repository:
  git clone https://github.com/Eazy-Ops/multi-cloud-finops-mcp-server.git
  cd multi-cloud-finops-mcp-server
- Install Python dependencies with Poetry:
  poetry install
- Activate the Poetry-managed virtual environment:
  poetry shell
- Run the MCP server, either via the MCP entry point:
  poetry run python -m mcp.server.fastmcp
  or by direct Python invocation:
  python -m mcp.server.fastmcp
Note: If your runtime environment has a suitable entry point configured, you can also start the server's main module directly from the command line.
Additional notes
Environment and credentials:
- The server uses local SDKs/CLIs for credentials; ensure AWS, GCP, and Azure CLIs are installed and configured if you plan to query real data.
- AWS: run aws configure --profile <profile>
- GCP: gcloud auth application-default login or provide a service_account_key_path in your calls
- Azure: az login; for service principals, set AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET
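Before issuing queries, it can help to confirm all three CLIs are actually on your PATH. This is an illustrative helper, not something the server ships:

```python
import shutil

# Map each provider to the CLI binary it requires.
REQUIRED_CLIS = {"aws": "aws", "gcp": "gcloud", "azure": "az"}

def check_clis() -> dict[str, bool]:
    """Return provider -> True if its CLI binary is found on PATH."""
    return {provider: shutil.which(binary) is not None
            for provider, binary in REQUIRED_CLIS.items()}

status = check_clis()
for provider, ok in status.items():
    print(f"{provider}: {'found' if ok else 'missing -- install before querying'}")
```

A missing binary here means the corresponding provider's cost queries will fail regardless of credentials.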
Tips:
- Keep your credentials local; the server is designed to read them via your local SDKs/CLIs rather than transmitting them off your machine.
- Use the provided CLI prompts and natural-language prompts to generate cost analyses, audits, and budget summaries across providers.
- If you add new cloud accounts, ensure corresponding CLI tools are authenticated and configured before issuing queries.
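The authentication reminder above can be automated with a small helper. The three commands are each CLI's standard identity check; the wrapper itself is an illustrative sketch, not part of the server:

```python
import subprocess

# Standard "who am I" commands for each provider's CLI.
AUTH_CHECKS = {
    "aws": ["aws", "sts", "get-caller-identity"],
    "gcp": ["gcloud", "auth", "list", "--filter=status:ACTIVE"],
    "azure": ["az", "account", "show"],
}

def auth_status(run: bool = False) -> dict:
    """Return the identity-check command per provider; execute it when run=True."""
    results = {}
    for provider, cmd in AUTH_CHECKS.items():
        if not run:
            results[provider] = " ".join(cmd)  # dry run: just report the command
            continue
        try:
            proc = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
            results[provider] = proc.returncode == 0
        except FileNotFoundError:
            results[provider] = False  # CLI not installed
    return results

print(auth_status())  # dry run: lists the commands without executing them
```

Pass `run=True` on a machine with the CLIs installed to get a pass/fail map per provider.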
Related MCP Servers
aws-finops
An MCP (Model Context Protocol) server that brings powerful AWS FinOps capabilities directly into your AI assistant. Analyze cloud costs, audit for waste, and get budget insights using natural language, all while keeping your credentials secure on your local machine.
diagram
An MCP server that seamlessly creates infrastructure diagrams for AWS, Azure, GCP, Kubernetes and more
mcp_autogen_sse_stdio
This repository demonstrates how to use AutoGen to integrate local and remote MCP (Model Context Protocol) servers. It showcases a local math tool (math_server.py) using Stdio and a remote Apify tool (RAG Web Browser Actor) via SSE for tasks like arithmetic and web browsing.
ms-sentinel
MCP server for Microsoft Sentinel. Enables access to Sentinel logs, incidents, analytics, and Entra ID data via a modular, queryable interface. Strictly non-production. Designed for use with Claude and other LLMs.
gcp-storage
A Model Context Protocol (MCP) server that provides seamless integration with Google Cloud Storage, enabling AI assistants to perform file operations, manage buckets, and interact with GCS resources directly.
AIFoundry Connector-FabricGraphQL
MCP Client and Server apps to demo integration of Azure OpenAI-based AI agent with a Data Warehouse, exposed through GraphQL in Microsoft Fabric.