focus
MCP server for FinOps cloud cost analysis using FOCUS billing data. Query AWS, Azure & GCP costs with AI assistants like Claude.
To register the server with Claude Code, wrap the Docker invocation in a `claude mcp add` command (note the `--` separating the server name from the command to run):

```shell
claude mcp add --transport stdio glassity-focus-mcp -- \
  docker run -i --rm \
  -v /path/to/your/focus/data:/data:ro \
  -e FOCUS_DATA_LOCATION=/data \
  -e FOCUS_VERSION=1.0 \
  glassity/focus-mcp:latest
```

Environment variables:
- FOCUS_VERSION: the FOCUS data version (e.g., 1.0, 1.1, 1.2)
- FOCUS_DATA_LOCATION: location of the loaded FOCUS data (default: /data inside the container)
How to use
This MCP server exposes a set of tools for analyzing FOCUS (FinOps Open Cost & Usage Specification) billing data. It provides a suite of MCP capabilities to inspect loaded data, browse and execute predefined analysis queries, and explore the FOCUS schema. You can use tools such as get_data_info to understand what data is loaded, list_use_cases and get_use_case to explore and run predefined analyses, and execute_query for custom SQL-based queries. The server also offers schema-related tools like list_columns and get_column_details to learn about the available FOCUS fields. This makes it easy to interact with complex cloud cost data through natural language prompts or scripted queries, backed by DuckDB-powered analytics and versioned FOCUS support (v1.0 to v1.2).
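For example, a custom query passed to execute_query might look like the following sketch. The table name focus_data is an assumption (use get_data_info to see what is actually loaded); ProviderName and BilledCost are standard FOCUS 1.0 columns:

```sql
-- Hypothetical execute_query input: total billed cost per provider
SELECT ProviderName, SUM(BilledCost) AS total_billed_cost
FROM focus_data
GROUP BY ProviderName
ORDER BY total_billed_cost DESC;
```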
How to install
Prerequisites:
- Docker installed on your machine or server
- Access to FOCUS data (Parquet format) either locally mounted or accessible via a path inside the container
Step-by-step:
1. Ensure Docker is running on your host.
2. Prepare your FOCUS data location (local path). For example, create a data directory and place your Parquet files there.
3. Run the MCP server via Docker using the following command, replacing the data path as needed:

```shell
docker run -i --rm \
  -v /path/to/your/focus/data:/data:ro \
  -e FOCUS_DATA_LOCATION=/data \
  -e FOCUS_VERSION=1.0 \
  glassity/focus-mcp:latest
```
4. The MCP server starts and listens for MCP client connections on stdio. Configure your MCP client (e.g., Claude Desktop) to launch the container and use the provided tools.
5. If you need to use S3 or other data sources, extend the docker run command with additional environment variables or a wrapper script suited to your data sourcing method.
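As a sketch of the S3 case, credentials can be forwarded from the host environment. The variable names follow standard AWS SDK conventions, and the bucket path is a placeholder; whether the image reads these for S3 access depends on your data sourcing setup:

```shell
docker run -i --rm \
  -e FOCUS_DATA_LOCATION=s3://your-bucket/focus/ \
  -e FOCUS_VERSION=1.1 \
  -e AWS_REGION="$AWS_REGION" \
  -e AWS_ACCESS_KEY_ID="$AWS_ACCESS_KEY_ID" \
  -e AWS_SECRET_ACCESS_KEY="$AWS_SECRET_ACCESS_KEY" \
  glassity/focus-mcp:latest
```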
Additional notes
Notes and tips:
- The container expects FOCUS data in Parquet with a location specified by FOCUS_DATA_LOCATION. Use a read-only volume for data to ensure data integrity.
- FOCUS_VERSION should reflect the data schema version you’re using (1.0, 1.1, 1.2).
- If you need to access data from S3 or a cloud provider, you can pass environment variables (e.g., AWS_REGION, AWS_PROFILE, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) to the container as needed for your data source.
- For Claude Desktop integration, ensure the Claude config JSON points to the running container and the proper data location is exposed to the container.
- Logs and runtime diagnostics will help troubleshoot data loading or query issues; check container logs for details.
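For Claude Desktop, the mcpServers entry in the config JSON might look like the following sketch (the server name is arbitrary, and the data path is yours to fill in):

```json
{
  "mcpServers": {
    "glassity-focus-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/path/to/your/focus/data:/data:ro",
        "-e", "FOCUS_DATA_LOCATION=/data",
        "-e", "FOCUS_VERSION=1.0",
        "glassity/focus-mcp:latest"
      ]
    }
  }
}
```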
Related MCP Servers
LibreChat
Enhanced ChatGPT Clone: Features Agents, MCP, DeepSeek, Anthropic, AWS, OpenAI, Responses API, Azure, Groq, o1, GPT-5, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, Code Interpreter, langchain, DALL-E-3, OpenAPI Actions, Functions, Secure Multi-User Auth, Presets, open-source for self-hosting. Active.
aws
A lightweight service that enables AI assistants to execute AWS CLI commands (in a safe containerized environment) through the Model Context Protocol (MCP). Bridges Claude, Cursor, and other MCP-aware AI tools with the AWS CLI for enhanced cloud infrastructure management.
aws-finops
An MCP (Model Context Protocol) server that brings powerful AWS FinOps capabilities directly into your AI assistant. Analyze cloud costs, audit for waste, and get budget insights using natural language, all while keeping your credentials secure on your local machine.
mcp-aws
A Model Context Protocol server implementation for operations on AWS resources
diagram
An MCP server that seamlessly creates infrastructure diagrams for AWS, Azure, GCP, Kubernetes and more
kusto
MCP server for Azure Data Explorer (Kusto), enabling AI agents to explore, query, and understand telemetry using KQL.