sample-cloud-spend
MCP for AWS Cost Explorer and CloudWatch logs
claude mcp add --transport stdio aws-samples-sample-cloud-spend-mcp-server -- docker run -i --rm -e AWS_PROFILE -e AWS_REGION -e BEDROCK_LOG_GROUP_NAME -e MCP_TRANSPORT aws-cost-explorer-mcp:latest
How to use
This MCP server exposes AWS Cost Explorer data to Claude Desktop via Anthropic's Model Context Protocol. It lets you query your AWS spend in natural language and receive structured insights such as daily costs, per-service breakdowns, and Bedrock usage data from CloudWatch logs. The server can run locally via Docker or be deployed remotely on EC2, and supports both standard MCP transport (stdio) for local usage and remote transport (SSE) for EC2 deployments.
Once running, you can connect Claude Desktop or a LangGraph-based client to the server and ask questions like:
- What was my total AWS spend yesterday?
- Which services contributed most to costs in the last 30 days?
- How much did I spend on EC2 in us-east-1 over the past week?
The server provides tools that map to Cost Explorer queries and Bedrock/log-based usage reports, enabling interactive, natural-language exploration of cloud spend data.
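Under the hood, a question like "which services contributed most to costs in the last 30 days?" boils down to a Cost Explorer GetCostAndUsage call. The sketch below shows the shape of such a request; the helper name `cost_and_usage_params` is illustrative and not taken from the repository:

```python
from datetime import date, timedelta

def cost_and_usage_params(days: int = 30) -> dict:
    """Build kwargs for a Cost Explorer get_cost_and_usage call,
    grouped by service over the last `days` days."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
        "GroupBy": [{"Type": "DIMENSION", "Key": "SERVICE"}],
    }

params = cost_and_usage_params(30)
# With AWS credentials configured, you would pass this to boto3:
#   boto3.client("ce").get_cost_and_usage(**params)
print(params["TimePeriod"])
```

The server's tools presumably issue requests of this form and summarize the per-service results for the client.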
How to install
Prerequisites:
- Python 3.12 (for local server execution if using Python/uv path) or Docker installed on your machine
- AWS credentials configured with Cost Explorer and CloudWatch permissions
- An active AWS account with access to Cost Explorer and Bedrock invocation logs (CloudWatch)
Installation steps (Docker-based deployment):
- Ensure Docker is installed and running on your machine.
- Pull and run the MCP server container (as described in the mcp_config). Example:
  docker run -i --rm \
    -e AWS_PROFILE=YOUR_AWS_PROFILE \
    -e AWS_REGION=us-east-1 \
    -e BEDROCK_LOG_GROUP_NAME=YOUR_BEDROCK_CW_LOG_GROUP \
    -e MCP_TRANSPORT=stdio \
    aws-cost-explorer-mcp:latest
- If you prefer to run locally with uv (Python/uv) instead of Docker, install uv and set up a virtual environment, then run the server script as shown in the repository's usage notes. The general steps are:
- Install uv: curl -LsSf https://astral.sh/uv/install.sh | sh (macOS/Linux) or the Windows equivalent
- Create and activate a Python venv and install dependencies from pyproject.toml
- Set environment variables (MCP_TRANSPORT, BEDROCK_LOG_GROUP_NAME, AWS_PROFILE, AWS_REGION)
- Run python server.py
- Configure AWS credentials by placing them in ~/.aws/credentials and ~/.aws/config, or use an appropriate IAM role/profile for your environment.
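The environment variables named in the steps above drive the server's behavior. As a rough sketch of how such a server might resolve them (the variable names come from this page; the defaults and the `load_settings` helper are assumptions, not the repository's actual code):

```python
import os

def load_settings() -> dict:
    """Resolve the environment variables described in the install steps."""
    return {
        # stdio for local Claude Desktop use, sse for remote EC2 deployments
        "transport": os.environ.get("MCP_TRANSPORT", "stdio"),
        "aws_profile": os.environ.get("AWS_PROFILE", "default"),
        "aws_region": os.environ.get("AWS_REGION", "us-east-1"),
        # CloudWatch Logs group holding Bedrock model-invocation logs
        "bedrock_log_group": os.environ.get("BEDROCK_LOG_GROUP_NAME", ""),
    }

os.environ["MCP_TRANSPORT"] = "sse"  # e.g. for an EC2 deployment
settings = load_settings()
print(settings["transport"])  # sse
```

Whichever deployment path you choose, make sure MCP_TRANSPORT matches how your client connects (stdio locally, sse remotely).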
Additional notes
- The MCP wire format uses JSON-RPC 2.0; do not send sensitive data over MCP without proper security measures. When running remotely (EC2), consider using HTTPS for transport where supported by your client configuration.
- If you deploy locally with Docker, ensure the AWS_PROFILE and AWS_REGION environment variables are correctly set so Cost Explorer and CloudWatch access works as expected.
- The Bedrock integration requires that you have a CloudWatch Logs group prepared for model invocation logs; set BEDROCK_LOG_GROUP_NAME accordingly.
- For Claude Desktop integration, you can configure either Docker-based or UV-based local deployments, or remote deployments using SSE transport; ensure MCP_TRANSPORT aligns with your setup.
- If you encounter permission issues, verify that your IAM policies include Cost Explorer read permissions (e.g. CostExplorerReadOnlyAccess) and CloudWatch Logs read access.
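The permission note above can be made concrete with a minimal identity policy. This is a sketch of the read-only actions such a server plausibly needs; the exact action list is an assumption, so consult the repository's docs for the authoritative policy:

```python
import json

# Minimal read-only policy sketch: Cost Explorer queries plus
# CloudWatch Logs access for Bedrock invocation logs.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CostExplorerRead",
            "Effect": "Allow",
            "Action": ["ce:GetCostAndUsage", "ce:GetDimensionValues"],
            "Resource": "*",
        },
        {
            "Sid": "BedrockLogRead",
            "Effect": "Allow",
            "Action": [
                "logs:DescribeLogGroups",
                "logs:FilterLogEvents",
                "logs:GetLogEvents",
            ],
            "Resource": "*",
        },
    ],
}

print(json.dumps(policy, indent=2))
```

Attach a policy like this to the IAM user or role behind AWS_PROFILE (scoping the logs statement to your Bedrock log group is tighter than "*").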
Related MCP Servers
Gitingest
MCP server for Gitingest
aws-cost-explorer
MCP server for understanding AWS spend
ytt
MCP server to fetch YouTube transcripts
pfsense
pfSense MCP Server enables security administrators to manage their pfSense firewalls using natural language through AI assistants like Claude Desktop. Simply ask "Show me blocked IPs" or "Run a PCI compliance check" instead of navigating complex interfaces. Supports REST/XML-RPC/SSH connections, and includes built-in compliance checks.
arch
Arch Linux MCP (Model Context Protocol)
cloudwatch-logs
MCP server from serkanh/cloudwatch-logs-mcp