mcp_S3_server
This repository provides an implementation of a Model Context Protocol (MCP) server for AWS S3, enabling AI models, particularly Large Language Models (LLMs), to securely interact with S3 buckets. The server offers a standardized interface to list S3 buckets, list objects within buckets, and download file contents.
claude mcp add --transport stdio engrzulqarnain-mcp_s3_server python -m mcp_s3_server \
  --env AWS_ACCESS_KEY_ID="Your AWS Access Key ID" \
  --env AWS_DEFAULT_REGION="Your AWS region, e.g., us-east-1" \
  --env AWS_SECRET_ACCESS_KEY="Your AWS Secret Access Key"
How to use
This MCP server exposes three capabilities to AI models: listing S3 buckets, listing objects within a bucket, and downloading the contents of specific objects. It is designed to work within the Model Context Protocol ecosystem, so LLMs and AI applications can query bucket metadata, browse object listings, and fetch file data for downstream processing. To start, ensure AWS credentials are configured and accessible to the server process, then run the server so models can call its MCP endpoints.
Once running, integrate the server with your AI workflow by issuing MCP-style requests to list buckets, list a bucket's objects, or download file contents. This supports data discovery and retrieval within AI prompts, for tasks such as data analysis, document processing, and automated retrieval of PDFs or other files from S3 in model-driven workflows.
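Under the hood, the three capabilities map onto three S3 API calls. A minimal sketch of the equivalent boto3 operations (the helper names and the bucket/key values are illustrative, not part of this server; the helpers take a client object so they can be tried without live AWS access):

```python
# Sketch of the three S3 operations the server exposes, written against the
# standard boto3 client API.

def list_buckets(s3):
    """Return all bucket names visible to the configured credentials."""
    return [b["Name"] for b in s3.list_buckets()["Buckets"]]

def list_objects(s3, bucket, prefix=""):
    """Return object keys in `bucket`, optionally filtered by `prefix`."""
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in resp.get("Contents", [])]

def download_object(s3, bucket, key):
    """Return the raw bytes of one object."""
    return s3.get_object(Bucket=bucket, Key=key)["Body"].read()
```

With configured credentials, pass `boto3.client("s3")` as the `s3` argument.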
How to install
Prerequisites:
- Python 3.10 or higher
- AWS credentials configured (via ~/.aws/credentials, environment variables, or IAM roles)
- Optional but recommended: uv package manager for Python installations
Option A: Install as a Python package (recommended for MCP workflows)
- Install uv (optional, recommended):
curl -LsSf https://astral.sh/uv/install.sh | sh # Unix/macOS
or Windows: powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
- Install the MCP S3 server package from PyPI: pip install mcp-s3-server
- Run the MCP S3 server: python -m mcp_s3_server
Option B: Development installation (from source)
- Clone the repository:
  git clone https://github.com/ENGRZULQARNAIN/mcp_s3_server.git
  cd mcp_s3_server
- Install with uv (recommended) or via pip in editable mode:
uv sync
or
pip install -e .
- Run the MCP S3 server: python -m mcp_s3_server
Configuration:
- Ensure AWS credentials are accessible to the process (environment variables, credentials file, or IAM roles).
- The server will use the configured AWS account to list buckets, list objects, and fetch file contents.
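For example, credentials can be supplied as environment variables before launching the server (placeholder values shown; a ~/.aws/credentials file or an IAM role works equally well):

```shell
# Placeholder values -- substitute your own credentials, or rely on
# ~/.aws/credentials or an IAM role instead of exporting them here.
export AWS_ACCESS_KEY_ID="YOUR_ACCESS_KEY_ID"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_ACCESS_KEY"
export AWS_DEFAULT_REGION="us-east-1"
python -m mcp_s3_server
```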
Additional notes
Tips and considerations:
- If you encounter AWS permission errors, verify that the credentials carry the s3:ListAllMyBuckets, s3:ListBucket, and s3:GetObject IAM permissions, which cover the ListBuckets, ListObjectsV2, and GetObject API calls respectively.
- You may want to cap the number of objects returned by listing operations in your MCP deployment; the ListObjectsV2 API accepts a MaxKeys parameter for this purpose.
- For large buckets or large objects, consider streaming downloads or handling pagination as needed in your client application.
- Environment variables: ensure AWS_DEFAULT_REGION is set; avoid embedding credentials in code. Use roles or credentials files where possible.
- If running behind a firewall or in restricted environments, ensure outgoing access to AWS S3 endpoints is allowed.
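The pagination and streaming points above can be sketched with boto3's standard paginator and a chunked read. These are real boto3 APIs, but the helper names are illustrative and not part of this server:

```python
# Hedged sketch: iterate a large bucket page by page, and stream a large
# object to disk in chunks instead of buffering it all in memory.

def iter_keys(s3, bucket, prefix="", page_size=1000):
    """Yield every key under `prefix`, fetching one page at a time."""
    paginator = s3.get_paginator("list_objects_v2")
    pages = paginator.paginate(
        Bucket=bucket, Prefix=prefix, PaginationConfig={"PageSize": page_size}
    )
    for page in pages:
        for obj in page.get("Contents", []):
            yield obj["Key"]

def stream_to_file(s3, bucket, key, path, chunk_size=8 * 1024 * 1024):
    """Write an object to `path` in `chunk_size` pieces (default 8 MiB)."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"]
    with open(path, "wb") as f:
        for chunk in iter(lambda: body.read(chunk_size), b""):
            f.write(chunk)
```

With live AWS access, pass `boto3.client("s3")` for the `s3` parameter.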
Related MCP Servers
evo-ai
Evo AI is an open-source platform for creating and managing AI agents, enabling integration with different AI models and services.
dremio
Dremio MCP server
mcp-android-python
MCP Android agent - This project provides an *MCP (Model Context Protocol)* server for automating Android devices using uiautomator2. It's designed to be easily plugged into AI agents like GitHub Copilot Chat, Claude, or Open Interpreter to control Android devices through natural language.
okta
The Okta MCP Server is a groundbreaking tool built by the team at Fctr that enables AI models to interact directly with your Okta environment using the Model Context Protocol (MCP). Built specifically for IAM engineers, security teams, and Okta administrators, it implements the MCP specification to help them work with Okta entities.
Common_Chronicle
Common Chronicle turns messy context into structured, sourced timelines.
Unified-Tool-Graph
Instead of dumping 1000+ tools into a model’s prompt and expecting it to choose wisely, the Unified MCP Tool Graph equips your LLM with structure, clarity, and relevance. It fixes tool confusion, prevents infinite loops, and enables modular, intelligent agent workflows.