mcp-template
Cookiecutter template for MCP servers with one-click Render.com deployment - Generate production-ready API integration servers in minutes
claude mcp add --transport stdio pietroperona-mcp-server-template python main.py
How to use
This MCP server template provides a Python bridge between Claude AI and external APIs using FastMCP. It exposes six ready-to-use tools that let Claude explore, query, create, update, and delete resources on any RESTful API. The server is designed to be deployed on Render or via Docker, and it includes optional authentication, rate limiting, and caching to help manage API usage.
To connect Claude to your API, point the mcpServers entry in Claude's MCP settings at the server's local or public URL, then use the built-in tools for common operations such as listing resources, fetching a resource by ID, and creating or updating records. The template also exposes an SSE endpoint at /sse for Claude Web integration, enabling real-time tool discovery in the browser.
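For a locally run server, a typical mcpServers entry in Claude Desktop's config looks like the following (server name, path, and env values are placeholders; adjust them to your generated project):

```json
{
  "mcpServers": {
    "my-api-server": {
      "command": "python",
      "args": ["/path/to/your-project-name/main.py"],
      "env": {
        "API_BASE_URL": "https://api.example.com",
        "API_KEY": "your-api-key"
      }
    }
  }
}
```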
How to install
Prerequisites:
- Python 3.11 or newer
- pip
- Optional: cookiecutter if you want to generate a new project from the template
Install and run the template locally:
- Install cookiecutter (optional if you’re cloning the template directly):
pip install cookiecutter
- Generate a new MCP server project from the template (optional):
cookiecutter https://github.com/pietroperona/mcp-server-template
- Navigate to your project directory (the generated folder usually contains a main.py and a requirements.txt):
cd your-project-name
- Create and activate a Python virtual environment (recommended):
python -m venv venv
source venv/bin/activate # on macOS/Linux
venv\Scripts\activate # on Windows
- Install dependencies:
pip install -r requirements.txt
- Run the MCP server:
python main.py
The server should start and be accessible at http://localhost:8000 (default). If you deploy to Render or Docker, follow the deployment steps in the README to expose the /sse endpoint for Claude Web integration.
Additional notes
- The template includes an SSE endpoint at /sse which Claude Web uses to list and access tools in real-time. Ensure this endpoint is publicly reachable when deploying.
- Environment variables are used for API credentials and config (e.g., API_BASE_URL, API_KEY, BEARER_TOKEN, CLIENT_ID, CLIENT_SECRET, USERNAME, PASSWORD). Populate a .env file or your deployment environment accordingly.
- The six built-in tools are designed to be generic; you can customize their behavior by modifying the tools under the tools/ directory and updating auth/config handling in core/config.py as needed.
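A generic "list resources" tool, for instance, usually boils down to composing an endpoint URL from a resource name and optional query parameters. A stdlib-only sketch (the helper name and parameters are hypothetical, not the template's API):

```python
from urllib.parse import urlencode


def build_list_url(base_url: str, resource: str, **params) -> str:
    """Compose a list-endpoint URL such as GET /users?page=2&limit=50.

    Hypothetical helper; in the template the real request logic lives
    in the modules under tools/. None-valued params are dropped.
    """
    url = f"{base_url.rstrip('/')}/{resource.strip('/')}"
    query = urlencode({k: v for k, v in params.items() if v is not None})
    return f"{url}?{query}" if query else url
```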
- For production deployments (Render or Docker), set appropriate rate limits and caching policies to avoid hitting upstream API limits.
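The rate-limiting idea can be pictured as a token bucket: each upstream call spends a token, and tokens refill at a fixed rate. The class below is a minimal sketch of that pattern, not the template's actual implementation:

```python
import time


class TokenBucket:
    """Minimal token-bucket rate limiter (illustrative sketch).

    Permits bursts of up to `capacity` calls, refilling at `rate`
    tokens per second; `allow()` returns False when the bucket is empty.
    """

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

In a real deployment you would check `allow()` before each upstream API call and return a friendly "rate limited" message to Claude when it fails.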
- If Claude reports event loop errors, verify that session management is correctly enabled and that the server process is running under a supported Python version (3.11 or newer).
Related MCP Servers
claude-talk-to-figma
A Model Context Protocol (MCP) that allows Claude Desktop and other AI tools (Claude Code, Cursor, Antigravity, etc.) to read, analyze, and modify Figma designs
mcp-chain-of-draft
Chain of Draft Server is a powerful AI-driven tool that helps developers make better decisions through systematic, iterative refinement of thoughts and designs. It integrates seamlessly with popular AI agents and provides a structured approach to reasoning, API design, architecture decisions, code reviews, and implementation planning.
vibe-check
Stop AI coding disasters before they cost you weeks. Real-time anti-pattern detection for vibe coders who love AI tools but need a safety net to avoid expensive overengineering traps.
substack-plus
The most advanced Substack MCP server. 12 tools, browser auth, rich text support. Not affiliated with Substack Inc.
cc-session-search
MCP server for searching and analyzing Claude Code conversation history
graphql-bridge
A bridge implementation connecting GraphQL APIs with the Model Context Protocol (MCP), enabling seamless integration between GraphQL services and MCP-compatible AI systems. This tool facilitates data exchange and API communication by translating GraphQL operations into MCP-compatible formats.