dockerized-template
A reusable Dockerized Python server template implementing the Model Context Protocol (MCP) with Streamable HTTP transport, built using the official MCP Python SDK (v1.8.0+) for easy integration with Large Language Models (LLMs).
```shell
claude mcp add --transport stdio zantis-dockerized-mcp-server-template -- docker run -i \
  --env PORT="8080" \
  --env LOG_LEVEL="INFO" \
  --env MCP_ENDPOINT="/mcp" \
  zantis-dockerized-mcp-server-template
```
How to use
This MCP server is a Python-based implementation of the Model Context Protocol (MCP) packaged to run inside a Docker container. It uses Streamable HTTP for real-time, stateless transport of MCP interactions, allowing LLMs to query data and invoke tools exposed by the server in a scalable manner.

The repository template demonstrates how to build, run, and validate an MCP server that exposes a simple tool via the MCP framework. Once the container is up, you can access the server at the configured endpoint (by default http://localhost:8080/mcp). The included example shows a Python tool defined with @mcp.tool() that adds two numbers; clients can invoke this tool through MCP requests and receive streamed responses as data becomes available. This setup is well-suited to serverless or containerized deployments, where stateless transports are preferred for efficiency and scalability.
How to install
Prerequisites:
- Docker and Docker Compose installed on your system
- Basic familiarity with Docker workflows
Installation steps:
- Clone the repository or pull the template Docker image you plan to use.
- If using the Docker-based template, ensure you have a compatible Docker image name; update the image reference in your environment if needed (the example uses zantis-dockerized-mcp-server-template).
- Start the server with Docker Compose or a direct `docker run`, as appropriate for your environment. With Docker Compose:

  ```shell
  docker-compose up --build
  ```
- If you want to run the server directly with Python (outside Docker), follow these steps:

  ```shell
  # Set up the Python environment and dependencies
  python3 -m venv venv
  source venv/bin/activate
  pip install -r src/requirements.txt

  # Run the server
  python src/server.py
  ```
- Verify the server is reachable at http://localhost:8080/mcp or the port you configured. You can also inspect logs to confirm Streamable HTTP is active and the MCP tool is registered.
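To see what a tool invocation looks like on the wire, the sketch below builds the JSON-RPC 2.0 `tools/call` request body a client would POST to the MCP endpoint. Note this is only the payload shape: a real Streamable HTTP client must first perform the MCP initialization handshake and carry the session id header, which is omitted here.

```python
# Sketch of the JSON-RPC request a client POSTs to the MCP endpoint
# (e.g. http://localhost:8080/mcp) to invoke the example "add" tool.
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 `tools/call` request body."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

payload = build_tool_call(1, "add", {"a": 2, "b": 3})
print(json.dumps(payload, indent=2))
```

In practice you would use an MCP client library rather than hand-rolling these requests, but inspecting the raw payload can be useful when debugging connectivity to the container.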
Additional notes
Notes and tips:
- This template relies on Streamable HTTP for transport; ensure your client tooling supports streaming responses from MCP.
- If you customize tools, decorate functions with @mcp.tool() and provide clear type hints for arguments and return values to facilitate automatic client generation.
- When deploying with Docker, consider exposing the container port 8080 to your host or orchestrator and setting appropriate environment variables for port and endpoint paths.
- Common issues: mismatched API versions between the Python MCP SDK and your client, incorrect tool registration, or firewall rules blocking port 8080. Check MCP and Python SDK docs for version compatibility.
- You can extend the docker template by editing the Dockerfile and server.py to add more tools or data sources as needed.
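Putting the deployment notes together, a direct `docker run` invocation might look like the following. The image name and environment variables mirror the example at the top of this README; adjust the host-side port mapping and values for your setup.

```shell
docker run -i --rm \
  -p 8080:8080 \
  --env PORT="8080" \
  --env LOG_LEVEL="INFO" \
  --env MCP_ENDPOINT="/mcp" \
  zantis-dockerized-mcp-server-template
```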
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP