n8n-demo
Cross-platform n8n workflow automation integrated with Model Context Protocol (MCP) using Docker
claude mcp add --transport stdio tysoncung-n8n-mcp-demo docker compose up -d
How to use
This MCP server implementation provides a Docker-based FastAPI simulation of the Model Context Protocol endpoints used to integrate with an n8n workflow. The MCP server exposes endpoints such as GET /health, POST /api/context, and POST /api/execute, which allow an external agent (such as n8n) to request contextual data and execute actions within a controlled MCP session. The included n8n workflow demonstrates an end-to-end integration: an incoming webhook triggers the MCP context retrieval, and the results drive subsequent actions in the workflow.

To use it, first ensure the Docker-based stack is running (via docker-compose up -d, as described in the installation steps). Then configure your n8n workflow to call http://mcp-server:8080/api/context for context queries and http://mcp-server:8080/api/execute for action execution, passing the required payload and an API key if one is enabled via the environment variables. The endpoints are designed to be easy to test with curl or the n8n HTTP Request node, enabling quick experimentation with MCP-context-driven automation.
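The endpoints described above can be exercised with curl before wiring them into n8n. The sketch below assumes the server is reachable on localhost:8080 and accepts JSON bodies; the payload field names (session_id, query, action, params) and the X-API-Key header are illustrative assumptions, not taken from the demo's code:

```shell
# Check that the MCP server is up (endpoint paths from the description above)
curl http://localhost:8080/health

# Request contextual data for a session (field names are illustrative)
curl -X POST http://localhost:8080/api/context \
  -H "Content-Type: application/json" \
  -H "X-API-Key: ${MCP_API_KEY:-changeme}" \
  -d '{"session_id": "demo", "query": "recent orders"}'

# Execute an action within the same MCP session (field names are illustrative)
curl -X POST http://localhost:8080/api/execute \
  -H "Content-Type: application/json" \
  -H "X-API-Key: ${MCP_API_KEY:-changeme}" \
  -d '{"session_id": "demo", "action": "summarize", "params": {}}'
```

If authentication is disabled in your .env, the X-API-Key header can simply be omitted.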
How to install
Prerequisites:
- Docker Desktop 20.10+ (Windows/Mac) or Docker Engine on Linux
- Docker Compose v2 or compatible
- Git
Installation steps:
- Clone the repository
git clone https://github.com/yourusername/n8n-mcp-demo.git
cd n8n-mcp-demo
- Configure environment (optional)
cp .env.example .env
# Edit .env to customize credentials and endpoints if needed
- Start the stack
docker-compose up -d
- Verify services
docker-compose ps
Both containers should show a status of Up. Access the MCP server at http://localhost:8080 and n8n at http://localhost:5678 (as defined in the docker-compose configuration).
If you modify code in the MCP server, restart the stack:
docker-compose restart
Additional notes
Tips and common considerations:
- Ensure the MCP_SERVER_URL in your environment points to the MCP server container name (mcp-server) when using Docker networks. In the demo, the URL is typically http://mcp-server:8080.
- The default environment variables in the README include MCP_API_KEY for authentication; adjust or disable as needed in your .env file.
- If containers fail to start, check for port conflicts (5678 for n8n and 8080 for MCP server) and review logs with docker-compose logs or docker logs on the specific container.
- The integration workflow in n8n assumes the MCP endpoints are reachable from the n8n container; using the Docker network names in your URLs (e.g., http://mcp-server:8080) ensures container-to-container communication.
- When testing locally, you can curl the MCP endpoints directly to validate behavior before wiring them into n8n.
- If you customize the MCP server code inside docker-compose, restart services to apply changes.
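Tying the environment notes above together, a minimal .env for the cp .env.example .env step might look like the following sketch; the variable names come from the notes above, while the values are placeholders:

```shell
# Container-to-container URL for the MCP server (Docker network service name)
MCP_SERVER_URL=http://mcp-server:8080

# API key for the MCP endpoints; adjust or remove to disable authentication
MCP_API_KEY=changeme
```

Using the service name mcp-server (rather than localhost) in MCP_SERVER_URL is what allows the n8n container to reach the MCP server over the Docker network.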
Related MCP Servers
mcp-context-forge
An AI Gateway, registry, and proxy that sits in front of any MCP, A2A, or REST/gRPC APIs, exposing a unified endpoint with centralized discovery, guardrails and management. Optimizes Agent & Tool calling, and supports plugins.
ai-dev-tools-zoomcamp
AI Dev Tools Zoomcamp is a free course that helps you use AI tools to write better code, faster. The first cohort starts on November 18, 2025.
jenkins-enterprise
The most advanced Jenkins MCP server available - Enterprise debugging, multi-instance management, AI-powered failure analysis, vector search, and configurable diagnostics for complex CI/CD pipelines.
n8n-workflows
⚡ Explore 2,053 n8n workflows with a fast, user-friendly documentation system for instant search and analysis capabilities.
docker-swarm
MCP server for Docker Swarm orchestration using FastAPI and Docker SDK
bitbucket-automatic-pr-reviewer
🤖 Automated PR reviews using Claude CLI with Bitbucket webhooks. Features sequential processing, MCP integration, Prometheus metrics, and secure webhook validation. Perfect for teams wanting AI-powered code reviews without API costs.