mcp-n8n-builder
🪄 MCP server for programmatic creation and management of n8n workflows. Enables AI assistants to build, modify, and manage workflows without direct user intervention through a comprehensive set of tools and resources for interacting with n8n's REST API.
claude mcp add --transport stdio spences10-mcp-n8n-builder npx -y mcp-n8n-builder \
  --env N8N_HOST="http://localhost:5678/api/v1" \
  --env N8N_API_KEY="your-n8n-api-key" \
  --env SERVER_NAME="n8n-workflow-builder" \
  --env SERVER_VERSION="PACKAGE_VERSION_PLACEHOLDER" \
  --env OUTPUT_VERBOSITY="concise"
How to use
The mcp-n8n-builder MCP server provides programmatic access to create, read, update, delete, activate, and deactivate n8n workflows through a set of specialized tools. It also manages workflow executions and validates workflow structures against the nodes available on your n8n instance, catching errors before a workflow is saved. Use this server to automate workflow provisioning, modify existing workflows, and retrieve execution histories in a structured, model-aware way. The tools fall into three groups: Node Management for validating node types, Workflow Management for CRUD operations and state changes, and Execution Management for listing and inspecting run data. By default you work with concise summaries, but you can request full JSON when you need the complete structure of a workflow or its nodes.
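For context, this is roughly how the server is wired into an MCP client that uses the common mcpServers JSON format (a sketch only; the exact settings file name and shape depend on your client, and the environment values are the placeholders from the install command above):

```json
{
  "mcpServers": {
    "n8n-workflow-builder": {
      "command": "npx",
      "args": ["-y", "mcp-n8n-builder"],
      "env": {
        "N8N_HOST": "http://localhost:5678/api/v1",
        "N8N_API_KEY": "your-n8n-api-key",
        "OUTPUT_VERBOSITY": "concise"
      }
    }
  }
}
```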
How to install
Prerequisites:
- Node.js (LTS) and npm installed
- Access to an n8n instance (self-hosted or cloud) with REST API enabled
Installation steps:
- Prepare your environment (adjust URLs and keys as needed):
  export N8N_HOST=http://localhost:5678/api/v1
  export N8N_API_KEY=your-n8n-api-key
  export OUTPUT_VERBOSITY=concise
- Install and run locally via npm (typical project directory):
  npm install
- Build the project (if required by the project setup):
  npm run build
- Run in development mode (for local testing):
  npm run dev
- Alternatively, run through MCP orchestration using npx (the preferred runtime):
  npx -y mcp-n8n-builder
Note: If your environment uses a container or a different orchestrator, ensure the command and environment variables are wired accordingly in your MCP client configuration.
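Before wiring the server into a client, it can help to sanity-check the environment variables from the steps above. A minimal sketch (the check itself is illustrative and not part of the package; it only inspects the variables the server reads):

```shell
#!/usr/bin/env sh
# Fall back to the documented example values when the variables are unset;
# replace these with your real instance URL and API key.
N8N_HOST="${N8N_HOST:-http://localhost:5678/api/v1}"
N8N_API_KEY="${N8N_API_KEY:-your-n8n-api-key}"
OUTPUT_VERBOSITY="${OUTPUT_VERBOSITY:-concise}"

# Warn (rather than exit, in this sketch) if the key is still the placeholder.
case "$N8N_API_KEY" in
  your-n8n-api-key) echo "warning: N8N_API_KEY is still the placeholder" >&2 ;;
esac

echo "host=$N8N_HOST verbosity=$OUTPUT_VERBOSITY"
```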
Additional notes
Tips and common considerations:
- The server relies on n8n's REST API; ensure N8N_HOST and N8N_API_KEY are correctly configured and the API is reachable.
- Large n8n workflows can consume significant tokens when retrieved or edited; use OUTPUT_VERBOSITY=concise for listings and fetch only the needed details when possible.
- Use list_workflows to identify target workflows before create/update operations to minimize token usage.
- If your MCP environment supports it, you can tune caching via CACHE_ENABLED and CACHE_TTL to improve performance for frequent reads.
- If you encounter node validation errors, ensure your n8n instance has the required node types installed and accessible to the server.
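The list_workflows tool sits on top of n8n's public REST API, so the equivalent raw request is a quick way to confirm that N8N_HOST and N8N_API_KEY are reachable and valid. A hedged sketch (the /workflows path and the X-N8N-API-KEY header follow n8n's documented public-API conventions; adjust if your instance differs):

```shell
# Build the request from the same variables the server uses.
N8N_HOST="${N8N_HOST:-http://localhost:5678/api/v1}"
N8N_API_KEY="${N8N_API_KEY:-your-n8n-api-key}"

url="$N8N_HOST/workflows"
echo "GET $url"

# Uncomment to actually query your instance:
# curl -sS -H "X-N8N-API-KEY: $N8N_API_KEY" "$url"
```

Keeping the curl line commented makes the snippet safe to run as a dry check of the URL you have configured.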
Related MCP Servers
augments
Comprehensive MCP server providing real-time framework documentation access for Claude Code with intelligent caching, multi-source integration, and context-aware assistance.
mcp-memory-libsql
🧠 High-performance persistent memory system for Model Context Protocol (MCP) powered by libSQL. Features vector search, semantic knowledge storage, and efficient relationship management - perfect for AI agents and knowledge graph applications.
vikunja
Model Context Protocol server for Vikunja task management. Enables AI assistants to interact with Vikunja instances via MCP.
mcp-text-editor
An open source implementation of the Claude built-in text editor tool
grok-faf
First MCP server for Grok | FAST⚡️AF • URL-based AI context • Vercel-deployed
cdp-tools
MCP server that connects AI assistants to Chrome DevTools Protocol for runtime debugging - set breakpoints, inspect variables, monitor network traffic, and automate browser interactions