n8n
Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.
claude mcp add --transport stdio n8n-io-n8n npx -y n8n \
  --env N8N_HOST="optional: host for self-hosted access" \
  --env N8N_PORT="optional: port (default 5678)" \
  --env N8N_PASSWORD="optional: set to protect editor (if using basic auth)" \
  --env N8N_BASIC_AUTH_USER="optional: basic auth username" \
  --env N8N_BASIC_AUTH_PASSWORD="optional: basic auth password"
How to use
n8n is a workflow automation platform that blends code flexibility with no-code convenience. It enables technical teams to build automated processes by connecting 400+ integrations, writing custom JavaScript or Python code when needed, and orchestrating complex workflows through a visual editor. You can host n8n yourself for control and data privacy, while leveraging native AI capabilities and LangChain-based workflows to build AI-assisted automations.
To get started, install or run n8n via npx as shown in the Quick Start. Once the editor is running, you can create a new workflow, add nodes from the available integrations, and configure expressions, conditions, and data transformations. If you need custom logic, insert code nodes to execute JavaScript or Python, install npm packages within the workflow, or build AI-enabled steps that leverage LangChain and your own data and models. The platform supports enterprise features like permissions, SSO, and air-gapped deployments for larger teams.
How to install
Prerequisites:
- Node.js and npm installed on your machine (the recommended route)
- Optional: Docker if you prefer containerized deployment
Install and run with npm/npx:
- Ensure Node.js and npm are available
- Run the Quick Start command: npx n8n
Using Docker (alternative):
- Install Docker on your host
- Create a data volume: docker volume create n8n_data
- Run the container and expose the editor port: docker run -it --rm --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n
- Open http://localhost:5678 in your browser
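If you prefer declarative container management, the Docker steps above can be expressed as a Compose file. This is a minimal sketch, not an official configuration: the image tag and volume name come from the steps above, while the file path and layout are assumptions you should adapt to your deployment.

```shell
# Write a minimal docker-compose file equivalent to the docker run command above.
# Image and volume match the Quick Start; the file path is illustrative.
cat > /tmp/n8n-compose.yml <<'EOF'
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    ports:
      - "5678:5678"            # expose the editor on localhost:5678
    volumes:
      - n8n_data:/home/node/.n8n   # persist workflows and credentials
volumes:
  n8n_data:
EOF

# Start it (requires Docker with the compose plugin):
# docker compose -f /tmp/n8n-compose.yml up -d
```

The named volume keeps workflow data across container restarts, mirroring the `docker volume create n8n_data` step.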
If you need to run n8n within a CI pipeline or an automation script, use the npx approach and skip the interactive install prompt: npx -y n8n
Additional notes
- n8n emphasizes self-hosting with a fair-code license; consider data residency and access control if deployed in organizational environments.
- You can enable basic auth or other security measures in the editor or via environment variables when running in containers.
- For AI/LLM workflows, explore LangChain guidance and AI-related nodes to integrate AI models with your data.
- If you modify the workflow data directory, ensure proper backups for reliability in production.
- The community and documentation are valuable resources for troubleshooting and templates.
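The basic-auth note above can be sketched with the environment variables listed in the Quick Start command. This assumes an n8n version that still honors the N8N_BASIC_AUTH_* variables (newer releases rely on built-in user management instead), and the credential values are placeholders.

```shell
# Sketch: protect the editor with basic auth via environment variables.
# Values are placeholders; N8N_BASIC_AUTH_* support depends on your n8n version.
export N8N_BASIC_AUTH_ACTIVE=true
export N8N_BASIC_AUTH_USER=admin
export N8N_BASIC_AUTH_PASSWORD='change-me'

# Then start the editor in the same shell so it inherits the variables:
# npx -y n8n
```

In a container, pass the same variables with -e flags on docker run rather than exporting them on the host.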
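For the backup note above, a minimal sketch is to archive the n8n data directory (by default ~/.n8n on self-hosted installs). The paths and schedule here are assumptions; adapt them to your deployment and run the archive step before upgrades or on a cron schedule.

```shell
# Minimal backup sketch for the n8n data directory (default ~/.n8n).
# N8N_DATA and the backup destination are assumptions; adjust to your setup.
N8N_DATA="${N8N_DATA:-$HOME/.n8n}"
BACKUP="/tmp/n8n-backup-$(date +%Y%m%d).tar.gz"

mkdir -p "$N8N_DATA"   # ensure the directory exists so the demo runs anywhere

# Archive the directory, preserving its name inside the tarball.
tar -czf "$BACKUP" -C "$(dirname "$N8N_DATA")" "$(basename "$N8N_DATA")"
ls -lh "$BACKUP"
```

Restoring is the reverse: extract the archive into the parent directory of the data path while n8n is stopped.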
Related MCP Servers
rulego
⛓️RuleGo is a lightweight, high-performance, embedded, next-generation component orchestration rule engine framework for Go.
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.
bytechef
Open-source, AI-native, low-code platform for API orchestration, workflow automation, and AI agent integration across internal systems and SaaS products.
flow-like
Flow-Like: Strongly Typed Enterprise Scale Workflows. Built for scalability, speed, seamless AI integration and rich customization.
yutu
A fully functional MCP server and CLI for YouTube
napi
Software architecture tooling for the AI age