dify
Production-ready platform for agentic workflow development.
claude mcp add --transport stdio langgenius-dify -- docker compose up -d
How to use
Dify is an open-source platform for building and deploying LLM-powered applications. This MCP server uses Docker Compose to launch the Dify self-hosted instance, which provides a visual workflow canvas, prompt IDE, RAG pipelines, agent capabilities, and MLOps features. Once started, you can access the Dify dashboard in your browser and begin configuring models, tools, and workflows. The server exposes a full suite of built-in tools (e.g., Google Search, DALL·E, Stable Diffusion, WolframAlpha) and supports multiple model providers and self-hosted deployments. Use the Docker Compose setup to start the services and follow the installation flow in the Dify docs to initialize the environment and connect your data sources.
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine.
- Git (optional, for cloning the repository).
Install and run:
- Clone the repository (or navigate to the dify directory):
  git clone https://github.com/langgenius/dify.git
  cd dify/docker
- Prepare the environment (optional):
  cp .env.example .env
  Edit .env if you need to customize configuration (e.g., database settings, ports).
- Start the Dify services with Docker Compose:
  docker compose up -d
- Access the dashboard: open http://localhost/install in your browser to initialize and configure the instance.
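The web service can take a minute to become reachable after `docker compose up -d`. A minimal sketch of a readiness poller, in Python (the URL, timeout, and helper name are illustrative, not part of Dify):

```python
import time
import urllib.error
import urllib.request


def wait_for_dify(url: str, timeout: float = 120.0, interval: float = 2.0) -> bool:
    """Poll `url` until the server answers or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status < 500:
                    return True
        except urllib.error.HTTPError:
            # Any HTTP status code means the server is up and answering.
            return True
        except (urllib.error.URLError, OSError):
            pass  # Not listening yet; keep polling.
        time.sleep(interval)
    return False
```

For example, `wait_for_dify("http://localhost/install")` returns True once the setup page is reachable (adjust the URL if you changed the port mapping in .env).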
Additional notes
Notes:
- Ensure Docker and Docker Compose are up to date for best compatibility.
- The initial setup flow at http://localhost/install will guide you through initializing the Dify instance and connecting data sources.
- If you modify environment variables, recreate the containers for the changes to take effect: docker compose up -d (a plain docker compose restart does not reload values from .env).
- Firewall or port mapping may affect access if you customize ports; verify your Docker Compose file and host network settings.
- This MCP setup uses the same Docker-based deployment method as the official self-hosted guide, so consult the Dify docs for advanced configurations (models, RAG, agents, and integrations).
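When debugging the port and firewall issues mentioned above, it can help to check exactly which values Docker Compose will read from .env. A minimal sketch of a KEY=VALUE parser (a hypothetical helper, not part of Dify; the variable name in the example is illustrative):

```python
def load_dotenv(path: str) -> dict[str, str]:
    """Parse a simple KEY=VALUE .env file, skipping comments and blank lines."""
    values: dict[str, str] = {}
    with open(path, encoding="utf-8") as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # Ignore comments, blanks, and malformed lines.
            key, _, value = line.partition("=")
            # Strip optional single or double quotes around the value.
            values[key.strip()] = value.strip().strip('"').strip("'")
    return values
```

For instance, load_dotenv("docker/.env") lets you confirm a customized port value before running docker compose up -d.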
Related MCP Servers
ragflow
RAGFlow is a leading open-source Retrieval-Augmented Generation (RAG) engine that fuses cutting-edge RAG with Agent capabilities to create a superior context layer for LLMs
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can serve as your openclaw alternative. ✨
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
cursor-talk-to-figma
TalkToFigma: MCP integration between AI Agent (Cursor, Claude Code) and Figma, allowing Agentic AI to communicate with Figma for reading designs and modifying them programmatically.
wanwu
China Unicom's Yuanjing Wanwu Agent Platform is an enterprise-grade, multi-tenant AI agent development platform. It helps users build applications such as intelligent agents, workflows, and RAG pipelines, and also supports model management. The platform ships with a developer-friendly license, and all developers are welcome to build on it.
easy-vibe
Vibe coding from 0 to 1 | a beginner-friendly vibe-coding tutorial covering product prototyping, AI capability integration, front-end and back-end development, and multi-platform app development.