mcp-dify
Sample MCP Server for Dify AI
claude mcp add --transport stdio yuru-sha-mcp-server-dify npx -y @modelcontextprotocol/server-dify https://your-dify-api-endpoint your-dify-api-key
How to use
This MCP server implements the Model Context Protocol for Dify AI, enabling LLMs to interact with Dify AI's chat completion capabilities through a standardized interface. It also includes a restaurant recommendation tool called meshi-doko and supports conversation context and streaming responses.
The server is written in TypeScript and can be run either through a Node-based workflow (npx) or through Docker, depending on your preference. To use it with Claude Desktop or another MCP client, configure the mcpServers.dify entry with the appropriate executable and arguments so the client can launch the server and communicate over the MCP protocol. The two key inputs are the Dify API endpoint and your API key, which are passed to the server to authenticate against Dify.
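As a concrete illustration, a Claude Desktop configuration using the npx command shown above might look like the following sketch (the "dify" key name and file layout follow common MCP client conventions; replace the endpoint and key placeholders with your own values):

```json
{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```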
How to install
Prerequisites:
- Docker (if you plan to run via Docker)
- Node.js and npm (for npx usage)
- Internet access to fetch npm packages
Installation steps (option A: using npx):
- Ensure Node.js and npm are installed on your system.
- Run the MCP server directly via npx (no local install required): npx -y @modelcontextprotocol/server-dify https://your-dify-api-endpoint your-dify-api-key
Option B: using Docker:
- Install Docker on your machine.
- Build or pull the Docker image (as per project guidance): make docker
- Run the container with your Dify endpoint and API key: docker run -i --rm mcp/dify https://your-dify-api-endpoint your-dify-api-key
Notes:
- Replace https://your-dify-api-endpoint and your-dify-api-key with your actual Dify credentials.
- If you use Claude Desktop or other MCP clients, you will typically provide the same endpoint and key via the client configuration as shown in the usage example.
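If you prefer the Docker route, the same client configuration idea applies: the command and arguments simply mirror the docker run invocation above. A hedged sketch (the "dify" key name is a convention; adjust to your client's config format):

```json
{
  "mcpServers": {
    "dify": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "mcp/dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```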
Additional notes
Tips and considerations:
- Keep your Dify API key secure; do not commit it to version control.
- The server supports streaming responses; ensure your client can handle streaming data if you rely on that feature.
- The meshi-doko tool provides restaurant recommendations by interfacing with Dify AI; you can pass LOCATION, BUDGET, query, and optional conversation_id to maintain context.
- If you encounter connectivity issues, verify the Dify endpoint is reachable via HTTPS and that the API key is valid.
- For local development, using npx is convenient to avoid local installs, while Docker provides isolation and reproducibility.
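The streaming and meshi-doko notes above can be sketched in TypeScript. This is illustrative only: the argument names (LOCATION, BUDGET, query, conversation_id) come from the tool description above, while the SSE-style "data:" framing and the "answer" field are assumptions about how streamed chunks might look, not confirmed details of this project.

```typescript
// Hypothetical shape of the meshi-doko tool arguments, based on the
// parameters listed in the notes above.
interface MeshiDokoArgs {
  LOCATION: string;
  BUDGET: string;
  query: string;
  conversation_id?: string; // optional: pass to keep conversation context
}

const exampleArgs: MeshiDokoArgs = {
  LOCATION: "Shibuya, Tokyo",
  BUDGET: "3000 JPY",
  query: "Casual ramen place for two",
};

// Parse one line of a server-sent-events stream. Returns the parsed JSON
// payload for "data:" lines, or null for blanks, comments, and sentinels.
function parseSseLine(line: string): unknown | null {
  const trimmed = line.trim();
  if (!trimmed.startsWith("data:")) return null;
  const payload = trimmed.slice("data:".length).trim();
  if (payload === "" || payload === "[DONE]") return null;
  try {
    return JSON.parse(payload);
  } catch {
    return null; // skip malformed chunks rather than aborting the stream
  }
}

// Accumulate the answer text from a sequence of streamed lines, assuming
// each event carries an incremental "answer" fragment.
function collectAnswer(lines: string[]): string {
  let answer = "";
  for (const line of lines) {
    const evt = parseSseLine(line) as { answer?: string } | null;
    if (evt && typeof evt.answer === "string") answer += evt.answer;
  }
  return answer;
}
```

For example, feeding collectAnswer the lines 'data: {"answer":"Try "}' and 'data: {"answer":"Ichiran."}' followed by 'data: [DONE]' yields the assembled string "Try Ichiran." while ignoring the terminator.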
Related MCP Servers
zen
Self-hosted notes app. Single Go binary, notes stored as Markdown within SQLite, full-text search, very low resource usage
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
A Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.