deploystack
Open source MCP hosting - deploy MCP servers to HTTP endpoints for n8n, Dify, Voiceflow, and any MCP client.
claude mcp add --transport stdio deploystackio-deploystack node server.js \
  --env PORT="8080" \
  --env MCP_API_KEY="your-api-key" \
  --env MCP_VAULT_URL="https://vault.yourorg.example"
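For MCP clients that are configured through a JSON file rather than the claude CLI, the same registration can be sketched as an mcpServers entry. This follows the common MCP client config convention (used by Claude Desktop, Cursor, and similar clients); DeployStack itself may use a different file name or location.

```json
{
  "mcpServers": {
    "deploystackio-deploystack": {
      "command": "node",
      "args": ["server.js"],
      "env": {
        "PORT": "8080",
        "MCP_API_KEY": "your-api-key",
        "MCP_VAULT_URL": "https://vault.yourorg.example"
      }
    }
  }
}
```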
How to use
DeployStack exposes MCP servers as HTTP endpoints so your MCP clients (n8n, Dify, Voiceflow, Langflow, Claude Code, Cursor, and any HTTP/MCP client) can interact with the underlying MCP server via standard HTTP calls. The platform handles converting between the traditional stdio-based MCP protocol and HTTP/SSE endpoints, enabling workflow automation tools to communicate without local processes or port management. The included features (credential vault, RBAC, audit logging, and a centralized catalog) aim to simplify onboarding, security, and visibility.

To use this server, host the DeployStack MCP gateway (as described in the installation steps) and point your MCP client to the generated HTTP endpoint. From there you can discover available tools, execute specific MCP tools, and manage access for your team. A typical setup wires a client workflow (e.g., n8n or Langflow) to call discover_mcp_tools to identify relevant capabilities, then uses execute_mcp_tool to run the desired tool against the target MCP server.
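As a sketch of that discover-then-execute flow: an HTTP MCP client typically POSTs JSON-RPC requests to the gateway endpoint. The endpoint URL, tool name, and arguments below are placeholders, and the request framing assumes the standard MCP JSON-RPC methods (tools/list, tools/call) rather than anything confirmed by this page.

```javascript
// Sketch: the two JSON-RPC requests an HTTP MCP client would POST to a
// DeployStack-hosted endpoint. ENDPOINT is a placeholder, not a real URL.
const ENDPOINT = "http://your-host:8080";

// 1. Discover available tools (the discover_mcp_tools step).
const listRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
};

// 2. Execute a specific tool (the execute_mcp_tool step).
//    "example_tool" and its arguments are illustrative placeholders.
const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "example_tool",
    arguments: { query: "hello" },
  },
};

// With Node 18+, either request can be sent using the built-in fetch:
async function send(request) {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });
  return res.json();
}

console.log(JSON.stringify(listRequest));
// prints {"jsonrpc":"2.0","id":1,"method":"tools/list"}
```

In a workflow tool like n8n, the same two calls map onto HTTP Request nodes: one to list tools, one to invoke the chosen tool with the arguments produced by earlier workflow steps.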
How to install
Prerequisites:
- A server with Node.js installed (recommended for this deployment pattern).
- Git access and basic command-line usage.
- Internet access to fetch dependencies.
Install steps:
- Prepare the environment
  - Ensure you have a dedicated user or container with Node.js installed.
  - Create a project directory and navigate into it.
- Install dependencies
  - If the project provides a package.json, run:
    npm install
- Configure environment
  - Create a .env file or export environment variables as needed (example placeholders shown in mcp_config):
    PORT=8080
    MCP_VAULT_URL=https://vault.yourorg.example
    MCP_API_KEY=your-api-key
- Run the MCP gateway
  - Start the server using Node.js:
    node server.js
- Verify the endpoint
  - Access http://your-host:8080 to confirm the MCP HTTP endpoint is reachable.
- Optional: run via Docker or other runtimes
  - If the project provides a containerized option, follow the repository's Docker usage guide to build and run the image and expose the appropriate port.
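The "Configure environment" step can be made fail-fast with a small validation helper. This is a sketch using the variable names from the example (PORT, MCP_VAULT_URL, MCP_API_KEY); the helper itself is not part of the DeployStack codebase.

```javascript
// Sketch: read and validate the environment variables from the example
// before starting the server, instead of failing later with a broken config.
function loadConfig(env = process.env) {
  const port = Number(env.PORT ?? 8080); // default matches the example
  if (!Number.isInteger(port) || port < 1 || port > 65535) {
    throw new Error(`Invalid PORT: ${env.PORT}`);
  }
  const missing = [];
  if (!env.MCP_VAULT_URL) missing.push("MCP_VAULT_URL");
  if (!env.MCP_API_KEY) missing.push("MCP_API_KEY");
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(", ")}`);
  }
  return { port, vaultUrl: env.MCP_VAULT_URL, apiKey: env.MCP_API_KEY };
}

// Example usage with explicit values in place of process.env:
const cfg = loadConfig({
  PORT: "8080",
  MCP_VAULT_URL: "https://vault.yourorg.example",
  MCP_API_KEY: "your-api-key",
});
console.log(cfg.port); // 8080
```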
Additional notes
- Replace placeholder env values with your actual vault URL and API key to enable credential injection and secure access.
- Ensure your firewall allows inbound traffic on the port you expose (default 8080 in the example).
- Use the built-in Vault to manage API keys securely; avoid embedding secrets directly in config files.
- If you encounter issues migrating from stdio to HTTP, verify that the endpoint is reachable and that your MCP client version is compatible with your DeployStack release.
- RBAC settings control who can access which MCP servers and tools; configure roles carefully to minimize privilege scope.
- Check audit logs regularly to monitor tool usage and detect unusual activity.
- When upgrading, review breaking changes related to tool discovery and execution APIs.
Related MCP Servers
obsidian-tools
Add Obsidian integrations like semantic search and custom Templater prompts to Claude or any MCP client.
Matryoshka
MCP server for token-efficient large-document analysis via REPL state
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
Agentic-Skill
Agentic-MCP, a progressive MCP client with three-layer lazy loading. Validates the AgentSkills.io pattern for efficient token usage. Use MCP servers without pre-installing everything or paying the full loading cost.
Email MCP server with full IMAP + SMTP support — read, search, send, manage, and organize email from any AI assistant via the Model Context Protocol
civitai
A Model Context Protocol server for browsing and discovering AI models on Civitai