archestra
Secure cloud-native MCP registry, gateway & orchestrator
claude mcp add --transport stdio archestra-ai-archestra -- docker run -i --env ARCHESTRA_QUICKSTART="true" archestra/platform
How to use
Archestra provides a centralized platform for running private MCP servers with security-first controls, observability, and governance. The Docker-based quickstart pulls the official Archestra platform image and runs a container that exposes a web UI and API endpoints for managing MCP servers, tokens, API keys, and cost controls. Use the built-in observability, governance, and dynamic tool features to monitor usage, enforce data-access policies, and plug in private MCP servers for your organization. Once the container is running, open the web UI to configure and register MCP servers and interact with them through the orchestrator's chat-style interfaces.
How to install
Prerequisites:
- Docker installed on your host (Docker Desktop or Docker Engine).
- Sufficient privileges to run containers and expose ports.
Installation steps:
- Pull the Archestra platform image:
  docker pull archestra/platform:latest
- Run the container in quickstart mode (example with the default ports and volumes):
  docker run -p 9000:9000 -p 3000:3000 \
    -e ARCHESTRA_QUICKSTART=true \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v archestra-postgres-data:/var/lib/postgresql/data \
    -v archestra-app-data:/app/data \
    archestra/platform
- Verify the service is up by visiting:
  - UI: http://localhost:9000
  - API/docs: http://localhost:9000/api/docs (or the documentation URL shown in the UI)
- (Optional) For a more customized setup, consult the Quickstart Guide linked in the README for environment variables and advanced docker run options.
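The docker run command above can also be expressed as a Compose file. The following is a sketch based only on the ports, volumes, and environment variable shown in the installation steps; the service name, restart policy, and volume declarations are illustrative assumptions, not official Archestra configuration:

```yaml
# docker-compose.yml — sketch of the quickstart run; service name and
# restart policy are illustrative, not from the official docs.
services:
  archestra:
    image: archestra/platform:latest
    restart: unless-stopped
    environment:
      ARCHESTRA_QUICKSTART: "true"
    ports:
      - "9000:9000"   # web UI
      - "3000:3000"   # API
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock       # host Docker access
      - archestra-postgres-data:/var/lib/postgresql/data # database persistence
      - archestra-app-data:/app/data                     # application data

volumes:
  archestra-postgres-data:
  archestra-app-data:
```

With this file in place, `docker compose up -d` starts the same quickstart stack and preserves data across container restarts via the named volumes.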
Additional notes
Tips and considerations:
- The ARCHESTRA_QUICKSTART environment variable enables a simplified startup mode suitable for testing and demos. For production, use your normal deployment workflow and configure persistent storage and proper networking.
- Expose only the needed ports (default 9000 for UI and 3000 for API) and consider securing them behind a reverse proxy with TLS.
- When running with Docker, you can mount volumes to persist data (e.g., /var/lib/postgresql/data and /app/data).
- Use the private MCP registry and Kubernetes orchestration features described in the docs to manage MCP deployment at scale.
- Review environment variables and resource limits to tailor performance and security to your environment.
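The reverse-proxy tip above can be sketched with nginx. This is a minimal illustration assuming the default UI port (9000) and a certificate you have already issued; the hostname and certificate paths are placeholders, not values from the Archestra docs:

```nginx
# Minimal TLS termination in front of the Archestra UI on port 9000.
# Hostname and certificate paths are placeholders — substitute your own.
server {
    listen 443 ssl;
    server_name archestra.example.com;

    ssl_certificate     /etc/ssl/certs/archestra.pem;
    ssl_certificate_key /etc/ssl/private/archestra.key;

    location / {
        proxy_pass http://127.0.0.1:9000;        # forward to the container's UI port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

With a setup like this you can keep port 9000 bound to localhost only and expose just 443 externally, in line with the "expose only the needed ports" guidance above.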
Related MCP Servers
lobehub
The ultimate space for work and life: find, build, and collaborate with agent teammates that grow with you. We are taking agent harnesses to the next level, enabling multi-agent collaboration, effortless agent team design, and agents as the unit of work interaction.
chatgpt-on-wechat
CowAgent is a super AI assistant built on large language models: it can proactively think and plan tasks, access the operating system and external resources, create and execute Skills, and has long-term memory that grows over time. It supports integration via Feishu, DingTalk, WeCom apps, WeChat Official Accounts, and the web; works with OpenAI/Claude/Gemini/DeepSeek/Qwen/GLM/Kimi/LinkAI; handles text, voice, images, and files; and lets you quickly build a personal AI assistant or an enterprise digital employee.
repomix
📦 Repomix is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, Grok, and more.
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can serve as your openclaw alternative. ✨
ai-guide
Programmer Yupi's comprehensive AI resource collection plus a beginner-friendly Vibe Coding tutorial, covering a large-model selection guide (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt collection, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool usage (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, helping you quickly master AI technology and stay ahead of the curve. This project is the open-source documentation version and has been upgraded into Yupi's AI navigation website.
Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.