# magic
Super Magic. The first open-source all-in-one AI productivity platform (Generalist AI Agent + Workflow Engine + IM + Online collaborative office system)
```shell
claude mcp add --transport stdio dtyq-magic python -m magic.server \
  --env LOG_LEVEL="INFO" \
  --env REDIS_URL="redis://localhost:6379" \
  --env DATABASE_URL="postgres://user:pass@host:port/dbname" \
  --env OPENAI_API_KEY="your-openai-api-key"
```
## How to use
Magic is an open-source, all-in-one AI productivity platform that combines multiple AI-assisted capabilities into a single ecosystem. The server exposes tools for orchestrating AI workflows (Magic Flow), managing intelligent conversations (Magic IM), and integrating with enterprise communication channels and data sources. With the Magic MCP server running, you can build, deploy, and run AI agents that plan, execute tasks, and autonomously correct errors across the platform's components. The platform emphasizes interoperability among components, enabling agents to trigger workflow steps, query knowledge bases, and interact with users or other services through a unified API surface.
## How to install
Prerequisites:
- Python 3.8+ and pip
- Git
- Optional: PostgreSQL/Redis if you plan to use persistent storage and caching
Step-by-step:

1. Clone the repository:

```shell
git clone https://github.com/dtyq/magic.git
cd magic
```

2. Create a virtual environment and install dependencies:

```shell
python -m venv venv
source venv/bin/activate    # on Unix
# .\venv\Scripts\activate   # on Windows
pip install -r requirements.txt
```

3. Configure environment variables (create a .env file or export them in your shell):

```shell
export OPENAI_API_KEY=your-openai-api-key
export DATABASE_URL=postgres://user:pass@host:port/dbname
export REDIS_URL=redis://localhost:6379
```

4. Run the MCP server:

```shell
python -m magic.server
```

5. Optional: run with a specific config file or via Docker if the project docs provide one. Refer to the repository's documentation for any project-specific command-line options.
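The launch step above will fail at runtime if required configuration is absent, so it can help to validate the environment first. The sketch below is a minimal, hypothetical launcher wrapper: which variables are strictly required versus optional depends on your deployment, so the `REQUIRED`/`OPTIONAL` split here is an assumption, not the project's documented contract.

```python
import os
import sys

# Variables from the export step above. Treating only OPENAI_API_KEY as
# strictly required is an assumption; adjust for your deployment.
REQUIRED = ["OPENAI_API_KEY"]
OPTIONAL = ["DATABASE_URL", "REDIS_URL", "LOG_LEVEL"]

def missing_required(env=None):
    """Return the required variable names that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

def launch():
    """Fail fast on missing configuration, then start the server
    (equivalent to running `python -m magic.server`)."""
    missing = missing_required()
    if missing:
        sys.exit("Missing required environment variables: " + ", ".join(missing))
    import runpy
    runpy.run_module("magic.server", run_name="__main__")
```

Calling `launch()` surfaces a clear error message immediately instead of an opaque failure deep inside the server's startup path.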
## Additional notes
Environment variables and configuration details may vary by deployment. Common variables include OPENAI_API_KEY for model access, DATABASE_URL for persistent storage, and REDIS_URL for caching/queueing. If you encounter connection or authentication issues, verify network access to your datastore and that API keys are correctly set. When running in production, consider configuring TLS, proper secret management, and rate-limiting on API endpoints. If the server exposes admin or diagnostic endpoints, secure them behind authentication and restrict access via firewall rules.
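When debugging the connection issues mentioned above, a common culprit is a malformed connection URL rather than the network itself. The stdlib-only sketch below does a quick structural check on `DATABASE_URL` and `REDIS_URL` before any connection attempt; the accepted scheme lists are assumptions (for example, many Postgres drivers accept both `postgres://` and `postgresql://`).

```python
from urllib.parse import urlsplit

# Assumed URL schemes for each datastore variable; adjust for your drivers.
EXPECTED_SCHEMES = {
    "DATABASE_URL": {"postgres", "postgresql"},
    "REDIS_URL": {"redis", "rediss"},
}

def describe_url(name, value):
    """Return (ok, message) for a structural check of a connection URL.
    This only validates the shape; it does not open a connection."""
    parts = urlsplit(value)
    if parts.scheme not in EXPECTED_SCHEMES.get(name, {parts.scheme}):
        return False, f"{name}: unexpected scheme '{parts.scheme}'"
    if not parts.hostname:
        return False, f"{name}: missing hostname"
    return True, f"{name}: {parts.scheme}://{parts.hostname}:{parts.port or 'default port'}"
```

Running this against each configured URL at startup turns a vague "connection refused" into a pointed message about which variable is malformed.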
## Related MCP Servers
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can serve as your openclaw alternative. ✨
astron-agent
Enterprise-grade, commercial-friendly agentic workflow platform for building next-generation SuperAgents.
astron-rpa
Agent-ready RPA suite with out-of-the-box automation tools. Built for individuals and enterprises.
easy-vibe
vibe coding from 0 to 1 | a zero-to-one vibe-coding tutorial for beginners, covering product prototyping, AI capability integration, front-end and back-end development, and multi-platform application development
moling
MoLing is a computer-use and browser-use based MCP server. It is a locally deployed, dependency-free office AI assistant.
codemesh
The Self-Improving MCP Server - Agents write code to orchestrate multiple MCP servers with intelligent TypeScript execution and auto-augmentation