meta-prompt
Turn any MCP Client into a "multi-agent" system (via prompting)
claude mcp add --transport stdio tisu19021997-meta-prompt-mcp-server uvx --directory path/to/your/meta-prompt-mcp run mcp-meta-prompt
How to use
Meta Prompt MCP turns a single language model into a collaborative two-role workflow by switching between Conductor and Expert modes within one model. The Conductor analyzes a complex problem, breaks it into subtasks, and delegates them to Expert roles (such as Python Programmer, Code Reviewer, or Creative Writer). The Expert outputs are then compiled, critiqued, and integrated into a final solution. The collaboration stays self-contained in a single model call, with specialized system prompts guiding the Conductor and Expert modes.

To use it, configure your MCP client (e.g., Cursor or Claude Desktop) to point at the uv-based server and invoke the official entry point, meta_model_prompt, which activates the Conductor/Expert workflow for the given user prompt. The system expects the prompt to begin with the meta_model_prompt call, followed by your query for processing. The server's design emphasizes robust reasoning, self-critique, and structured delegation to the internal Expert roles to improve the quality of the result.
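The Conductor/Expert loop described above can be sketched in Python. This is illustrative only: call_model is a stand-in for a real LLM API call, and the system-prompt strings and the hard-coded plan are assumptions, not the server's actual prompts or parsing logic.

```python
# Minimal sketch of a single-model Conductor/Expert loop (illustrative).
CONDUCTOR_PROMPT = "You are the Conductor: split the task into expert subtasks."
EXPERT_PROMPT = "You are an Expert ({role}): solve your assigned subtask."

def call_model(system_prompt: str, user_prompt: str) -> str:
    """Placeholder for an actual LLM call; echoes a canned response."""
    return f"[{system_prompt.split(':')[0]}] response to: {user_prompt}"

def meta_prompt(task: str) -> str:
    # 1. Conductor pass: decompose the task into (role, subtask) pairs.
    #    A real implementation would parse the plan out of the model output.
    call_model(CONDUCTOR_PROMPT, task)
    plan = [("Python Programmer", f"write code for: {task}"),
            ("Code Reviewer", f"review the code for: {task}")]

    # 2. The same model answers each subtask under an Expert system prompt.
    expert_outputs = [call_model(EXPERT_PROMPT.format(role=role), subtask)
                      for role, subtask in plan]

    # 3. Conductor pass again: integrate and critique the expert outputs.
    return call_model(CONDUCTOR_PROMPT,
                      "integrate and critique:\n" + "\n".join(expert_outputs))

result = meta_prompt("parse a CSV file")
```

The key point is that every step is the same model under a different system prompt; only the prompts change roles.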
How to install
Prerequisites:
- Git
- Python environment with uv available (via Astral uv)
Steps:
- Install uv if you don’t have it:
- macOS / Linux: curl -LsSf https://astral.sh/uv/install.sh | sh
- Windows (PowerShell): powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
- Clone the repository: git clone https://github.com/tisu19021997/meta-prompt-mcp-server.git
- Enter the directory: cd meta-prompt-mcp-server
- Install dependencies with uv from inside the meta-prompt-mcp directory, following the environment setup steps in the repository's docs.
- Run the MCP server via the configured uv workflow per the mcp_config example (see the README for details on your environment and directory structure).
Note: This project relies on the uv package manager for Python-based MCP execution. Ensure uv is installed and accessible in your shell, and that the target meta-prompt-mcp directory contains the necessary MCP configuration (mcp-meta-prompt) to start the server.
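Putting the steps above together, an mcp_config entry for a client such as Cursor or Claude Desktop might look like the following sketch. The server name, env block, and directory are placeholders to adapt to your checkout; the command and args mirror the claude mcp add invocation shown at the top of this page.

```json
{
  "mcpServers": {
    "meta-prompt": {
      "command": "uvx",
      "args": ["--directory", "path/to/your/meta-prompt-mcp", "run", "mcp-meta-prompt"],
      "env": {}
    }
  }
}
```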
Additional notes
Tips and considerations:
- The server uses a single LM to simulate both Conductor and Expert roles; the design includes a fallback when an independent expert call isn’t possible in client environments.
- If your client cannot support the ctx.sample() mechanism for true expert calls, expect the expert output to be embedded within the conductor’s response.
- Ensure your client is configured to invoke the meta_model_prompt entry point to activate the workflow.
- Environment variables can be added under the mcp_config entry as needed (placeholders provided if you have environment-specific configurations).
- If you change the directory or server name, reflect those changes in both your client configuration and the mcp_config accordingly.
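The ctx.sample() fallback mentioned in the tips can be sketched as follows. This is an assumption about the pattern, not the server's actual code: Context here is a minimal stand-in for an MCP server context, and the tag format used in the fallback is invented for illustration.

```python
import asyncio

class Context:
    """Minimal stand-in for an MCP server context (not the real API)."""
    def __init__(self, supports_sampling: bool):
        self.supports_sampling = supports_sampling

    async def sample(self, prompt: str) -> str:
        if not self.supports_sampling:
            raise RuntimeError("client does not support sampling")
        return f"expert answer to: {prompt}"

async def run_expert(ctx: Context, subtask: str) -> str:
    try:
        # Preferred path: a true, independent expert call via client sampling.
        return await ctx.sample(subtask)
    except RuntimeError:
        # Fallback: embed the expert instructions in the conductor's response
        # so the single model plays the expert role inline.
        return f"<expert_instructions>{subtask}</expert_instructions>"

with_sampling = asyncio.run(run_expert(Context(True), "review this diff"))
without_sampling = asyncio.run(run_expert(Context(False), "review this diff"))
```

In the fallback branch the client sees the expert instructions embedded in the conductor's output, which matches the behavior described above for clients that cannot service sampling requests.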
Related MCP Servers
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
mcp-yfinance
Real-time stock API with Python, MCP server example, yfinance stock analysis dashboard
pfsense
pfSense MCP Server enables security administrators to manage their pfSense firewalls using natural language through AI assistants like Claude Desktop. Simply ask "Show me blocked IPs" or "Run a PCI compliance check" instead of navigating complex interfaces. Supports REST/XML-RPC/SSH connections and includes built-in compliance checks.
cloudwatch-logs
MCP server from serkanh/cloudwatch-logs-mcp
servicenow-api
ServiceNow MCP Server and API Wrapper
the-company
TheMCPCompany: Creating General-purpose Agents with Task-specific Tools