roundtable
Zero-configuration MCP server that unifies multiple AI coding assistants (Codex, Claude Code, Cursor, Gemini) through intelligent auto-discovery and a standardized interface
claude mcp add --transport stdio askbudi-roundtable python -m roundtable_ai
How to use
Roundtable AI MCP Server is a local multi-agent coordination hub. It orchestrates specialized sub-agents (Gemini, Claude, Codex, Cursor, etc.) to solve engineering problems by running tasks in parallel, sharing context, and synthesizing a single, cohesive response. Start the server in your environment and interact with it using MCP client tooling or IDE integrations that support the MCP protocol. Once running, you can dispatch prompts that delegate parts of a task to multiple sub-agents and receive a unified result combining analysis, reasoning, and implementation suggestions from the respective tools. The server is designed to work with your existing CLI tools and API subscriptions, avoiding any vendor-specific markup.
You can use Roundtable AI by starting the MCP server and then issuing commands to enable all available agents or select specific ones (for example: gemini, claude, codex, cursor). The workflow typically involves packaging a prompt and context, routing subtasks to sub-agents in parallel, and then aggregating their outputs into a final synthesized answer. IDE integrations and prompt templates provided with the server help you craft multi-agent prompts and define how results should be presented in your IDE or chat interface.
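For MCP clients that read a JSON server configuration (for example, Claude Desktop's claude_desktop_config.json), a sketch of the equivalent registration might look like the following. The server name mirrors the claude mcp add command above; the exact file location and top-level key depend on your client:

```json
{
  "mcpServers": {
    "askbudi-roundtable": {
      "command": "python",
      "args": ["-m", "roundtable_ai"]
    }
  }
}
```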
How to install
Prerequisites:
- Python 3.10+ installed on your machine
- pip available in your environment
- Optional: a local environment with your preferred IDE (VS Code, JetBrains, Cursor, etc.)
Installation steps:
- Create and activate a Python virtual environment (recommended):
  python -m venv .venv
  source .venv/bin/activate    # macOS/Linux
  .venv\Scripts\activate       # Windows
- Install the Roundtable MCP server package from PyPI:
  pip install roundtable-ai
- Run the MCP server (as defined in mcp_config):
  python -m roundtable_ai
- Verify the server is running. With the stdio transport shown above, it communicates over standard input/output; other configurations may listen on a local port (commonly localhost:8080, or whatever your tooling configures).
- If your IDE integration requires a configuration file, use the provided defaults or your environment overrides as needed.
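Before wiring the server into an MCP client, you can sanity-check that the package is importable in the active environment. This is a plain-stdlib sketch and does not depend on any Roundtable-specific API:

```python
# Sanity check: is roundtable_ai importable in the active environment?
# Run after `pip install roundtable-ai` with the virtual environment activated.
import importlib.util

spec = importlib.util.find_spec("roundtable_ai")
print("roundtable_ai is installed" if spec else "roundtable_ai is NOT installed")
```

If the module is reported as not installed, re-check that you activated the same virtual environment you installed into.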
Additional notes
Tips and caveats:
- Ensure your Python environment has network access if you’re using remote sub-agents or external APIs.
- If you encounter agent-specific errors, check that the CLI tools for Gemini, Claude, Codex, Cursor, etc., are installed and accessible in your PATH.
- Some installations may require authentication tokens or API keys for your preferred models; keep these credentials secure and load them via environment variables or your IDE’s secret manager.
- You can customize the set of agents via your MCP client by specifying --agents or an equivalent option in your tooling configuration.
- If running behind a firewall, ensure outbound connections to the sub-agent services are allowed or configure local proxies as needed.
Related MCP Servers
cursor-notebook
Model Context Protocol (MCP) server designed to allow AI agents within Cursor to interact with Jupyter Notebook (.ipynb) files
claude-ipc
AI-to-AI communication protocol for Claude, Gemini, and other AI assistants
swiftlens
SwiftLens is a Model Context Protocol (MCP) server that provides deep, semantic-level analysis of Swift codebases to any AI models. By integrating directly with Apple's SourceKit-LSP, SwiftLens enables AI models to understand Swift code with compiler-grade accuracy.
just
Share the same project justfile tasks with your AI Coding Agent.
mcp-python-template
This template provides a streamlined foundation for building Model Context Protocol (MCP) servers in Python. It's designed to make AI-assisted development of MCP tools easier and more efficient.
voice-status-report
A Model Context Protocol (MCP) server that provides voice status updates using OpenAI's text-to-speech API.