
ai-council

Multi-AI consensus MCP server that queries multiple AI models (OpenAI, Claude, Gemini, custom APIs) in parallel and synthesizes responses to reduce bias and improve accuracy. A Python implementation of the wisdom-of-crowds approach for AI decision making.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
Run in terminal:
Command
claude mcp add --transport stdio 0xakuti-ai-council-mcp uvx ai-council \
  --env OPENROUTER_API_KEY="..."

How to use

AI Council MCP Server coordinates multiple AI models to produce a robust, consensus-driven answer. It queries several models in parallel, assigns anonymous code names to their responses to prevent synthesis bias, and uses a synthesizer model to produce a single, comprehensive response drawn from all model inputs. By default, it uses OpenRouter with Claude Sonnet, Gemini, and DeepSeek, but it can also integrate other OpenAI-compatible endpoints or custom APIs. To use it, configure your MCP client (Cursor IDE, Claude Desktop, or any other MCP client) to point at the ai-council MCP server and supply the API keys for the models you want engaged. The server is designed to degrade gracefully if one or more models fail, so you still receive a best-effort answer synthesized from the available responses.
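The fan-out / anonymize / synthesize flow described above can be sketched in Python. Everything here (`ask_model`, the codename scheme, the join-based "synthesizer") is a hypothetical illustration of the pattern, not the server's actual implementation:

```python
import asyncio

# Hypothetical stand-in for a real model call; it returns a response string,
# or raises when the backend is unreachable.
async def ask_model(name: str, prompt: str) -> str:
    if name == "model-down":
        raise ConnectionError(f"{name} unreachable")
    return f"{name} answer to: {prompt}"

async def council(prompt: str, models: list[str]) -> str:
    # Fan out to all models in parallel; return_exceptions=True captures
    # failures instead of raising, so one dead model does not sink the request.
    results = await asyncio.gather(
        *(ask_model(m, prompt) for m in models), return_exceptions=True
    )
    # Keep only successful responses (graceful degradation).
    ok = [r for r in results if not isinstance(r, BaseException)]
    if not ok:
        raise RuntimeError("all council members failed")
    # Anonymize responses ("Respondent A", "Respondent B", ...) so the
    # synthesizer cannot favor a particular provider.
    answers = {f"Respondent {chr(65 + i)}": r for i, r in enumerate(ok)}
    # Stand-in for the synthesizer model: here we simply join the inputs.
    return "\n".join(f"{code}: {text}" for code, text in answers.items())

print(asyncio.run(council("Is the sky blue?", ["alpha", "model-down", "beta"])))
```

In the real server the final join is replaced by a call to the configured synthesizer model, which reads the anonymized responses and writes one merged answer.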

How to install

Prerequisites:
- Python 3.10+ installed on your system.
- uv (the fast Python package and project manager) installed.

Installation steps:
1) Install uv if not already installed (example for Unix-like systems):
   pip install uv
2) Run the AI Council package via uvx or pipx (as recommended in the README):
   - Using uvx (preferred for uv-based deployments):
     uvx ai-council
   - Using pipx (recommended for isolated environments):
     pipx run ai-council
3) Alternatively, install via pip (manual install):
   pip install ai-council
   Then run it with the command configured in your MCP client:
   ai-council
4) Validate the installation by starting the MCP server and verifying that it accepts connections from your MCP client.

Notes:
- If you use pipx, update your MCP configuration to call: {"command": "pipx", "args": ["run", "ai-council"]}.
- For a local uv deployment, follow the config.yaml or environment-variable setup to provide API keys for the models you want to query.
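Putting the uvx route together, a Claude Desktop-style MCP configuration entry might look like the following sketch (the exact file location and top-level key depend on your client, and the server name is arbitrary):

```json
{
  "mcpServers": {
    "ai-council": {
      "command": "uvx",
      "args": ["ai-council"],
      "env": { "OPENROUTER_API_KEY": "..." }
    }
  }
}
```

For the pipx route, swap in the `{"command": "pipx", "args": ["run", "ai-council"]}` form noted above.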

Additional notes

Tips and common issues:
- Environment variables: ensure OPENROUTER_API_KEY (and any other model keys you rely on) is set in your MCP config.
- Parallelism: the default max_models is 3; adjust it with CLI arguments such as --max-models to tailor latency and cost.
- Synthesis model: the synthesizer selects among anonymous responses; you can influence its behavior with synthesis_model_selection in a config.yaml if you adopt the advanced configuration.
- Model availability: if a model is down or unreachable, AI Council will degrade gracefully and still produce a synthesized answer from the remaining models.
- OpenRouter compatibility: the server is designed to work with OpenRouter and other OpenAI-compatible APIs; ensure your API keys and endpoints are correctly configured.
- Logging: increase the log level via --log-level (DEBUG/INFO/WARNING/ERROR) to diagnose issues during setup or runtime.
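If you adopt the advanced config.yaml route mentioned above, a minimal sketch might look like this. Only max_models and synthesis_model_selection are named in this page; every other key and the model identifiers are assumptions, so check the project README for the actual schema:

```yaml
# Hedged sketch of a config.yaml; verify key names against the README.
max_models: 3                       # parallel council size (default per above)
synthesis_model_selection: "anthropic/claude-sonnet"   # assumed key usage
models:                             # assumed key; OpenRouter-style model IDs
  - "anthropic/claude-sonnet"
  - "google/gemini-pro"
  - "deepseek/deepseek-chat"
api_keys:                           # assumed key; read from the environment
  openrouter: "${OPENROUTER_API_KEY}"
```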

