mindbridge
MindBridge is an AI orchestration MCP server that lets any app talk to any LLM — OpenAI, Anthropic, DeepSeek, Ollama, and more — through a single unified API. Route queries, compare models, get second opinions, and build smarter multi-LLM workflows.
```bash
claude mcp add --transport stdio pinkpixel-dev-mindbridge-mcp npx -y @pinkpixel/mindbridge \
  --env GOOGLE_API_KEY="GOOGLE_API_KEY_HERE" \
  --env OPENAI_API_KEY="OPENAI_API_KEY_HERE" \
  --env DEEPSEEK_API_KEY="DEEPSEEK_API_KEY_HERE" \
  --env ANTHROPIC_API_KEY="ANTHROPIC_API_KEY_HERE" \
  --env OPENROUTER_API_KEY="OPENROUTER_API_KEY_HERE"
```
How to use
MindBridge is an MCP server that acts as an AI command hub, orchestrating multiple language model providers through a unified OpenAI-compatible API layer. It lets you route prompts to diverse providers (OpenAI, Anthropic, Google, DeepSeek, OpenRouter, Ollama, and OpenAI-compatible services) and supports specialized reasoning models.

Core tools include:

- getSecondOpinion — compare responses across models
- listProviders — see configured providers and their models
- listReasoningModels — list models optimized for complex reasoning

You can configure providers, model defaults, and environment keys via MCP config or environment variables, enabling flexible multi-model workflows and orchestration for agent development, AI backends, and reasoning-heavy tasks. The server can be run in development or production modes, and it exposes an OpenAI-like API surface so existing tools can interact with it with minimal changes.
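Since MindBridge speaks standard MCP, a client invokes these tools with an ordinary `tools/call` request. The sketch below shows what such a request might look like for getSecondOpinion; the `tools/call` envelope is the standard MCP shape, but the argument names (`provider`, `model`, `prompt`) are assumptions — check the project README for the tool's actual input schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "getSecondOpinion",
    "arguments": {
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20241022",
      "prompt": "Summarize the trade-offs between these two designs."
    }
  }
}
```

In practice your MCP client (IDE, agent framework, or Claude) constructs this message for you; the point is that any MCP-capable client can drive MindBridge without custom integration code.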
How to install
Prerequisites:
- Node.js and npm installed on your system
- Basic familiarity with npm and environment variables

Step 1: Install the MindBridge MCP server globally (recommended)
```bash
npm install -g @pinkpixel/mindbridge
```

Step 2: Run the server (development)
```bash
npm run dev
```

Step 3: (Optional) Build and run in production
```bash
npm run build
npm start
```

Step 4: If you prefer MCP config, create or edit an mcp.json file with your server configuration (see the README example) and run it as usual with your preferred MCP client or IDE. For each provider you enable (OpenAI, Anthropic, Google, DeepSeek, OpenRouter, etc.), you will need a valid API key.
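For Step 4, an mcp.json entry might look like the sketch below. The `mcpServers`/`command`/`args`/`env` fields follow the common MCP client config convention, and `alwaysAllow` is the option described in the notes below; the exact shape may differ per client, so treat this as a starting point and compare against the README's example.

```json
{
  "mcpServers": {
    "mindbridge": {
      "command": "npx",
      "args": ["-y", "@pinkpixel/mindbridge"],
      "env": {
        "OPENAI_API_KEY": "OPENAI_API_KEY_HERE",
        "ANTHROPIC_API_KEY": "ANTHROPIC_API_KEY_HERE",
        "OPENROUTER_API_KEY": "OPENROUTER_API_KEY_HERE"
      },
      "alwaysAllow": ["getSecondOpinion", "listProviders", "listReasoningModels"]
    }
  }
}
```

Only include `env` entries for providers you actually use; unset providers simply won't appear in listProviders.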
Additional notes
Tips and considerations:
- Set environment variables for the providers you intend to use (OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY, DEEPSEEK_API_KEY, OPENROUTER_API_KEY).
- If you rely on Ollama or an OpenAI-compatible service, ensure base URLs and models match your local or hosted instances.
- The MCP config supports provider defaults and default_params such as temperature and reasoning_effort to influence model behavior.
- Use alwaysAllow to expose specific actions (e.g., getSecondOpinion, listProviders, listReasoningModels) to MCP clients without extra authentication.
- For Smithery deployment, install via the Smithery CLI as shown in the README, which automates client provisioning.
- If you switch providers or models, update provider_config and default_params to reflect the new capabilities and costs.
- Ensure network accessibility between the MindBridge MCP server and any local model endpoints (e.g., Ollama at http://localhost:11434).
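If you run the server directly from a terminal rather than via mcp.json, the provider keys from the first tip can be exported before launch. The key names below are the ones this README documents; OLLAMA_BASE_URL is an assumption for pointing at a non-default Ollama host, so verify the variable name against the project docs.

```shell
# Placeholder values — replace with your real API keys before launching.
export OPENAI_API_KEY="OPENAI_API_KEY_HERE"
export ANTHROPIC_API_KEY="ANTHROPIC_API_KEY_HERE"
export GOOGLE_API_KEY="GOOGLE_API_KEY_HERE"
export DEEPSEEK_API_KEY="DEEPSEEK_API_KEY_HERE"
export OPENROUTER_API_KEY="OPENROUTER_API_KEY_HERE"
# Assumed variable name — only needed if Ollama is not at its default address.
export OLLAMA_BASE_URL="http://localhost:11434"
```

Providers whose keys are left unset are simply unavailable; you only need to export the ones you plan to route to.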
Related MCP Servers
mcp-ts-template
TypeScript template for building Model Context Protocol (MCP) servers. Ships with declarative tools/resources, pluggable auth, multi-backend storage, OpenTelemetry observability, and first-class support for both local and edge (Cloudflare Workers) runtimes.
MediaWiki
Model Context Protocol (MCP) Server to connect your AI with any MediaWiki
time
⏰ Time MCP Server: Giving LLMs Time Awareness Capabilities
MCPollinations
A Model Context Protocol (MCP) server that enables AI assistants to generate images, text, and audio through the Pollinations APIs. Supports customizable parameters, image saving, and multiple model options.
asterisk
Asterisk Model Context Protocol (MCP) server.
mcp-framework
Rust MCP framework for building AI agents