
brainstorm

MCP server for multi-round AI brainstorming debates between multiple models (GPT, DeepSeek, Groq, Ollama, etc.)

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio spranab-brainstorm-mcp \
  --env GROQ_API_KEY="gsk_..." \
  --env GEMINI_API_KEY="AIza..." \
  --env OPENAI_API_KEY="sk-..." \
  --env DEEPSEEK_API_KEY="sk-..." \
  -- npx -y brainstorm-mcp

How to use

brainstorm-mcp runs multi-round debates between multiple AI models, with Claude as an active participant in every round. It orchestrates concurrent model responses, enforces per-model timeouts, and produces a structured synthesis at the end. You interact with the system by initiating sessions via the brainstorm command, optionally selecting specific providers and models, and by feeding Claude's response back through brainstorm_respond during interactive sessions. The tool supports a variety of providers (OpenAI, Gemini, DeepSeek, Groq, and local Ollama models) and can be configured with environment variables or a JSON config file to specify models, API keys, and base URLs. Claude participates across rounds, reads the other models' outputs, and contributes its own perspective before the next round, so you get diverse viewpoints and a consolidated synthesis.
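As a rough sketch, a session request to the brainstorm tool might carry arguments like the following. The parameter names here are illustrative assumptions based on the features described above (provider selection, round count, Claude participation), not the tool's confirmed schema:

```json
{
  "tool": "brainstorm",
  "arguments": {
    "topic": "How should we cache LLM responses in a multi-tenant API?",
    "rounds": 3,
    "providers": ["openai", "groq", "ollama"],
    "participate": true
  }
}
```

In an interactive session, Claude would then submit its own contribution each round via brainstorm_respond before the next round begins.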

How to install

Prerequisites:

  • Node.js and npm installed on your system
  • Access keys for the supported providers (OpenAI, Gemini, DeepSeek, Groq) if you plan to use their APIs
  • Optional: Ollama or other local models if you want to run local providers

Installation steps:

  1. Clone the repository and install dependencies:
git clone https://github.com/spranab/brainstorm-mcp.git
cd brainstorm-mcp
npm install
  2. Build (if required by the project) and start the MCP server:
npm run build
npm start
  3. Alternatively, run it directly via npx (as shown in the Installation section above):
npx -y brainstorm-mcp
  4. Configure your client (Claude Code or Claude Desktop) to point to the brainstorm server name and provide the necessary API keys via environment variables or a config file as described in the documentation.
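For Claude Desktop, MCP servers are registered in the claude_desktop_config.json file using the standard mcpServers format. A minimal sketch for this server might look like the following (server name and key values shown are placeholders taken from the install command above):

```json
{
  "mcpServers": {
    "spranab-brainstorm-mcp": {
      "command": "npx",
      "args": ["-y", "brainstorm-mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "GROQ_API_KEY": "gsk_...",
        "GEMINI_API_KEY": "AIza...",
        "DEEPSEEK_API_KEY": "sk-..."
      }
    }
  }
}
```

Only the keys for the providers you actually plan to use need to be set; restart Claude Desktop after editing the file so the new server is picked up.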

Additional notes

Tips and common considerations:

  • Ensure your API keys for OpenAI, Gemini, DeepSeek, and Groq are exported in your environment or set in the JSON config file before starting the server.
  • When deploying, set the same environment variables in the runtime environment so the server never starts with missing credentials.
  • The system enforces per-model timeouts (2 minutes per API call by default); if one model is slow, the others continue, and the synthesizer still produces the final output.
  • Interactive sessions let Claude participate in every round; if you want external models only, run a non-interactive session by setting participate=false.
  • After all rounds complete, the designated synthesizer model produces the final synthesis; you can override the synthesizer model in your configuration if needed.
  • If you run into baseURL issues with local models (Ollama) or with provider autodetection, double-check the provider section in your config file and make sure the corresponding services are running and reachable.
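The tips above can be consolidated into a single JSON config file. The exact schema is project-specific, so the field names below are illustrative assumptions only; they reflect the configurable items mentioned in the documentation (per-provider models, API keys, base URLs, and a synthesizer override):

```json
{
  "providers": {
    "openai": {
      "apiKey": "sk-...",
      "models": ["gpt-4o-mini"]
    },
    "ollama": {
      "baseURL": "http://localhost:11434",
      "models": ["llama3.1"]
    }
  },
  "synthesizer": "openai/gpt-4o-mini",
  "timeoutMs": 120000
}
```

Note that Ollama needs a reachable baseURL and a running local service, while the hosted providers only need valid API keys; consult the project's README for the authoritative config format.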
