cognition-wheel
A Model Context Protocol (MCP) server that implements a "wisdom of crowds" approach to AI reasoning by consulting multiple state-of-the-art language models in parallel and synthesizing their responses.
claude mcp add --transport stdio hormold-cognition-wheel npx -y mcp-cognition-wheel \
  --env OPENAI_API_KEY="your_openai_key" \
  --env ANTHROPIC_API_KEY="your_anthropic_key" \
  --env GOOGLE_GENERATIVE_AI_API_KEY="your_google_key"
How to use
The Cognition Wheel MCP server implements a wisdom-of-crowds approach by querying three state-of-the-art language models in parallel: Claude-4-Opus (Anthropic), Gemini-2.5-Pro (Google), and O3 (OpenAI). During analysis each model is identified only by a code name to reduce bias. After collecting the responses, a randomly chosen synthesizer model analyzes all three answers and returns a final, comprehensive solution. Aggregating these diverse viewpoints provides robust coverage, cross-model validation, and richer reasoning than any single model alone.

The server supports optional internet search, detailed logging, and graceful degradation if one or more models fail, making it suitable for complex reasoning tasks, multi-turn conversations, and scenarios where diverse AI perspectives are valuable. It exposes a single tool, cognition_wheel, with parameters such as context, question, and enable_internet_search to tailor the reasoning process to your needs.
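As a sketch, an MCP client would invoke the tool with arguments like the following. The parameter names (context, question, enable_internet_search) come from the description above; the exact input schema may differ, so treat this as illustrative:

```json
{
  "name": "cognition_wheel",
  "arguments": {
    "context": "We are debugging an intermittent memory leak in a long-running Node.js service.",
    "question": "What are the most likely causes, and how should we diagnose them?",
    "enable_internet_search": false
  }
}
```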
To use it with MCP-compatible clients (e.g., Cursor or other MCP tooling), configure the client to point at the Cognition Wheel MCP server. You can run it directly via npx for quick testing or run a local build to host the server yourself. Ensure you provide the necessary API keys for Anthropic, Google, and OpenAI so the server can query all models in parallel and synthesize the results.
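For example, a typical client configuration using the mcpServers JSON format (as used by Cursor and similar MCP clients; the exact config file location is client-specific) might look like this sketch, reusing the environment variable names from the install command above:

```json
{
  "mcpServers": {
    "cognition-wheel": {
      "command": "npx",
      "args": ["-y", "mcp-cognition-wheel"],
      "env": {
        "OPENAI_API_KEY": "your_openai_key",
        "ANTHROPIC_API_KEY": "your_anthropic_key",
        "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_key"
      }
    }
  }
}
```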
How to install
Prerequisites:
- Node.js and npm (npx ships with npm)
- API keys for Anthropic, Google Generative AI, and OpenAI, plus internet access to reach their APIs
Option A: Run with npx (recommended, no installation)
- Ensure Node.js and npm are installed on your system.
- Run the MCP server directly using npx:
npx mcp-cognition-wheel
If you prefer installing globally:
npm install -g mcp-cognition-wheel
mcp-cognition-wheel
Option B: Build from source
- Clone the repository
- Install dependencies using pnpm (as suggested by the project):
pnpm install
- Copy the environment template to a real env file and add your API keys:
cp .env.example .env
- Build the project:
pnpm run build
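The .env file should hold the three provider keys. A minimal sketch, assuming the variable names match those used in the npx install command above (check .env.example for the authoritative list):

```
OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_key
```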
Option C: Run directly from source after build (if you prefer)
pnpm run start
Additional notes
- Environment variables are required for full functionality: provide API keys for Anthropic, Google Generative AI, and OpenAI in the environment or in your MCP client configuration.
- When integrating with Cursor or another MCP client, you can either run via npx (which fetches and runs the published package) or point a local node command at the built dist/app.js file.
- If any model fails, the Cognition Wheel degrades gracefully and still produces a synthesis from the remaining models.
- For debugging, enable verbose logging in your environment to capture model responses and synthesis steps.
- If you encounter path or permission issues with the built dist/app.js, verify that the absolute path is correctly configured in your MCP client (e.g., Cursor) and that the file has execute permissions where appropriate.
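The local-build option can be sketched as a client configuration that runs node against the built entry point. The dist/app.js path comes from the notes above; the absolute path is a placeholder you must adjust for your checkout:

```json
{
  "mcpServers": {
    "cognition-wheel": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-cognition-wheel/dist/app.js"],
      "env": {
        "OPENAI_API_KEY": "your_openai_key",
        "ANTHROPIC_API_KEY": "your_anthropic_key",
        "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_key"
      }
    }
  }
}
```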