HydraMCP
Connect agents to agents. MCP server for querying any LLM through your existing subscriptions: compare, vote, and synthesize across GPT, Gemini, Claude, and local models from one terminal.
claude mcp add --transport stdio pickle-pixel-hydramcp -- npx hydramcp
How to use
HydraMCP is an MCP server that connects Claude Code to multiple language models and local runtimes, letting you compare, vote, and synthesize responses across providers from a single terminal session. With HydraMCP you can list available models, send a prompt to a specific model, compare several models side by side, and even have the system synthesize or reach a consensus across model outputs. The toolset includes commands to query models, compare results, synthesize ideas, analyze files, and recap sessions, all orchestrated through a SmartProvider layer that handles caching, circuit-breaking, and per-model metrics. To start, run the quick setup to configure your API keys, local models, and CLI tools, then connect HydraMCP to Claude Code so you can issue commands like list_models, ask_model, or compare_models directly from your editor.
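As a sketch of what a tool invocation looks like on the wire: MCP servers speak JSON-RPC 2.0 over stdio, and the ask_model tool name comes from the list above, but the exact argument names below are assumptions, not HydraMCP's documented schema.

```shell
# Hypothetical tools/call request for HydraMCP's ask_model tool.
# Claude Code normally constructs and sends this for you; it is shown
# here only to illustrate the shape of an MCP tool call.
cat > request.json <<'EOF'
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "ask_model",
            "arguments": {"model": "openai/gpt-4o",
                          "prompt": "Summarize this repository"}}}
EOF
echo "request written to request.json"
```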
Once configured, you can use the built-in tools to explore providers: API-backed providers like OpenAI (GPT-4o, GPT-5), Google Gemini, and Anthropic Claude; CLI-backed tools via subscriptions (Gemini CLI, Claude CLI, Codex CLI); and local models through Ollama. The system supports mixing providers, so you can route certain prompts to local models while using API keys for others, all visible in the combined model list. The tools also support response distillation and caching, so repetitive prompts return faster results and you can tune max_response_tokens to balance detail with conciseness.
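For example, a single compare_models call could mix an API-backed and a local model using the provider-prefix form; the argument names here are assumptions based on the parameters mentioned above.

```shell
# Hypothetical arguments for a compare_models call that mixes providers;
# max_response_tokens caps each model's reply as described above.
cat > compare.json <<'EOF'
{"name": "compare_models",
 "arguments": {"models": ["openai/gpt-4o", "ollama/qwen2.5-coder:14b"],
               "prompt": "Explain tail-call optimization in two sentences",
               "max_response_tokens": 512}}
EOF
echo "wrote compare.json"
```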
How to install
Prerequisites:
- Node.js 14+ (LTS release recommended)
- npm or yarn
- Optional: Claude Code installed for easy integration
Installation steps:
1. Ensure Node.js and npm are installed. Verify:
   - node -v
   - npm -v
2. Install HydraMCP globally or run locally from the repo:
   - If using npx (recommended for quick start): npx hydramcp setup
   - If cloning the repository for local development:
     git clone https://github.com/Pickle-Pixel/HydraMCP.git
     cd HydraMCP
     npm install
     npm run build
3. Integrate with Claude Code (example): claude mcp add hydramcp -- npx hydramcp
4. Run the server configuration wizard or start using the one-liner: npx hydramcp setup
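The Node.js floor from the prerequisites can be checked with a couple of lines of pure shell string handling; a sample version string is used here, and in practice you would substitute the output of node -v.

```shell
# Parse the major version out of a `node -v`-style string and check the
# 14+ floor from the prerequisites. In practice use: ver=$(node -v)
ver="v20.11.1"
major=${ver#v}        # strip the leading "v"  -> 20.11.1
major=${major%%.*}    # keep the major only    -> 20
if [ "$major" -ge 14 ]; then
  echo "Node.js $ver meets the 14+ requirement"
else
  echo "Node.js $ver is too old; install an LTS release" >&2
fi
```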
Notes:
- The setup wizard will detect your API keys and local models, install missing CLIs, and store configuration in ~/.hydramcp/.env for persistence across runs.
- You can also run HydraMCP directly against a pre-built dist/index.js if you build locally, then register with Claude Code as shown above.
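For reference, the persisted file might look something like this; the variable names are the ones HydraMCP auto-detects, and the values are placeholders, not real keys.

```shell
# ~/.hydramcp/.env (hypothetical contents written by the setup wizard)
OPENAI_API_KEY=sk-...
GOOGLE_API_KEY=...
ANTHROPIC_API_KEY=...
```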
Additional notes
- HydraMCP auto-detects environment variables like OPENAI_API_KEY, GOOGLE_API_KEY, and ANTHROPIC_API_KEY. Ensure these are set in your shell, or let HydraMCP read them from the .env file created by the setup wizard.
- For local models, install Ollama and pull models (e.g., ollama pull qwen2.5-coder:14b) to enable on-device inference.
- The SmartProvider layer adds circuit breaking, caching (SHA-256 keyed with 15-minute TTL), and per-model metrics, which helps stabilize responses and monitor usage.
- You can route prompts to specific providers by using explicit prefixes (e.g., openai/gpt-5 or ollama/qwen2.5-coder:14b) when listing or querying models.
- If you run into permission or path issues, ensure the path to your dist/index.js (when building locally) is correctly provided during Claude Code integration.
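The caching scheme described above can be sketched in a few lines of shell. This is a guess at the mechanism (the key derivation and file layout are assumptions), not HydraMCP's actual implementation:

```shell
# Sketch: SHA-256-keyed response cache with a 15-minute TTL.
model="openai/gpt-4o"
prompt="Explain MCP in one sentence"
key=$(printf '%s\n%s' "$model" "$prompt" | sha256sum | awk '{print $1}')
cache_file="${TMPDIR:-/tmp}/hydramcp-cache-$key"   # hypothetical location
ttl=900   # 15 minutes, in seconds

now=$(date +%s)
if [ -f "$cache_file" ] && [ $(( now - $(stat -c %Y "$cache_file") )) -lt "$ttl" ]; then
  cat "$cache_file"            # fresh entry: reuse the cached response
else
  echo "cache miss for $key"   # stale or absent: query the provider,
fi                             # then write the reply to $cache_file
```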
Related MCP Servers
MegaMemory
Persistent project knowledge graph for coding agents. MCP server with semantic search, in-process embeddings, and web explorer.
observee
Observee (YC S25) lets you build AI agents with 1000+ integrations using MCPs with managed OAuth, Security and Observability
packt-netops-ai-workshop
🔧 Build Intelligent Networks with AI
architect
A powerful, self-extending MCP server for dynamic AI tool orchestration. Features sandboxed JS execution, capability-based security, automated rate limiting, marketplace integration, and a built-in monitoring dashboard. Built for the Model Context Protocol (MCP).
gemini-webapi
MCP server for Google Gemini — free image generation, editing & chat via browser cookies. No API keys needed.
mode-manager
MCP Memory Agent Server - A VS Code chatmode and instruction manager with library integration