gemini-llm-council
Multi-LLM consensus extension for Gemini CLI. Inspired by Andrej Karpathy's llm-council.
claude mcp add --transport stdio theerud-gemini-llm-council npx -y gemini-llm-council \
  --env OPEN_ROUTER_API_KEY="<OpenRouter API key for Gemini (set via gemini extensions config)>"
How to use
Gemini LLM Council is an extension that lets you query multiple top-tier LLMs in parallel, with automated peer review and synthesis. It aggregates insights from several models, applies personas for targeted reviews (such as security or performance), and provides an audit trail via MCP Resources. Use the council to investigate codebases, verify architectural decisions, and generate high-confidence debugging guidance. You can run one-shot consultations, or use the autonomous investigator to have a subagent read files and deliberate before presenting a consolidated answer.
To use it, first configure the extension with your OpenRouter API key and set up your council members either globally or per-project. Then you can issue commands like /council:setup to choose models and reasoning depth, /council:ask to consult on a query, /council:investigate to have the autonomous subagent explore your project, and /council:persona to apply a specific persona (e.g., security, performance) to the review. The system will automatically ground findings to your project metadata and output a summarized report with a recommended fix or next steps.
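A typical session inside the Gemini CLI might look like the following sketch (the slash commands come from the description above; the query text and persona name are illustrative):

```text
# After installing and configuring the extension:
/council:setup                          # choose council models and reasoning depth
/council:ask How should we cache session tokens in this service?
/council:persona security               # apply the security persona to the review
/council:investigate                    # let the subagent read files and deliberate
```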
How to install
Prerequisites:
- Node.js (latest LTS recommended) and npm installed on your machine
- Git installed
- Access to the Gemini CLI and an OpenRouter API key
Installation steps:
- Clone or download the repository for the Gemini LLM Council extension.
- Install dependencies: npm install
- Build the extension: npm run build
- Link the extension and configure your API key for Gemini:
  gemini extensions link .
  gemini extensions config gemini-llm-council "OpenRouter API Key"
- Run or integrate the built extension into your environment as appropriate for your setup (the exact runtime path may vary based on your project structure). If you publish to npm, you can also install and run it via npx as shown in the MCP configuration above.
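The installation steps above can be run end to end as follows (the repository URL is not given here, so substitute your own):

```text
git clone <repository-url> gemini-llm-council   # placeholder URL: use the actual repo
cd gemini-llm-council
npm install
npm run build
gemini extensions link .
gemini extensions config gemini-llm-council "OpenRouter API Key"
```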
Additional notes
- You can store council configuration globally (in your user home) or project-scoped (in your repository) to control which models and reasoning depth are used.
- Personas are customizable; you can add your own in ~/.gemini/extensions/gemini-llm-council/personas.json.
- For large projects, use the autonomous investigate command to let a subagent read files and gather evidence before delivering a final assessment.
- Full deliberation trails can be accessed as MCP Resources via the council:// URIs embedded in the summary report.
- If you encounter API key or network auth issues, verify environment variables and ensure the Gemini CLI configuration matches your OpenRouter account permissions.
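A custom persona in ~/.gemini/extensions/gemini-llm-council/personas.json might look like the sketch below. The schema here is an assumption for illustration only; check the personas.json shipped with the extension for the actual format:

```json
{
  "security": {
    "description": "Review for vulnerabilities, input validation, and secrets handling"
  },
  "performance": {
    "description": "Review hot paths, allocation patterns, and caching behavior"
  }
}
```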
Related MCP Servers
systemprompt-code-orchestrator
MCP server for orchestrating AI coding agents (Claude Code CLI & Gemini CLI). Features task management, process execution, Git integration, and dynamic resource discovery. Full TypeScript implementation with Docker support and Cloudflare Tunnel integration.
furi
CLI & API for MCP management
adk-docs-ext
Provides Gemini with up-to-date information about ADK. The documentation content is sourced from llms.txt - either from the official ADK-Docs repo, or one you supply.
ToolsForMCPServer-extension
Simplified Google Workspace Automation with Gemini CLI Extensions
mcp-bundler
Is the MCP configuration too complicated? You can easily share your own simplified setup!
CodeRAG
Advanced graph-based code analysis for AI-assisted software development