gemini-cli
MCP server from centminmod/gemini-cli-mcp-server
claude mcp add --transport stdio centminmod-gemini-cli-mcp-server node server.js \
  --env PORT="PORT_NUMBER_FOR_SERVER" \
  --env LOG_LEVEL="info" \
  --env OPENROUTER_API_KEY="your-openrouter-api-key" \
  --env GEMINI_CLI_ENDPOINT="http://localhost:PORT" \
  --env OPENROUTER_API_BASE_URL="https://openrouter.ai/api/v1"
How to use
Gemini CLI MCP Server bridges the Gemini CLI and MCP-compatible clients such as Claude Code and Claude Desktop. It also integrates OpenRouter for access to 400+ AI models and provides 33 specialized tools to streamline multi-AI workflows. To use it, start the server with the required environment variables set (notably the OpenRouter API key). Once running, MCP clients can connect to the server and invoke tools such as gemini_prompt for deep code analysis and comprehensive reviews, or gemini_ai_collaboration to orchestrate debates between models, with OpenRouter-backed models providing expanded capabilities. The server is designed to work with Claude Code, Claude Desktop, and other MCP-compliant clients, enabling seamless multi-model interactions through the MCP interface.
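Under the hood, MCP tool invocations are JSON-RPC 2.0 messages. A sketch of what a client sends over stdio to call gemini_prompt might look like the following (the exact argument names, such as "prompt", are assumptions; check the server's tool schema via a tools/list request):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "gemini_prompt",
    "arguments": {
      "prompt": "Review this function for concurrency bugs: ..."
    }
  }
}
```

Clients like Claude Code generate these messages for you; the fragment is shown only to illustrate what travels over the MCP transport.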
How to install
Prerequisites:
- Node.js (v16+ recommended) and npm
- Git
- An OpenRouter API key if you plan to use the 400+ model integrations

1. Clone the repository
   git clone https://github.com/centminmod/gemini-cli-mcp-server.git
   cd gemini-cli-mcp-server
2. Install dependencies
   npm install
3. Configure environment
   - Create a .env file or export environment variables:
     OPENROUTER_API_KEY=your-openrouter-api-key
     OPENROUTER_API_BASE_URL=https://openrouter.ai/api/v1
     PORT=3000
   - Ensure the Gemini CLI endpoint (if applicable) points to your runtime
4. Run the server
   node server.js
   or, if a start script is defined in package.json:
   npm run start
5. Verify
   - The server should start and listen on the configured port
   - MCP clients can connect using the server hostname and port
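The configure step above can be scripted. A minimal sketch that writes the .env file (all values are placeholders; substitute your real key and preferred port):

```shell
# Write a .env file with the variables the server reads.
# Values below are placeholders, not working credentials.
cat > .env <<'EOF'
OPENROUTER_API_KEY=your-openrouter-api-key
OPENROUTER_API_BASE_URL=https://openrouter.ai/api/v1
PORT=3000
LOG_LEVEL=info
EOF

# Sanity check: count the KEY=value lines that were written
grep -c '=' .env
```

Keep .env out of version control (add it to .gitignore) so the API key is never committed.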
Additional notes
Tips and common considerations:
- Ensure your OpenRouter API key is valid and has access to the models you plan to use; insufficient credits or quota can surface as rate-limit errors.
- Check logs (set LOG_LEVEL to debug for troubleshooting) to diagnose connection or authentication issues.
- If running behind a proxy or in a container, expose the PORT correctly and configure any required CORS or network policies.
- The mcp_config example uses a generic path (server.js); adjust the path if your entry point differs (e.g., dist/server.js).
- When evaluating tools like gemini_prompt or gemini_ai_collaboration, be mindful of rate limits and model availability on the OpenRouter side.
- For production deployments, consider containerizing the server and securely managing API keys with secret managers.
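The containerization note above can be sketched as a minimal Dockerfile (the Node base image and server.js entry point are assumptions; adjust to match the repository):

```dockerfile
# Minimal sketch -- assumes server.js is the entry point and Node 18+.
FROM node:18-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
# Do not COPY .env or bake OPENROUTER_API_KEY into the image;
# inject secrets at run time, e.g.:
#   docker run -e OPENROUTER_API_KEY=... -p 3000:3000 gemini-cli-mcp-server
ENV PORT=3000
EXPOSE 3000
CMD ["node", "server.js"]
```

Passing the key with `-e` (or a secret manager in production) keeps credentials out of the image layers, which is what the note about securely managing API keys recommends.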