
gemini

An MCP server for interaction with Google's Gemini AI models.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio d-diaa-gemini-mcp uvx --from /path/to/gemini-mcp gemini-mcp \
  --env MODEL_NAME="gemini-2.5-flash-preview-05-20" \
  --env GEMINI_API_KEY="your_api_key_here"

How to use

This MCP server lets Claude interact with Google's Gemini AI models. It exposes two tools: ask_gemini, which sends a prompt to Gemini and returns the response, and server_info, which reports the current status of the Gemini integration. To use it, add the gemini entry to Claude's MCP settings and then call the tools from Claude as you would any other MCP tools: ask Gemini a question, have it review code, or request guidance with a specific persona or context. You can also call server_info to verify connectivity and model availability. The integration requires a Gemini API key and a configured model name to route requests correctly.
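Under the hood, an MCP tool call is a JSON-RPC tools/call request, as defined by the Model Context Protocol. The message below shows the general shape; the tool name ask_gemini comes from this server, but the exact argument key (prompt here) is an assumption and may differ in the actual implementation:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_gemini",
    "arguments": { "prompt": "Review this function for edge cases." }
  }
}
```

Claude constructs and sends these requests for you; you never need to write them by hand, but the shape is useful when debugging the server over stdio.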

How to install

Prerequisites:

  • A Google Gemini API key with access to the Gemini models you intend to use
  • Access to Claude Desktop and the ability to edit its MCP configuration
  • The uvx runtime available on your system (the command shown uses uvx)

Step-by-step installation:

  1. Obtain a Gemini API key from Google AI Studio and keep it at hand.
  2. Install the uvx runtime following your environment's guidance. uvx ships with the uv tool, so installing uv (for example, via your preferred package manager or the official uv installer described in the uv documentation) provides the uvx command.
  3. Create or edit Claude's MCP config file (for example, the path shown in the README is ~/Library/Application Support/Claude/claude_desktop_config.json). Add the gemini entry as shown in the example:
{
  "mcpServers": {
    "gemini": {
      "command": "uvx",
      "args": ["--from", "/path/to/gemini-mcp", "gemini-mcp"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "MODEL_NAME": "gemini-2.5-flash-preview-05-20"
      }
    }
  }
}
  4. Replace /path/to/gemini-mcp with the actual path to the Gemini MCP directory on your system, and insert your API key and preferred model name.
  5. Save the config and restart Claude (or refresh the MCP configuration) to load the Gemini integration.
  6. Use the provided tools (e.g., ask_gemini, server_info) from Claude to interact with Gemini.
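Steps 3 and 4 can also be scripted instead of hand-edited. A minimal sketch in Python using only the standard library; the server path, API key, and model name are placeholders you must replace with your own values:

```python
import json
from pathlib import Path

# Path to Claude Desktop's MCP config on macOS (same path as in step 3).
config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

# Load the existing config, or start fresh if the file doesn't exist yet.
config = json.loads(config_path.read_text()) if config_path.exists() else {}

# Add (or overwrite) the "gemini" entry; replace the placeholder values below.
config.setdefault("mcpServers", {})["gemini"] = {
    "command": "uvx",
    "args": ["--from", "/path/to/gemini-mcp", "gemini-mcp"],
    "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "MODEL_NAME": "gemini-2.5-flash-preview-05-20",
    },
}

# Write the merged config back, creating the directory if needed.
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=2))
```

Because the script merges into mcpServers rather than replacing the whole file, any other MCP servers you have configured are preserved.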

Note: Ensure network access to Gemini services and confirm that the API key and model name are correct to avoid authentication or model-not-found errors.

Additional notes

Tips and caveats:

  • Ensure GEMINI_API_KEY is kept secret and not exposed in logs or shared configs.
  • If you encounter API rate limits, adjust request frequency or consider requesting higher quota from Google.
  • The MODEL_NAME should match a supported Gemini model in your account; update if you switch models.
  • Use server_info to verify connectivity before issuing complex prompts.
  • If you see content filtering or safety-related errors, rephrase prompts or adjust the persona/context provided to Gemini.
  • Keep the Gemini MCP directory path up to date in the claude_desktop_config.json when migrating or upgrading the MCP server.
  • If the uvx runtime changes, update the command and arguments accordingly in the config.
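A quick preflight check for the two required environment variables can catch misconfiguration before Claude ever launches the server. A minimal sketch; the variable names come from the install command above, and check_gemini_env is an illustrative helper, not part of the server:

```python
import os

def check_gemini_env(env=os.environ):
    """Return a list of problems with the Gemini MCP environment; empty if OK."""
    problems = []
    if not env.get("GEMINI_API_KEY"):
        problems.append("GEMINI_API_KEY is missing or empty")
    if not env.get("MODEL_NAME"):
        problems.append("MODEL_NAME is missing or empty")
    return problems

if __name__ == "__main__":
    for problem in check_gemini_env():
        print("config error:", problem)
```

Note that the variables must be present in the environment Claude passes to the server (the env block in the config), not merely in your interactive shell.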
