
mcp-gemini-google-search

MCP server for Google Search integration using Gemini's built-in search capabilities

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio yukukotani-mcp-gemini-google-search npx -y mcp-gemini-google-search \
  --env GEMINI_MODEL="gemini-2.5-flash" \
  --env GEMINI_PROVIDER="vertex" \
  --env VERTEX_LOCATION="us-central1" \
  --env VERTEX_PROJECT_ID="your-gcp-project-id"

This example registers the server with the Vertex AI provider, so GEMINI_API_KEY is left unset; the API key is only needed when using Google AI Studio instead of Vertex AI.
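If you use Google AI Studio rather than Vertex AI, the registration needs only an API key. A sketch (the key value is a placeholder):

```shell
claude mcp add --transport stdio yukukotani-mcp-gemini-google-search npx -y mcp-gemini-google-search \
  --env GEMINI_API_KEY="your-api-key-here" \
  --env GEMINI_MODEL="gemini-2.5-flash"
```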

How to use

This MCP server exposes Gemini-powered Google Search through the Model Context Protocol (MCP). It uses Gemini's Grounding with Google Search feature to fetch real-time web results with citations, making it well suited for grounding model outputs in up-to-date information. The server supports standard MCP tooling and can be driven over stdio transport, from the CLI, or through a Claude integration, letting you add real-time search context to conversations and workflows. Tools provided include google_search, which accepts a query and returns search results with sources you can cite in your responses.
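Over the stdio transport, tool calls are JSON-RPC messages on stdin. A hypothetical sketch of a tools/call request for google_search (field names follow the MCP spec; a real session also requires the MCP initialize handshake first, and the server's exact result shape may differ):

```shell
# Hypothetical MCP "tools/call" JSON-RPC request for the google_search tool.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"google_search","arguments":{"query":"current Node.js LTS version"}}}'

# With credentials configured (and after the initialize handshake), you would
# pipe messages like this to the server over stdio:
# echo "$REQUEST" | npx -y mcp-gemini-google-search

# Sanity-check that the payload is well-formed JSON:
echo "$REQUEST" | python3 -m json.tool > /dev/null && echo "valid JSON-RPC payload"
```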

How to install

Prerequisites:

  • Node.js 18 or later
  • npm

Installation steps:

  1. Install the MCP server globally via npm (optional if you run it with npx, which fetches the package on demand): npm install -g mcp-gemini-google-search

  2. Ensure you have a Gemini API key or Vertex AI setup configured as described in the README (see environment variables in the usage section).

  3. Run the MCP server using the recommended command (as configured in mcp_config): npx mcp-gemini-google-search

  4. If you need to customize environment variables for Google AI Studio or Vertex AI, export them in your shell prior to starting the server, for example:
     export GEMINI_API_KEY="your-api-key-here"
     export GEMINI_MODEL="gemini-2.5-flash"
     export GEMINI_PROVIDER="vertex"
     export VERTEX_PROJECT_ID="your-gcp-project-id"
     export VERTEX_LOCATION="us-central1"
     Set only the variables for the provider you use: GEMINI_API_KEY applies to Google AI Studio, while GEMINI_PROVIDER and the VERTEX_* variables apply to Vertex AI.

Additional notes

  • If you use Vertex AI, set GEMINI_PROVIDER to vertex and provide VERTEX_PROJECT_ID and VERTEX_LOCATION. For Google AI Studio, set GEMINI_API_KEY and optionally GEMINI_MODEL.
  • The default Gemini model is gemini-2.5-flash; adjust GEMINI_MODEL if you need a different variant.
  • When integrating with Claude Code, register the server with claude mcp add and point it at the npx command shown in the installation section.
  • Ensure your Google API access complies with the terms of service and that your API key is kept secret. Watch for quota limits on API usage.
  • The google_search tool expects a query string and will return results with citations suitable for grounding responses.
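Since the two providers use disjoint sets of variables, a shell setup would pick one option or the other. A sketch (all values are placeholders):

```shell
# Option A: Google AI Studio -- authenticate with an API key.
export GEMINI_API_KEY="your-api-key-here"   # placeholder, not a real key
export GEMINI_MODEL="gemini-2.5-flash"      # optional; this is the default

# Option B: Vertex AI -- uses Google Cloud credentials, no API key needed.
# export GEMINI_PROVIDER="vertex"
# export VERTEX_PROJECT_ID="your-gcp-project-id"
# export VERTEX_LOCATION="us-central1"

# Then start the server (commented out here; requires valid credentials):
# npx -y mcp-gemini-google-search
```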
