# gemini-mcp-client

An MCP (Model Context Protocol) client that uses Google Gemini AI models for intelligent tool usage and conversation handling. Currently tested and working with Claude Desktop as an MCP server. Based on untested AI-generated code by a non-coder; use at your own risk.
To register the server with Claude Code, pass `--env` before the `--` separator and the launch command after it:

```shell
claude mcp add --transport stdio angrysky56-gemini-mcp-client \
  --env GEMINI_API_KEY="your_gemini_api_key_here" \
  -- uv --directory /home/ty/Repositories/ai_workspace/gemini-mcp-client \
  run python servers/gemini_mcp_server.py
```
## How to use
This MCP server is a Gemini-based client that acts as an intermediary between Gemini AI models and MCP tooling. It supports multiple Gemini models and can discover and execute tools provided by Gemini packages (such as gemini-tool-agent, google-generativeai, or google-genai).

You can manage server configurations, switch models at runtime, and interact with configured servers via the MCP CLI (mcp-gemini-client). The client can start a chat session against a configured server, query available tools, and call tools directly. It also supports exporting Claude Desktop configurations and programmatic use via a Python MCPClient.

To begin, configure a Gemini MCP server in your environment, make sure you have a Gemini API key, and then start a chat session or query tool capabilities using the provided commands. The client supports changing models on the fly during a session and listing available models to guide tool usage.
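As an illustrative sketch of the tool-dispatch idea described above (the names `TOOL_REGISTRY`, `dispatch`, and `get_weather` are hypothetical, not this project's actual API), a client might map a model's function-call request onto a registered tool like this:

```python
# Hypothetical sketch: a Gemini function-call response names a tool and supplies
# arguments; the client looks the tool up in a registry and executes it.
# All names here are illustrative only.

def get_weather(city: str) -> str:
    """A toy tool the model can request."""
    return f"Sunny in {city}"

TOOL_REGISTRY = {"get_weather": get_weather}

def dispatch(tool_name: str, arguments: dict) -> str:
    """Route a model-requested tool call to the registered implementation."""
    func = TOOL_REGISTRY.get(tool_name)
    if func is None:
        raise KeyError(f"unknown tool: {tool_name}")
    return func(**arguments)
```

A real client would build `TOOL_REGISTRY` from the tools discovered on the connected MCP server rather than hard-coding it.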
## How to install

Prerequisites:

- Python 3.8+ and a virtual environment tool (venv or uv) installed on your system.
- A Gemini API key if you plan to use Gemini models.
- Basic familiarity with the uv (Python dependency management) workflow as shown in the repository.
Installation steps:

1. Install uv (if not already installed) and create a virtual environment:

   ```shell
   uv venv --python 3.12 --seed
   source .venv/bin/activate
   ```

2. Install the Gemini packages you need (for example gemini-tool-agent, google-generativeai, or google-genai):

   ```shell
   uv add gemini-tool-agent
   uv add google-generativeai
   # or
   uv add google-genai
   ```

3. Install optional development dependencies if you plan to contribute:

   ```shell
   uv add --dev ".[dev]"
   uv run pre-commit install
   ```

4. Prepare configuration files for Claude Desktop or your MCP client as described in the README, including environment variables like GEMINI_API_KEY.

5. Run or connect to your MCP server using the provided mcp-gemini-client tooling or your configured uv-based commands.
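As a sketch of the configuration step, a Claude Desktop `mcpServers` entry matching the launch command above might look like this (the `--directory` path must match your checkout, and the API key value is a placeholder):

```json
{
  "mcpServers": {
    "gemini-mcp-client": {
      "command": "uv",
      "args": [
        "--directory", "/home/ty/Repositories/ai_workspace/gemini-mcp-client",
        "run", "python", "servers/gemini_mcp_server.py"
      ],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key_here"
      }
    }
  }
}
```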
## Additional notes
Tips and common issues:
- Ensure your GEMINI_API_KEY is set in the environment where the MCP server runs; the example config uses a placeholder value.
- If you modify paths in the config, keep them accurate to your environment (the provided example uses a specific home directory path).
- When switching models at runtime, verify that the current server supports dynamic model switching and update the client state accordingly.
- Exporting and importing Claude Desktop configurations can enable easy transfers between environments; check compatibility of exported configs with your Claude Desktop version.
- If you encounter dependency or package conflicts, start with a clean virtual environment and reinstall the required Gemini packages.
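To catch a missing or placeholder API key early, a small startup check along these lines can fail fast (a sketch only; `require_api_key` is not part of this project):

```python
import os

def require_api_key(env_var: str = "GEMINI_API_KEY") -> str:
    """Return the Gemini API key from the environment, or fail with a clear error.

    Treats an empty value or the README's placeholder string as unset.
    """
    key = os.environ.get(env_var, "").strip()
    if not key or key == "your_gemini_api_key_here":
        raise RuntimeError(
            f"{env_var} is unset or still the placeholder; "
            "export a real key before starting the MCP server"
        )
    return key
```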