gemini-context
MCP server for Cursor that leverages Gemini's much larger context window to enhance the capabilities of AI tools
claude mcp add --transport stdio ogoldberg-gemini-context-mcp-server node dist/mcp-server.js \
  --env GEMINI_MODEL="gemini-2.0-flash" \
  --env GEMINI_API_KEY="your_gemini_api_key_here"
How to use
Gemini Context MCP Server is a Node.js-based MCP implementation that leverages Gemini's extended 2M-token context window to manage and cache large contexts. It offers session-based conversations, semantic search over stored contexts, and automatic cleanup of expired sessions and caches. Clients interact with the server through the MCP tooling described in README_MCP, and it integrates with popular MCP-enabled clients such as Cursor, Claude Desktop, and VS Code. The server exposes tools for context management (generate_text, get_context, clear_context, add_context, search_context) as well as caching operations (create cache, generate with cache, list caches, update TTL, delete cache) that optimize token usage and reduce latency for large prompts and system instructions.
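For Cursor, MCP servers are typically registered in a `.cursor/mcp.json` file. A minimal sketch is below; the server name, path to `dist/mcp-server.js`, and env values are placeholders you should adapt to your checkout:

```json
{
  "mcpServers": {
    "gemini-context": {
      "command": "node",
      "args": ["dist/mcp-server.js"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key_here",
        "GEMINI_MODEL": "gemini-2.0-flash"
      }
    }
  }
}
```

With this in place, Cursor launches the server over stdio and the context and caching tools appear in its MCP tool list.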
How to install
Prerequisites:
- Node.js v18+ installed
- npm (comes with Node.js)
- Git
Installation steps:
- Clone the repository: git clone https://github.com/ogoldberg/gemini-context-mcp-server
- Change into the project directory: cd gemini-context-mcp-server
- Install dependencies: npm install
- Copy environment variables example: cp .env.example .env
- Add your Gemini API key and model to the .env file: GEMINI_API_KEY=your_api_key_here GEMINI_MODEL=gemini-2.0-flash
- Build the server to produce dist/: npm run build
- Start the server: node dist/mcp-server.js
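After copying `.env.example`, the resulting `.env` should contain at least the two variables below (the key value is a placeholder):

```
# Required
GEMINI_API_KEY=your_api_key_here
# Optional (defaults to gemini-2.0-flash if not set)
GEMINI_MODEL=gemini-2.0-flash
```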
Optional for development:
- Run in development mode with auto-reload: npm run dev
- Run tests: npm test
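Once the server is running over stdio, an MCP client talks to it with newline-delimited JSON-RPC 2.0 messages. The sketch below shows how a `tools/call` request for the generate_text tool might be framed; the argument names (`sessionId`, `prompt`) are assumptions based on the tool list above, not confirmed parameter names:

```javascript
// Sketch of the JSON-RPC 2.0 framing an MCP client sends over stdio
// to invoke the server's generate_text tool.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "generate_text",
    arguments: {
      sessionId: "my-session", // assumed parameter name
      prompt: "Summarize the design doc held in context"
    }
  }
};

// MCP's stdio transport sends one JSON object per line.
const wire = JSON.stringify(request) + "\n";
console.log(wire.trim());
```

In practice you would write `wire` to the server process's stdin and read the JSON-RPC response from its stdout, or let an MCP client library handle the transport for you.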
Additional notes
Environment variables and configuration options:
- Required: GEMINI_API_KEY (your Gemini API key)
- Optional: GEMINI_MODEL (model name, defaults to gemini-2.0-flash if not set)
- Optional server settings documented in the README: MAX_SESSIONS, SESSION_TIMEOUT_MINUTES, MAX_MESSAGE_LENGTH, MAX_TOKENS_PER_SESSION, DEBUG
- Persistence: if you plan to persist data across restarts, consider integrating a database for context and cache persistence in future updates.
- Caching: the MCP tooling supports creating and using caches to accelerate responses for large prompts, with commands to list caches, update TTLs, and delete caches.
- Memory: ensure your environment has sufficient memory for caching large contexts (2M-token window considerations).
- Troubleshooting: common issues include network/firewall restrictions blocking Gemini API access or missing environment variables in the .env file. Verify that GEMINI_API_KEY is valid and that the process can reach the Gemini API endpoints. When building or running in different environments, ensure the dist/mcp-server.js entry point exists and that the build step (npm run build) completes successfully.
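The cache lifecycle above (create, update TTL, delete) follows the same `tools/call` pattern. The sketch below frames two such requests; the tool names (`create_cache`, `update_cache_ttl`) and argument names (`ttlSeconds`, `cacheId`) are assumptions, so check the server's actual tool listing for the real identifiers:

```javascript
// Hypothetical tools/call payloads for the cache lifecycle.
const createCache = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "create_cache", // assumed tool name
    arguments: {
      systemInstruction: "You are a code reviewer for a large monorepo.",
      ttlSeconds: 3600 // assumed TTL parameter
    }
  }
};

const updateTtl = {
  jsonrpc: "2.0",
  id: 3,
  method: "tools/call",
  params: {
    name: "update_cache_ttl", // assumed tool name
    arguments: { cacheId: "cache-123", ttlSeconds: 7200 }
  }
};

// Newline-delimited JSON, as with any MCP stdio request.
const wire = [createCache, updateTtl].map(m => JSON.stringify(m)).join("\n");
console.log(wire);
```

Caching a large system instruction once and reusing it across requests is what saves tokens and latency; the TTL controls how long the server keeps that cached content alive.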
Related MCP Servers
iterm
A Model Context Protocol server that executes commands in the current iTerm session - useful for REPL and CLI assistance
mcp
Octopus Deploy Official MCP Server
furi
CLI & API for MCP management
editor
MCP Server for Phaser Editor
DoorDash
MCP server from JordanDalton/DoorDash-MCP-Server
mcp
MCP server for automatically creating and deploying applications in Timeweb Cloud