
aistudio

Google AI Studio MCP Server - Powerful Gemini API integration for Model Context Protocol with multi-modal file processing, PDF-to-Markdown conversion, image analysis, and audio transcription capabilities. Supports all Gemini 2.5 models with comprehensive file format support.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio eternnoir-aistudio-mcp-server \
  --env GEMINI_MODEL="gemini-2.5-flash" \
  --env GEMINI_API_KEY="your_api_key_here" \
  --env GEMINI_TIMEOUT="600000" \
  --env GEMINI_MAX_FILES="10" \
  --env GEMINI_TEMPERATURE="0.2" \
  --env GEMINI_MAX_OUTPUT_TOKENS="16384" \
  --env GEMINI_MAX_TOTAL_FILE_SIZE="50" \
  -- npx -y aistudio-mcp-server

How to use

AI Studio MCP Server exposes a content-generation toolset backed by the Google AI Studio / Gemini API. It supports content generation with optional file inputs, conversation history, and system prompts to guide the model. The primary tool is generate_content, which accepts a user_prompt, an optional system_prompt, and an optional files array whose entries supply either a path or base64 content. The server handles a range of file types (images, PDFs, office documents, plain text) and combines those inputs with a configurable Gemini model, temperature, and output-token limit. To use it, set GEMINI_API_KEY as an environment variable, choose a different Gemini model via GEMINI_MODEL if you need one, and invoke the tool through your MCP client or any frontend that talks to the MCP server. The configuration shown above runs via npx and ships sensible defaults that you can override at runtime.
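As an illustration, the arguments of a generate_content call might look like the sketch below. Only user_prompt, system_prompt, and the files array with path or base64 content are described above; the file paths and prompt text here are made-up placeholders:

```json
{
  "name": "generate_content",
  "arguments": {
    "user_prompt": "Convert this PDF to Markdown and summarize the key findings.",
    "system_prompt": "You are a precise technical summarizer.",
    "files": [
      { "path": "/path/to/report.pdf" }
    ]
  }
}
```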

How to install

Prerequisites:

  • Node.js 20.0.0 or higher
  • Access to Google Gemini / AI Studio API with an API key (GEMINI_API_KEY)

Install and run using npx (recommended):

# Ensure you have your API key ready
export GEMINI_API_KEY=your_api_key_here

# Run the MCP server via npx
npx -y aistudio-mcp-server

Alternatively, install globally with npm and run locally:

# Install globally
npm install -g aistudio-mcp-server

# Start the server (you may set the API key in the environment as well)
GEMINI_API_KEY=your_api_key_here aistudio-mcp-server

Optional: you can set additional environment variables as described in the configuration section to tailor model, timeout, token limits, and file handling.
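For MCP clients that read a JSON configuration file instead of the Claude Code CLI, an equivalent entry might look like this sketch (the server key name "aistudio" is arbitrary; the env values mirror the defaults from the installation command above):

```json
{
  "mcpServers": {
    "aistudio": {
      "command": "npx",
      "args": ["-y", "aistudio-mcp-server"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "GEMINI_TIMEOUT": "600000",
        "GEMINI_MAX_FILES": "10",
        "GEMINI_TEMPERATURE": "0.2",
        "GEMINI_MAX_OUTPUT_TOKENS": "16384",
        "GEMINI_MAX_TOTAL_FILE_SIZE": "50"
      }
    }
  }
}
```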

Additional notes

Notes and tips:

  • Ensure GEMINI_API_KEY is kept secret and not committed to version control.
  • The defaults assume gemini-2.5-flash; adjust GEMINI_MODEL as needed for different capabilities.
  • GEMINI_MAX_OUTPUT_TOKENS, GEMINI_MAX_FILES, and GEMINI_MAX_TOTAL_FILE_SIZE help protect against oversized requests.
  • When using file inputs, be mindful of size limits per file and overall request size restrictions described in the documentation. Base64 content can be used in the files array if you provide content instead of a path.
  • If you encounter timeouts, consider increasing GEMINI_TIMEOUT and/or lowering the requested output complexity.
  • The MCP config shown here is compatible with the provided example MCP client configuration; you can adapt env variables per deployment environment.
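When supplying base64 content in the files array instead of a path, standard tools can produce the encoded string. A minimal sketch using GNU coreutils (on macOS, `base64` lacks `-w` and uses `-i file` instead):

```shell
# Create a small sample file (stand-in for a real PDF or image)
printf 'hello' > sample.txt

# Encode it as a single-line base64 string suitable for a
# base64 "content" entry in the files array (-w0 disables line wrapping)
b64=$(base64 -w0 sample.txt)
echo "$b64"   # prints aGVsbG8=
```

Keep the encoded payload within the per-file and total size limits noted above; base64 inflates data by roughly a third.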
