nanobanana
MCP server from zhongweili/nanobanana-mcp-server
claude mcp add --transport stdio zhongweili-nanobanana-mcp-server \
  --env GEMINI_API_KEY="your-gemini-api-key-here" \
  -- uvx nanobanana-mcp-server@latest
How to use
Nano Banana MCP Server provides AI-powered image generation using Google's Gemini models with automatic model selection. By default it routes prompts to NB2 (Gemini 3.1 Flash Image) for fast, high-quality results, and can escalate to Gemini Pro for more complex scenes. The server exposes MCP tools such as generate_image, edit_image, and upload_file, letting you generate art, refine outputs, and manage assets directly from your MCP client or IDE integration. You can integrate the server into Claude, Cursor, OpenAI Codex, or other MCP-enabled workflows to produce production-ready visuals with grounded knowledge and consistent subjects.
To use the tools, start the server using your preferred MCP client (e.g., stdio via uvx, or Python-based execution). Then call generate_image with your prompt and optional templates, edit_image to modify existing outputs, or upload_file to attach supporting assets. The system handles model selection, prompt templating, and file management behind a single MCP endpoint, simplifying complex image generation pipelines across your projects.
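For a concrete picture of that flow, here is a minimal sketch using the official MCP Python SDK to launch the server over stdio and call generate_image. It assumes uvx is on your PATH and GEMINI_API_KEY is exported; the generate_image tool name comes from the description above, but the argument schema (a prompt field) is an assumption, so check the list_tools output for the real schema.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch parameters mirror the config shown under "How to install" below.
server = StdioServerParameters(
    command="uvx",
    args=["nanobanana-mcp-server@latest"],
    env={"GEMINI_API_KEY": os.environ["GEMINI_API_KEY"]},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the advertised tools and their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Hypothetical call: the "prompt" argument name is an assumption.
            result = await session.call_tool(
                "generate_image",
                arguments={"prompt": "a watercolor banana on a desk"},
            )
            print(result.content)

asyncio.run(main())

edit_image and upload_file follow the same call_tool pattern, with arguments matching whatever schema list_tools reports.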
How to install
Prerequisites
- A Gemini API Key (required for NB2/Gemini access): obtain at https://makersuite.google.com/app/apikey
- Python 3.11+ (for development or local tooling)
- Node.js/npm: not required; this server runs via uvx/uv, and no Node-based setup is documented
Installation
Option 1: Install via MCP Registry (recommended)
- Ensure you have an MCP client installed.
- Deploy nanobanana-mcp-server from the MCP registry using your client.
Option 2: Using uvx (recommended for this server)
uvx nanobanana-mcp-server@latest
Option 3: Using pip (if you prefer Python packaging)
pip install nanobanana-mcp-server
Configuration (example)
You can configure the MCP server in your Claude/VS Code/Cursor/OpenWebUI environments. Here is a representative configuration that runs the server with uvx and an API key:
{
  "mcpServers": {
    "nanobanana": {
      "command": "uvx",
      "args": ["nanobanana-mcp-server@latest"],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key-here"
      }
    }
  }
}
Prerequisites recap
- A Gemini API key and the authentication method you plan to use (API key or Vertex AI ADC)
- Access to a Python 3.11+ environment if running development tooling or local server
- A compatible MCP client to start and manage the server
Additional notes
Tips and common issues:
- Set GEMINI_API_KEY to your Gemini API key for NB2 and Pro usage. If you are using Vertex AI instead, set NANOBANANA_AUTH_METHOD=vertex_ai and provide GCP_PROJECT_ID and GCP_REGION as needed (see the sketch after this list).
- When upgrading, the nanobanana-mcp-server@latest tag will pull the newest features, including model selection improvements and fixes.
- If you encounter configuration issues, ensure your environment variables are exported in the shell or provided in your MCP config JSON exactly as shown.
- For Claude Code integrations, keep your config consistent across tools; a single TOML/JSON syntax error in a shared config file can break all integrations.
- NB2 is the default model for fast, production-ready outputs up to 4K resolution; Pro offers maximum reasoning depth for complex compositions.
- You can control output dimensions via the prompt templates or the model’s configuration to enforce aspect ratios such as 1:1, 16:9, 9:16, 21:9, etc.
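As a sketch of the Vertex AI variant mentioned in the tips above, only the env block of the launch parameters changes; the variable names come from this section, while the project and region values are illustrative placeholders:

from mcp import StdioServerParameters

# Vertex AI ADC variant of the launch parameters from the earlier sketch.
# NANOBANANA_AUTH_METHOD, GCP_PROJECT_ID, and GCP_REGION are the variables
# named in the tips above; the values below are placeholders only.
vertex_server = StdioServerParameters(
    command="uvx",
    args=["nanobanana-mcp-server@latest"],
    env={
        "NANOBANANA_AUTH_METHOD": "vertex_ai",
        "GCP_PROJECT_ID": "my-gcp-project",
        "GCP_REGION": "us-central1",
    },
)

The same env block works verbatim inside the "env" object of the JSON configuration shown earlier.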
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP