genai
Nano Banana MCP Server: Free Gemini/Tongyi Image Gen MCP Server for AI Workflows
claude mcp add --transport stdio adamydwang-genai-mcp docker run -i adamydwang/genai-mcp
How to use
This MCP server provides a streamable HTTP endpoint for GenAI-powered image generation and editing, backed by multiple providers (Google Gemini, Tongyi Wanxiang, and APIMart). Once the server is running, clients can send MCP protocol requests to generate or edit images through the underlying GenAI backends. The server exposes tools for the Gemini, Wanxiang, and APIMart integrations, supporting synchronous or asynchronous image-generation workflows depending on the provider and model you choose. If you configure the image output as a URL, the server uploads generated images to your OSS/S3-compatible storage and returns accessible links; otherwise it returns base64 data URIs.
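With the default base64 output mode, a client receives each generated image as a data URI. Below is a minimal Python sketch of decoding such a result into raw bytes; the sample URI is constructed locally for illustration and is not the server's exact response schema.

```python
import base64

def decode_data_uri(data_uri: str) -> bytes:
    """Split a data URI (e.g. data:image/png;base64,...) and decode its payload."""
    header, _, payload = data_uri.partition(",")
    if not header.startswith("data:") or ";base64" not in header:
        raise ValueError("not a base64 data URI")
    return base64.b64decode(payload)

# Illustrative example: wrap the PNG magic bytes as a data URI, then decode it back.
sample = "data:image/png;base64," + base64.b64encode(b"\x89PNG\r\n\x1a\n").decode()
raw = decode_data_uri(sample)
assert raw.startswith(b"\x89PNG")
```

URL output mode skips this step entirely: the server uploads the image to your OSS/S3 bucket and the client just receives a link.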
How to install
Prerequisites:
- Docker or a compatible container runtime (recommended)
- A GenAI provider API key and access to the chosen backend (Gemini, Wanxiang, or APIMart)
- Optional: OSS/S3 credentials if you want URL outputs and remote storage

1) Prepare configuration
- Create a configuration file or set environment variables as described in the README (GenAI provider, base URL, model names, timeouts, image format, server address/port, OSS settings if needed).

2) Pull and run the MCP server (Docker)
- This example uses a Docker image name placeholder. If you have a local build, replace it with your image/tag.

```bash
docker pull adamydwang/genai-mcp || true
# Run the MCP server container (adjust environment as needed)
docker run -d --name genai-mcp \
  -p 8080:8080 \
  -e GENAI_PROVIDER=gemini \
  -e GENAI_BASE_URL=https://generativelanguage.googleapis.com \
  -e GENAI_API_KEY=your_api_key_here \
  -e GENAI_GEN_MODEL_NAME=gemini-3-pro-image-preview \
  -e GENAI_EDIT_MODEL_NAME=gemini-3-pro-image-preview \
  -e GENAI_TIMEOUT_SECONDS=120 \
  -e GENAI_IMAGE_FORMAT=base64 \
  -e SERVER_ADDRESS=0.0.0.0 \
  -e SERVER_PORT=8080 \
  adamydwang/genai-mcp
```

3) Alternatively, build from source and run locally (Go-based server)
- Install Go 1.21+ and build:

```bash
go version   # ensure 1.21+ is installed
go env -w GO111MODULE=on
go build .
./genai-mcp
```

4) Verify the MCP endpoint
- Open http://127.0.0.1:8080/mcp in a compatible MCP client and start sending requests.
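Step 4 can also be scripted. The sketch below builds an MCP `initialize` request (a JSON-RPC 2.0 envelope, as used by the streamable HTTP transport) and posts it to the endpoint. The `protocolVersion` string and capability fields here are illustrative; the exact values your server accepts may differ.

```python
import json
import urllib.request

MCP_URL = "http://127.0.0.1:8080/mcp"

def build_initialize_request(request_id: int = 1) -> dict:
    """JSON-RPC 2.0 envelope for the MCP initialize handshake."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.1"},
        },
    }

def check_endpoint(url: str = MCP_URL) -> int:
    """POST the initialize request and return the HTTP status code."""
    body = json.dumps(build_initialize_request()).encode()
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # Streamable HTTP servers typically expect both accept types.
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status
```

After starting the container, call `check_endpoint()`; an HTTP 200 indicates the server is up and speaking MCP.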
Additional notes
Tips and common issues:
- Ensure your GENAI_BASE_URL and API keys are correctly set for the chosen provider; some providers may require additional authentication headers.
- If you configure GENAI_IMAGE_FORMAT=url, make sure OSS_ENDPOINT and OSS_BUCKET are correctly set and that your bucket policy permits public read access as needed.
- For Wanxiang/APIMart, the toolset operates asynchronously; you may need to poll for task completion using the provided query endpoints.
- If you encounter CORS or streaming issues, verify that your MCP client supports the streamable HTTP transport described in the server docs.
- When using OSS/S3 for image storage, monitor permissions and bucket lifecycle policies to avoid unauthorized access or unexpected deletions.
Related MCP Servers
ai-guide
Programmer Yupi's comprehensive AI resource collection plus a beginner-friendly Vibe Coding tutorial: a large-model selection guide (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt collection, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool guides (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, helping you quickly master AI technology and stay ahead of the curve. This project is the open-source documentation edition and has been upgraded into Yupi's AI navigation website.
unity
Unity MCP acts as a bridge, allowing AI assistants (like Claude, Cursor) to interact directly with your Unity Editor via a local MCP (Model Context Protocol) Client. Give your LLM tools to manage assets, control scenes, edit scripts, and automate tasks within Unity.
dbhub
Zero-dependency, token-efficient database MCP server for Postgres, MySQL, SQL Server, MariaDB, SQLite.
coplay-unity-plugin
Unity plugin for Coplay
CodeMCP
Code intelligence for AI assistants - MCP server, CLI, and HTTP API with symbol navigation, impact analysis, and architecture mapping
api
MCP server from hostinger/api-mcp-server