z-image-studio
A CLI, a web UI, and an MCP server for the Z-Image-Turbo text-to-image generation model (the Tongyi-MAI/Z-Image-Turbo base model as well as quantized variants)
claude mcp add --transport stdio iconben-z-image-studio uvx git+https://github.com/iconben/z-image-studio.git \
  --env ZIMAGE_CONFIG="path/to/config.json (optional)" \
  --env ZIMAGE_OUTPUT_DIR="path/to/output (optional)" \
  --env ZIMAGE_ENABLE_TORCH_COMPILE="1 to enable torch.compile (experimental, requires compatible setup)"
How to use
Z-Image Studio provides an MCP server that exposes image generation tools, model listing, and generation history over the Model Context Protocol. Once installed, you can run the MCP server to let local or remote agents request image generations, query available models, and fetch generation history. The server supports multiple transport modes (stdio, SSE, and Streamable HTTP) for compatibility and performance: Streamable HTTP at /mcp is recommended for production use, with SSE at /mcp-sse as a fallback.
Using the MCP features, agents can request image generations with associated parameters, list supported models for the current hardware, and retrieve historical results to track outputs. The CLI and Web UI provided by Z-Image Studio remain available for interactive use, but the MCP server enables integration with automation pipelines and AI agents by exposing a standardized content format across transports.
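MCP messages are JSON-RPC 2.0 objects regardless of transport. A minimal sketch of the request envelope an agent would send to the server — note that the tool name `generate_image` and its arguments below are assumptions for illustration, not the server's documented schema; query `tools/list` for the real tool names:

```python
import json

def jsonrpc_request(method: str, params: dict, req_id: int) -> str:
    """Build a JSON-RPC 2.0 request body as sent over MCP transports."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Discover the tools the server exposes (generation, model listing, history).
list_req = jsonrpc_request("tools/list", {}, 1)

# Call an image-generation tool; the tool name and arguments here are
# illustrative placeholders — use the schema returned by tools/list.
call_req = jsonrpc_request("tools/call", {
    "name": "generate_image",
    "arguments": {"prompt": "a watercolor fox", "steps": 8},
}, 2)

print(call_req)
```

Over Streamable HTTP, each such body is POSTed to the /mcp endpoint; over stdio, it is written as a line to the server process.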
How to install
Prerequisites:
- Python 3.11+ and the uv tool installed
- Optional GPU drivers supporting CUDA/ROCm/MPS as per your platform
Installation steps:
- Install uv if you don't have it:
  - pipx install uv
  - or use the standalone installer: curl -LsSf https://astral.sh/uv/install.sh | sh
- Install the Z-Image Studio MCP server via uv tool install:
uv tool install git+https://github.com/iconben/z-image-studio.git
or clone the repository locally and install from source
- Start the MCP server (stdio + SSE + Streamable HTTP):
- zimg mcp
- or, if you installed from a local clone, run the equivalent entrypoint, e.g. zimg-mcp
- Verify the server is running by hitting the MCP endpoint:
- Streamable HTTP: http://localhost:8000/mcp
- SSE: http://localhost:8000/mcp-sse
Notes:
- The MCP server exposes endpoints for image generation, model listing, and history via a standardized transport protocol.
- To customize behavior (output paths, torch.compile, etc.), set the environment variables or point ZIMAGE_CONFIG at a config JSON, as described in the environment variables section.
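If ZIMAGE_CONFIG points at a JSON file, it might look like the following. The keys shown are illustrative assumptions — the source doesn't document the schema, so check the repository's configuration docs for the real field names:

```json
{
  "output_dir": "output",
  "enable_torch_compile": false
}
```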
Additional notes
Tips and common issues:
- Ensure Python 3.11+ is used; Python 3.12+ may require experimental flags for torch.compile.
- If the server cannot detect a suitable GPU, it will fall back to CPU; check your PyTorch installation and device visibility.
- For AMD ROCm setups, verify ROCm drivers are installed and PyTorch is built with ROCm support; environment hints like HSA_OVERRIDE_GFX_VERSION may be needed for some GPUs.
- The MCP server is transport-agnostic; prefer Streamable HTTP (/mcp) for best performance, falling back to SSE (/mcp-sse).
- You can control output directories and configuration via ZIMAGE_OUTPUT_DIR and ZIMAGE_CONFIG env vars.
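A sketch of setting these variables before launching the server — the paths are placeholders, not defaults:

```shell
# Placeholder paths — adjust for your setup; zimg reads these at startup.
export ZIMAGE_OUTPUT_DIR="$HOME/zimage-output"
export ZIMAGE_CONFIG="$HOME/.config/zimage/config.json"
echo "output dir: $ZIMAGE_OUTPUT_DIR"
```

Run `zimg mcp` in the same shell so the server picks these values up.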