
modelscope-mcp-server

ModelScope's official MCP Server (in active development).

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio modelscope-mcp-server \
  --env MODELSCOPE_API_TOKEN="your-api-token" \
  -- uvx modelscope-mcp-server

How to use

ModelScope MCP Server exposes ModelScope's rich ecosystem as MCP-compatible tools within an MCP client. Its capabilities include:

  • AI image generation (text-to-image and image-to-image)
  • Resource discovery with filtering (models, datasets, studios/apps, papers, and other MCP servers)
  • Retrieving detailed information about a given resource
  • Contextual information about the current user and environment

Forthcoming features include documentation search and Gradio API integration. To use the server, configure your MCP client to point at it and supply your ModelScope API token. The server can run locally via uvx, or you can deploy it with Docker. Transport options include standard stdio, HTTP, and HTTP/SSE, depending on your client and deployment choice. You can also inspect and experiment with the server via the MCP Inspector tool for an interactive exploration of its tools and resources.
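As a concrete example of the client-side configuration mentioned above, here is a short stdlib-only Python sketch that emits the standard MCP `mcpServers` JSON for a local stdio launch via uvx. The token value is a placeholder, not a real credential:

```python
import json

# Build the standard MCP client configuration for launching the server
# locally over stdio via uvx. Replace the placeholder token with your own.
config = {
    "mcpServers": {
        "modelscope-mcp-server": {
            "command": "uvx",
            "args": ["modelscope-mcp-server"],
            "env": {"MODELSCOPE_API_TOKEN": "your-api-token"},
        }
    }
}

print(json.dumps(config, indent=2))
```

The printed JSON matches the shape most MCP clients accept in their configuration file; paste it into your client's MCP settings and substitute your real token.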

How to install

Prerequisites:

  • Python 3.8+ (recommended) and pip
  • Optional: Docker if you prefer containerized deployment

Install from PyPI (Python):

pip install modelscope-mcp-server

Run the server locally with uvx:

uvx modelscope-mcp-server

If you prefer Docker:

docker run --rm -i -e MODELSCOPE_API_TOKEN=your-api-token ghcr.io/modelscope/modelscope-mcp-server

Environment variables:

  • MODELSCOPE_API_TOKEN: Your ModelScope API token (required for API access)
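Since a missing token only surfaces later as failed API calls, it can help to check for it up front. A minimal stdlib-only sketch (the helper name `require_token` is illustrative, not part of the server's API):

```python
import os

def require_token(env=os.environ):
    """Return the ModelScope API token, or raise with a helpful message."""
    token = env.get("MODELSCOPE_API_TOKEN")
    if not token:
        raise RuntimeError(
            "MODELSCOPE_API_TOKEN is not set; export it before starting the server"
        )
    return token
```

Calling `require_token()` before launching the server turns a silent misconfiguration into an immediate, descriptive error.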

Examples of how to configure the MCP client (JSON):

  • Local, stdio transport (uvx):
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "uvx",
      "args": ["modelscope-mcp-server"],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
  • Docker deployment (example):
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "MODELSCOPE_API_TOKEN",
        "ghcr.io/modelscope/modelscope-mcp-server"
      ],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
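Before pointing a client at either configuration above, a quick sanity check can catch the common mistakes (wrong server name, placeholder token left in place). A stdlib-only sketch; `check_mcp_config` is an illustrative helper, not an official schema validator:

```python
import json

def check_mcp_config(text: str, server: str = "modelscope-mcp-server") -> list:
    """Return a list of problems found in an MCP client config JSON string."""
    problems = []
    cfg = json.loads(text)
    entry = cfg.get("mcpServers", {}).get(server)
    if entry is None:
        return ["no entry named %r under 'mcpServers'" % server]
    if not entry.get("command"):
        problems.append("missing 'command'")
    if not isinstance(entry.get("args", []), list):
        problems.append("'args' must be a list")
    token = entry.get("env", {}).get("MODELSCOPE_API_TOKEN")
    if not token or token == "your-api-token":
        problems.append("set a real MODELSCOPE_API_TOKEN in 'env'")
    return problems
```

An empty return list means the configuration has the expected shape; anything else is a human-readable problem description.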

Additional notes

Tips and caveats:

  • Ensure MODELSCOPE_API_TOKEN is set in your environment when running the server; without it, API calls to ModelScope may fail.
  • The server supports multiple transports; choose stdio for local testing, or HTTP / HTTP/SSE for web integrations, depending on your MCP client.
  • When running via Docker, you can map ports if needed and pass the API token as an environment variable.
  • If you encounter token or authentication issues, regenerate tokens from ModelScope and update your configuration.
  • Refer to MCP JSON Configuration Standard for compatibility with other MCP clients and tooling.
