modelscope
ModelScope's official MCP Server (in active development).
claude mcp add modelscope-mcp-server \
  --transport stdio \
  --env MODELSCOPE_API_TOKEN="your-api-token" \
  -- uvx modelscope-mcp-server
How to use
ModelScope MCP Server exposes ModelScope's rich ecosystem as MCP-compatible tools within an MCP client. It provides:
- AI image generation (text-to-image and image-to-image)
- Resource discovery with filtering (models, datasets, studios/apps, papers, and other MCP servers)
- Retrieval of detailed resource information
- Contextual information about the current user and environment
Forthcoming features include documentation search and Gradio API integration.
To use it, configure your MCP client to point at the server and supply your ModelScope API token. The server can run locally via uvx, or you can deploy it with Docker. Transport options include stdio, HTTP, and HTTP/SSE, depending on your client and deployment choice. You can also inspect and experiment with the server's tools and resources interactively via the MCP Inspector.
How to install
Prerequisites:
- Python 3.8 or newer, plus pip
- Optional: Docker if you prefer containerized deployment
Install from PyPI (Python):
pip install modelscope-mcp-server
Run locally (example using uvx):
uvx modelscope-mcp-server
If you prefer Docker:
docker run --rm -i -e MODELSCOPE_API_TOKEN=your-api-token ghcr.io/modelscope/modelscope-mcp-server
Environment variables:
- MODELSCOPE_API_TOKEN: Your ModelScope API token (required for API access)
Examples of how to configure the MCP client (JSON):
- Local, stdio transport (uvx):
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "uvx",
      "args": ["modelscope-mcp-server"],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
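If you manage client configurations programmatically rather than hand-editing JSON, the stdio/uvx example above can be generated with a small script. A minimal sketch; `build_mcp_config` is a hypothetical helper, not part of the modelscope-mcp-server package:

```python
import json

def build_mcp_config(token: str) -> dict:
    """Build the stdio/uvx client configuration shown above.

    The server name, command, and args mirror the JSON example;
    the token value is a placeholder supplied by the caller.
    """
    return {
        "mcpServers": {
            "modelscope-mcp-server": {
                "command": "uvx",
                "args": ["modelscope-mcp-server"],
                "env": {"MODELSCOPE_API_TOKEN": token},
            }
        }
    }

if __name__ == "__main__":
    # Emit the config, ready to paste into the MCP client's settings file.
    print(json.dumps(build_mcp_config("your-api-token"), indent=2))
```

Writing the config this way keeps the token out of version control: pass it in from the environment at generation time instead of committing it.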
- Docker deployment (example):
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "MODELSCOPE_API_TOKEN",
        "ghcr.io/modelscope/modelscope-mcp-server"
      ],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
Additional notes
Tips and caveats:
- Ensure MODELSCOPE_API_TOKEN is set in your environment when running the server; without it, API calls to ModelScope may fail.
- The server supports multiple transports; choose stdio for local testing, or HTTP/SSE for web integrations, depending on your MCP client.
- When running via Docker, you can map ports if needed and pass the API token as an environment variable.
- If you encounter token or authentication issues, regenerate tokens from ModelScope and update your configuration.
- Refer to MCP JSON Configuration Standard for compatibility with other MCP clients and tooling.
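Since a missing token only surfaces at the first API call, it can help to fail fast at startup. A minimal sketch of such a pre-flight check; the `launch` wrapper is hypothetical, not part of the package:

```python
import os
import subprocess
import sys

def has_api_token(env) -> bool:
    """Return True if a non-empty MODELSCOPE_API_TOKEN is present."""
    return bool(env.get("MODELSCOPE_API_TOKEN", "").strip())

def launch():
    # Fail fast before starting the server, rather than at the first
    # ModelScope API call made by a tool.
    if not has_api_token(os.environ):
        sys.exit("MODELSCOPE_API_TOKEN is not set; ModelScope API calls would fail.")
    subprocess.run(["uvx", "modelscope-mcp-server"], check=True)
```

The same check works before a Docker launch; just swap the command list for the `docker run` invocation shown earlier.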
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
AgentChat
AgentChat is an LLM-based agent communication platform with a built-in default Agent and support for user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and assist with complex tasks. The project integrates LangChain, Function Call, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, with a high-performance backend built on FastAPI.
python -client
An MCP server for querying technical documentation of mainstream agent frameworks (supports both stdio and sse transports), covering langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.
ultimate_mcp_server
Comprehensive MCP server exposing dozens of capabilities to AI agents: multi-provider LLM delegation, browser automation, document processing, vector ops, and cognitive memory systems
MCP-PostgreSQL-Ops
🔍Professional MCP server for PostgreSQL operations & monitoring: 30+ extension-independent tools for performance analysis, table bloat detection, autovacuum monitoring, schema introspection, and database management. Supports PostgreSQL 12-17.
semantic-scholar-fastmcp
A FastMCP server implementation for the Semantic Scholar API, providing comprehensive access to academic paper data, author information, and citation networks.