trellis_mcp
Model Context Protocol (MCP) server for TRELLIS (SOTA text-to-3D / image-to-3D) models
claude mcp add --transport stdio fishwowater-trellis_mcp uvx trellis-mcp
How to use
Trellis MCP Server provides a lightweight bridge between MCP-capable assistants and the TRELLIS text-to-3D / image-to-3D pipeline. The server runs via uvx (Python/uv) and exposes an MCP endpoint named trellis-blender. To use it, make sure TRELLIS and the Trellis Blender addon are installed and configured, then start the Trellis MCP server from within Blender as described below. Once running, you can send prompts to the MCP endpoint to generate a 3D asset from natural-language text, then import or manipulate the resulting mesh inside Blender. The server acts as an interface layer, translating MCP requests into Trellis API calls and returning the resulting asset data to your assistant or client.
In practice, you will typically:
- Install and run the Trellis API backend locally (or on a server) following the Trellis project guidance.
- Install the Trellis Blender addon and start the MCP server from the addon's TRELLIS panel.
- Configure Windsurf/Cursor/Claude to point to the trellis-blender MCP server (using the uvx command as shown in the config example).
- Send natural language prompts to generate a 3D asset, which Trellis will interpret and return as a mesh that Blender can import or update via the addon.
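The prompt-to-asset flow above can be sketched as an MCP tool call. MCP frames requests as JSON-RPC 2.0 messages; note that the tool name and argument names below are illustrative assumptions, not the actual trellis-mcp schema:

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message.

    MCP uses JSON-RPC 2.0 framing; the tool name and argument
    names passed in are illustrative, not the trellis-mcp schema.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical text-to-3D request routed to the trellis-blender server.
request = build_tool_call("generate_3d_asset", {"prompt": "a low-poly wooden chair"})
```

In practice your MCP client (Windsurf/Cursor/Claude) builds and sends these messages for you; the sketch only shows what travels over the stdio transport.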
How to install
Prerequisites:
- Python 3.10+
- Blender (with Trellis Blender addon installed)
- Trellis API backend (local or remote) configured per Trellis repository instructions
- A supported MCP client/wrapper (e.g., Windsurf/Cursor/Claude) to connect to the MCP server
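As a quick sanity check, the prerequisites above can be probed from Python. The executable names "blender" and "uvx" being on PATH are assumptions about a typical setup:

```python
import shutil
import sys

def check_prerequisites() -> dict:
    """Return a coarse pass/fail map for the listed prerequisites.

    Assumes 'blender' and 'uvx' are on PATH under those names;
    adjust the lookups if your installation differs.
    """
    return {
        "python_3_10_plus": sys.version_info >= (3, 10),
        "blender_on_path": shutil.which("blender") is not None,
        "uvx_on_path": shutil.which("uvx") is not None,
    }
```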
Installation steps:
- Prepare the Trellis backend (API) locally:
- Clone the Trellis repository and install dependencies as per its README.
- Edit trellis_api/config.py to point to your backend URL if needed.
  - Start the AI worker and the web server:
    python trellis_api/ai_worker.py --text-workers-per-gpu 1 --image-workers-per-gpu 0
    python trellis_api/web_server.py
- (Windows) alternatively, use: python trellis_api/web_server_single.py
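Once the web server is up, it is worth verifying that it responds before wiring up the MCP side. The URL you pass should be whatever host/port you configured in trellis_api/config.py; this is a generic connectivity sketch, not part of trellis-mcp:

```python
import urllib.error
import urllib.request

def backend_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if the Trellis web server answers an HTTP request.

    Any HTTP status counts as 'reachable', since we only care about
    connectivity here, not about a specific endpoint responding 200.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        return True  # server answered, just with an error status
    except (urllib.error.URLError, OSError):
        return False
```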
- Install and configure the Trellis Blender addon:
  - In Blender, install the Trellis Blender addon from GitHub.
  - Open the TRELLIS panel and use the MCP Server option to confirm the server is reachable.
- Configure the MCP server in Windsurf/Cursor/Claude:
  - Use the following mcpServers entry (as shown in the README):
    {
      "mcpServers": {
        "trellis-blender": {
          "command": "uvx",
          "args": ["trellis-mcp"]
        }
      }
    }
- Run the MCP server via uvx (as configured):
  - Ensure uvx (part of the uv toolchain) is installed and on your PATH.
  - Start Windsurf/Cursor/Claude and verify that the trellis-blender endpoint is reachable.
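A quick way to catch typos in the mcpServers entry before restarting your client is to validate it with Python's json module. The expected shape mirrors the config snippet from step 3:

```python
import json

# Same shape as the mcpServers entry shown in the configuration step.
CONFIG = """
{
  "mcpServers": {
    "trellis-blender": {
      "command": "uvx",
      "args": ["trellis-mcp"]
    }
  }
}
"""

def validate_mcp_config(text: str) -> dict:
    """Parse an mcpServers config and check the trellis-blender entry."""
    cfg = json.loads(text)  # raises ValueError on malformed JSON
    server = cfg["mcpServers"]["trellis-blender"]
    if server.get("command") != "uvx":
        raise ValueError("expected 'uvx' as the command")
    return server
```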
Additional notes
Tips and common issues:
- Ensure the Trellis API backend is reachable at the URL configured in trellis_api/config.py; network issues will cause MCP requests to fail.
- The Trellis Blender addon must be installed and enabled in Blender; otherwise, generated assets cannot be imported.
- If you encounter memory issues or long generation times, adjust Trellis worker settings or GPU allocation as described in the Trellis docs.
- The MCP server name trellis-blender is the identifier used by Windsurf/Cursor/Claude to route requests; keep it consistent with your mcp_config.
- If you update the Trellis API backend URL, restart both Trellis services and the MCP server to ensure changes take effect.
- Logging: monitor console output from uvx and Blender addon to diagnose API connection or import issues.
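For transient connection failures between the MCP server and the Trellis backend (for example, while the workers are still starting), a small retry wrapper can help. This is a generic sketch, not part of trellis-mcp itself:

```python
import time

def with_retries(fn, attempts: int = 3, delay: float = 1.0):
    """Call fn(), retrying on OSError (e.g. connection refused).

    Generic helper, not part of trellis-mcp; tune attempts/delay
    to match how long your backend takes to come up.
    """
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except OSError as exc:
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```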