blender-open-mcp
Open Models MCP for Blender Using Ollama
```bash
claude mcp add --transport stdio dhakalnirajan-blender-open-mcp \
  --env HOST="0.0.0.0" \
  --env PORT="8000" \
  --env OLLAMA_URL="http://localhost:11434" \
  --env OLLAMA_MODEL="llama3.2" \
  -- python src/blender_open_mcp/server.py
```
How to use
blender-open-mcp integrates Blender with local AI models via Ollama using the Model Context Protocol (MCP). This lets you control Blender with natural language prompts by sending structured MCP commands to a locally running Ollama-backed model. The server exposes an API (default http://0.0.0.0:8000) and is designed to work with a Blender add-on that provides a user interface for issuing prompts, plus optional PolyHaven asset integration. Tools available range from querying scene information to creating and modifying objects, applying materials, rendering images, executing Python in Blender, and managing PolyHaven assets. You can also adjust the Ollama model and URL via dedicated commands so you can iterate with different AI models locally.
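Under the hood, MCP commands are JSON-RPC 2.0 messages. As a rough illustration of the message shape the server consumes (the tool name and arguments below are hypothetical examples, not taken from this server's actual tool schema), a `tools/call` request might be built like this:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, the message shape MCP uses."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and arguments, for illustration only.
request = make_tool_call(1, "create_object", {"type": "CUBE", "location": [0, 0, 0]})
print(request)
```

In practice the MCP client (e.g., Claude) builds and sends these messages for you; this sketch only shows what travels over the stdio transport.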
How to install
Prerequisites:
- Blender 3.0 or later
- Ollama installed and running (https://ollama.com/)
- Python 3.10 or later
- uv (https://docs.astral.sh/uv/) to manage virtual environments
- Git installed to clone the repository

Installation steps:

1) Clone the repository and navigate into it:
```bash
git clone https://github.com/dhakalnirajan/blender-open-mcp.git
cd blender-open-mcp
```

2) Create and activate a virtual environment (recommended):
```bash
uv venv
source .venv/bin/activate   # Linux/macOS
.venv\Scripts\activate      # Windows
```

3) Install Python dependencies in editable mode:
```bash
uv pip install -e .
```

4) Install and enable the Blender add-on:
- Open Blender.
- Edit -> Preferences -> Add-ons.
- Click Install... and select addon.py from the blender-open-mcp directory.
- Enable the "Blender MCP" add-on.

5) Download an Ollama model if needed (e.g., llama3.2):
```bash
ollama run llama3.2
```

6) Start Ollama, then start the MCP server:
```bash
blender-mcp   # default host 0.0.0.0:8000
```
Or run directly:
```bash
python src/blender_open_mcp/server.py
```
Additional notes
Notes and tips:
- The server defaults to http://0.0.0.0:8000. You can override the host/port and Ollama settings by passing environment variables (OLLAMA_URL, OLLAMA_MODEL, HOST, PORT) in the mcp_config.
- Ensure Ollama is running and that the selected model is installed locally.
- If the Blender add-on UI fails to connect, verify that the MCP server is running and that Blender can reach the server URL.
- PolyHaven integration is optional; you can enable the asset-related tools and use prompts to download textures and materials via the provided PolyHaven tools.
- Common issues include mismatched Ollama model names, port conflicts, and a Blender add-on that is not syncing prompts; check the logs for error details and confirm the server is reachable at the host/port you configured.
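For the mismatched-model-name issue, Ollama lists installed models at GET /api/tags. A small sketch of checking a configured model against that response (the parsing assumes Ollama's documented `{"models": [{"name": ...}]}` response shape; the helper itself is illustrative, not part of this project):

```python
def model_is_installed(configured: str, tags_response: dict) -> bool:
    """Check whether the configured model matches an installed Ollama model.

    `tags_response` is the JSON body of GET /api/tags, shaped like
    {"models": [{"name": "llama3.2:latest"}, ...]}. Ollama treats a bare
    name such as "llama3.2" as "llama3.2:latest", so accept either form.
    """
    installed = {m["name"] for m in tags_response.get("models", [])}
    return configured in installed or f"{configured}:latest" in installed

# Example /api/tags response from a local Ollama instance (illustrative).
tags = {"models": [{"name": "llama3.2:latest"}]}
print(model_is_installed("llama3.2", tags))  # True
print(model_is_installed("llama3", tags))    # False
```

Running `ollama list` (or `curl http://localhost:11434/api/tags`) and comparing against your OLLAMA_MODEL setting is the quickest way to rule this out.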
Related MCP Servers
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
mcp-pinecone
Model Context Protocol server that allows reading from and writing to Pinecone, with rudimentary RAG support.
Gitingest
MCP server for gitingest.
microsoft_fabric_mcp
MCP server wrapping around the Fabric Rest API
mcp-memos-py
A Python package enabling LLM models to interact with the Memos server via the MCP interface for searching, creating, retrieving, and managing memos.
mcp-python-template
This template provides a streamlined foundation for building Model Context Protocol (MCP) servers in Python. It's designed to make AI-assisted development of MCP tools easier and more efficient.