Fusion
Remote control solution for Fusion 360 using a client-server architecture. Features JSON socket communication and a modular design with a Python server, a Python client library, and a Fusion 360 add-in. Enables command monitoring, remote execution, and model data retrieval for automation workflows.
claude mcp add --transport stdio joelalbon-fusion-mcp-server python server.py \
  --env OPENAI_API_KEY="API key for OpenAI (optional for LLM features)"
How to use
Fusion is an MCP server that bridges external applications with Autodesk Fusion 360. The server listens for JSON-formatted messages over TCP and interacts with the Fusion 360 API via a dedicated add-in. The project provides three components: the MCP server (server.py), which handles client connections and forwards commands; a Python client library (client.py) for easy integration into Python applications; and a Fusion 360 add-in (fusion360_mcp_addin.py) that runs inside Fusion 360 and performs the actual API calls. A typical workflow is to start the server, connect a client, and issue commands to manipulate Fusion 360 models or query information.
To use Fusion MCP, first run the server, then connect a client to the server host and port. The client supports operations such as requesting model information (get_model_info) and executing Fusion 360 commands (e.g., create_circle) by sending a fusion_command message. In Fusion 360, ensure the MCP add-in is loaded and connected to the server to enable remote automation. If you enable LLM features, the server can forward llm_request messages to the configured OpenAI API for natural language processing tasks related to design automation.
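The client workflow described above can be sketched with Python's standard library alone. Note that the exact message schema (a "type" field plus command parameters) and the parameter names for create_circle are assumptions based on this description; check client.py for the real field names.

```python
import json
import socket

HOST, PORT = "127.0.0.1", 8080  # the server's documented default address


def make_message(msg_type: str, **fields) -> dict:
    """Build a JSON-serializable message of the given type (schema assumed)."""
    return {"type": msg_type, **fields}


def send_message(message: dict, host: str = HOST, port: int = PORT) -> dict:
    """Send one JSON message over TCP and return the decoded reply."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps(message).encode("utf-8"))
        reply = sock.recv(65536)
    return json.loads(reply.decode("utf-8"))


# Request model metadata, then draw a circle (parameter names are hypothetical).
info_request = make_message("get_model_info")
circle_request = make_message(
    "fusion_command",
    command="create_circle",
    params={"x": 0, "y": 0, "radius": 2.5},
)
# With the server running: reply = send_message(circle_request)
```

In practice you would use the convenience methods in client.py instead of raw sockets; the sketch only shows the wire format the README describes.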
How to install
Prerequisites:
- Python 3.6+ installed on the machine running the server
- Optional: OpenAI Python package if you plan to use LLM features
Installation steps:
- Clone the repository or copy the server and client files to your desired location.
- Install Python (3.6+) if not already installed.
- (Optional) Install the OpenAI package to enable LLM features:
  pip install openai
- Run the server:
  python server.py
- Follow the Fusion 360 add-in installation steps in the README to deploy the add-in inside Fusion 360, and ensure the add-in can connect to the running MCP server.
Notes:
- The basic server functionality requires only Python’s standard library; the OpenAI package is optional for LLM capabilities.
- By default, the server listens on 127.0.0.1:8080; adjust the host/port in the server code if needed.
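To make the server's role concrete, the dispatch loop that server.py performs can be illustrated with a minimal JSON-over-TCP handler using Python's standard library. This is an illustrative sketch, not the actual server implementation: the replies are stubs, and the real server forwards fusion_command messages to the Fusion 360 add-in.

```python
import json
import socketserver

HOST, PORT = "127.0.0.1", 8080  # matches the documented default


def handle_message(message: dict) -> dict:
    """Dispatch one decoded message; mirrors the message types described above."""
    msg_type = message.get("type")
    if msg_type == "get_model_info":
        # The real server would return live model metadata from Fusion 360.
        return {"status": "ok", "model": {"name": "placeholder"}}
    if msg_type == "fusion_command":
        # The real server forwards this to the Fusion 360 add-in for execution.
        return {"status": "ok", "command": message.get("command")}
    return {"status": "error", "reason": f"unknown type: {msg_type}"}


class MCPHandler(socketserver.StreamRequestHandler):
    def handle(self):
        raw = self.rfile.readline()          # one JSON message per line
        reply = handle_message(json.loads(raw))
        self.wfile.write(json.dumps(reply).encode("utf-8") + b"\n")


def run_server():
    """Serve until interrupted (Ctrl-C)."""
    with socketserver.TCPServer((HOST, PORT), MCPHandler) as server:
        server.serve_forever()
```

Changing HOST/PORT here corresponds to adjusting the host/port in the real server code, as noted above.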
Additional notes
Tips and common considerations:
- If you encounter connectivity issues, verify that the server is running and that the firewall allows TCP connections on the configured port.
- OpenAI integration is optional; set OPENAI_API_KEY in the environment when you plan to use llm_request/LLM features.
- The client library (client.py) provides convenience methods; use get_model_info to fetch model metadata and fusion_command to execute actions in Fusion 360.
- When deploying on Windows, ensure Python scripts are executed with the correct associations or use an appropriate launcher (e.g., pythonw for background processes).
- For Fusion 360 add-in issues, ensure the resources directory with MCPIcon assets exists in the add-in folder and that the add-in is loaded from the correct path in the Fusion 360 UI.
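The connectivity check suggested in the first tip above can be automated with a quick TCP probe (the host and port shown are the documented defaults):

```python
import socket


def server_reachable(host: str = "127.0.0.1", port: int = 8080,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the MCP server succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unreachable
        return False


print("server up" if server_reachable() else "server unreachable")
```

If this reports the server as unreachable while server.py is running, check firewall rules for the configured port.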
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP