cursor-talk-to-figma
TalkToFigma: MCP integration between AI agents (Cursor, Claude Code) and Figma, allowing an agentic AI to communicate with Figma to read designs and modify them programmatically.
```bash
claude mcp add --transport stdio grab-cursor-talk-to-figma-mcp bun cursor-talk-to-figma-mcp@latest
```
How to use
This MCP server exposes a rich set of Figma automation tools that let an AI agent read, inspect, and modify Figma documents through a WebSocket-based MCP interface. After starting the WebSocket server and installing the MCP server in your Cursor configuration, you can join a channel and start issuing commands: query document information, read and modify the current selection, annotate designs, and create or adjust elements and layouts. The tools fall into several categories (document/selection, annotations, prototyping and connections, element creation, text modification, layout and styling, and project-wide operations), enabling end-to-end design tasks from an AI agent.
To use, first connect to the WebSocket server from your Cursor MCP setup, then call join_channel to establish a communication channel with Figma. Once joined, you can call MCP functions such as get_document_info, get_selection, read_my_design, set_text_content, create_rectangle, create_text, move_node, and resize_node, as well as higher-level helpers such as scan_nodes_by_types, get_reactions, and create_connections to automate workflows, annotate designs, or propagate instance properties across components. Always join a channel before sending commands, and start with a document overview before drilling into specific nodes.
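The recommended ordering can be sketched as a simple command sequence. The message shape below is an assumption for illustration only; the actual wire format is defined by the project's WebSocket server.

```typescript
// Hypothetical command objects an agent would send over the MCP
// channel; the tool names match those listed above, but the exact
// message shape is an assumption, not the project's wire format.
type Command = { command: string; params: Record<string, unknown> };

// Join a channel first, then take a document overview before
// drilling into specific nodes, as recommended above.
function startupSequence(channel: string): Command[] {
  return [
    { command: "join_channel", params: { channel } },
    { command: "get_document_info", params: {} },
    { command: "get_selection", params: {} },
  ];
}

for (const c of startupSequence("my-design-session")) {
  console.log(c.command);
}
```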
How to install
Prerequisites:
- Install Bun (https://bun.sh) or a Node environment compatible with this project's Bun-based setup.
- A running Figma workspace and the Cursor MCP integration configured in your environment.
- Internet access to fetch the MCP package and any required dependencies.

Step-by-step installation:

1) Install Bun (if not already installed):

```bash
curl -fsSL https://bun.sh/install | bash
```

2) Install and set up the MCP server for Cursor:

```bash
bun setup
```

3) Start the WebSocket server that enables communication with the Figma plugin:

```bash
bun socket
```

4) (Optional) For local development, run the MCP server directly from the repository path (this example runs server.ts from the local path, as shown in the docs):

```bash
bun /path-to-repo/src/talk_to_figma_mcp/server.ts
```

5) In Cursor, add the MCP server under your MCP configuration (e.g., TalkToFigma) with the appropriate command and arguments. See the local development example in the repository for details.
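Step 5 references the Cursor MCP configuration. The exact schema depends on your Cursor version; a hypothetical mcp.json entry, mirroring the one-line install command at the top of this page, might look like:

```json
{
  "mcpServers": {
    "TalkToFigma": {
      "command": "bun",
      "args": ["cursor-talk-to-figma-mcp@latest"]
    }
  }
}
```

For local development, swap the args for the path to your local server.ts, as shown in the repository's Local Development Setup.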
Additional notes
Tips and caveats:
- Ensure the Bun WebSocket server is reachable by the Figma plugin and that any firewall or network restrictions allow WebSocket traffic.
- When using Windows WSL, you may need to uncomment hostname: "0.0.0.0" in src/socket.ts to allow cross-environment connections.
- Use design prompts and MCP prompts (e.g., design_strategy, read_design_strategy) to guide complex design tasks and maintain consistency.
- Always start by calling get_document_info to confirm the current document context, then inspect the current selection with get_selection before making changes.
- If you encounter permission issues while creating nodes or applying changes, verify that the Figma plugin is correctly connected to the active channel via join_channel.
- For local testing, route commands to the local server path as shown in the Local Development Setup, and update mcp.json to point to your local server script.
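The WSL caveat comes down to the server's bind address. The sketch below illustrates the choice; the function name and the port are assumptions, and the real src/socket.ts in the repository may differ.

```typescript
// "localhost" is only reachable from inside the same environment;
// "0.0.0.0" listens on all interfaces so the Windows-side Figma
// plugin can reach a WebSocket server running inside WSL.
// Both the helper name and the port are illustrative assumptions.
function bindAddress(runningUnderWsl: boolean): { hostname: string; port: number } {
  return {
    hostname: runningUnderWsl ? "0.0.0.0" : "localhost",
    port: 3055, // assumed default; check the repository's socket.ts
  };
}

console.log(bindAddress(true).hostname);
console.log(bindAddress(false).hostname);
```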
Related MCP Servers
dify
Production-ready platform for agentic workflow development.
ragflow
RAGFlow is a leading open-source Retrieval-Augmented Generation (RAG) engine that fuses cutting-edge RAG with Agent capabilities to create a superior context layer for LLMs.
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
deepchat
🐬DeepChat - A smart assistant that connects powerful AI to your personal world
octocode
MCP server for semantic code research and real-time context generation using LLM patterns | Search naturally across public and private repos based on your permissions | Transform any accessible codebase into AI-optimized knowledge of simple and complex flows | Find real implementations and live docs from anywhere
OpenContext
A personal context store for AI agents and assistants—reuse your existing coding agent CLI (Codex/Claude/OpenCode) with built‑in Skills/tools and a desktop GUI to capture, search, and reuse project knowledge across agents and repos.