mcp-design-system-extractor
MCP (Model Context Protocol) server that enables AI assistants to interact with Storybook design systems. Extract component HTML, analyze styles, and help with design system adoption and refactoring.
claude mcp add --transport stdio freema-mcp-design-system-extractor node /path/to/dist/index.js \
  --env STORYBOOK_URL="http://localhost:6006" \
  --env NODE_TLS_REJECT_UNAUTHORIZED="1"
How to use
This MCP server, mcp-design-system-extractor, connects to a Storybook instance and extracts component information such as rendered HTML, styles, and metadata. It offers a suite of tools to discover components, fetch HTML (async by default with a background job model), analyze dependencies, and extract theme and external CSS tokens. Use list_components to discover available components; then get_component_html to render a component’s HTML (optionally with styles) either asynchronously (default) or synchronously. You can search by name, category, or purpose, inspect which components are used inside others, and retrieve theme information like colors and typography. The server is designed to work with Storybook distributions and can operate with self-signed certificates when NODE_TLS_REJECT_UNAUTHORIZED is set to 0.
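The flow above can be sketched as a sequence of MCP tool calls. This is a hypothetical illustration only: the tool names come from this page, but the argument names (`component`, `async`, `job_id`) are assumptions and may differ from the server's actual schemas.

```json
{"tool": "list_components", "arguments": {}}

{"tool": "get_component_html", "arguments": {"component": "Button", "async": true}}

{"tool": "job_status", "arguments": {"job_id": "<job_id returned by the previous call>"}}
```

In the default async mode, the second call returns a job_id immediately; poll job_status until the job reports completion, then read the rendered HTML from the result.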
How to install
Prerequisites:
- Node.js and npm installed on your system
- Access to a Storybook instance you want to connect to
Installation steps:
# 1) Install the MCP Design System Extractor globally (recommended)
npm install -g mcp-design-system-extractor
# 2) Run the interactive setup or configure via Claude CLI as described in the docs
# If you clone from source, follow the build steps below
From Source:
# Clone the repository
git clone https://github.com/freema/mcp-design-system-extractor.git
cd mcp-design-system-extractor
# Install dependencies
npm install
# Build the project
npm run build
# Optional: run interactive setup if you use Claude Desktop integration
npm run setup
Runtime usage:
# Start the server (example command; adapt path as needed)
node /path/to/dist/index.js
Configuration note: your MCP client configuration should point to the built entry point (dist/index.js) and supply environment variables such as STORYBOOK_URL and, when needed, NODE_TLS_REJECT_UNAUTHORIZED.
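As a sketch, a Claude Desktop-style MCP config entry might look like the following. The server name and install path are placeholders; verify the exact keys against your client's MCP configuration documentation.

```json
{
  "mcpServers": {
    "design-system-extractor": {
      "command": "node",
      "args": ["/path/to/dist/index.js"],
      "env": {
        "STORYBOOK_URL": "http://localhost:6006",
        "NODE_TLS_REJECT_UNAUTHORIZED": "0"
      }
    }
  }
}
```

Setting NODE_TLS_REJECT_UNAUTHORIZED to "0" here is only for self-signed HTTPS Storybook instances; omit it otherwise.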
Additional notes
Tips & caveats:
- STORYBOOK_URL is required to connect to your Storybook instance (default: http://localhost:6006).
- NODE_TLS_REJECT_UNAUTHORIZED can be set to 0 to allow self-signed certificates when connecting to HTTPS Storybook instances (use with caution in production).
- Async HTML extraction returns a job_id; poll with job_status until status is completed. For quick inline HTML, use synchronous mode by setting async: false in get_component_html.
- The extractor relies on Puppeteer for rendering dynamic Storybook content, so ensure Chrome/Chromium dependencies are available in your environment (Docker can handle this automatically).
- When using with Claude or automated clients, you can leverage the available tools: list_components, get_component_html, search_components, get_theme_info, get_external_css, and the job management tools (job_status, job_list, job_cancel).
- For large design systems, prefer compact outputs from list_components and use variantsOnly on get_component_html to discover available variants before rendering full HTML.
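Following the last tip, a hedged example of a variant-discovery call before rendering full HTML (the `variantsOnly` flag is named on this page; the `component` argument name is an assumption):

```json
{"tool": "get_component_html", "arguments": {"component": "Button", "variantsOnly": true}}
```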
Related MCP Servers
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
obsidian-tools
Add Obsidian integrations like semantic search and custom Templater prompts to Claude or any MCP client.
MiniMax-JS
Official MiniMax Model Context Protocol (MCP) JavaScript implementation that provides seamless integration with MiniMax's powerful AI capabilities including image generation, video generation, text-to-speech, and voice cloning APIs.
mcp-bundler
Is the MCP configuration too complicated? You can easily share your own simplified setup!
akyn-sdk
Turn any data source into an MCP server in 5 minutes. Build AI-agents-ready knowledge bases.
promptboard
The Shared Whiteboard for Your AI Agents via MCP. Paste screenshots, mark them up, and share with AI.