fabric-atelier
A high-performance atelier for Fabric patterns - MCP server built with Rust + Apache Arrow
claude mcp add --transport stdio copyleftdev-fabric-atelier -- docker run -i --rm copyleftdev/fabric-atelier:latest
How to use
Fabric Atelier is a high-performance MCP server that exposes all Fabric patterns as discoverable, executable tools for MCP clients. It loads its 226 Fabric patterns from a git submodule and serves them as MCP tools, supporting pattern discovery via semantic search (fabric_find_pattern) and direct execution of any individual Fabric pattern (fabric_<pattern_name>), such as fabric_summarize, fabric_extract_wisdom, fabric_analyze_claims, fabric_improve_writing, fabric_explain_code, and many more. Clients such as Claude Desktop, Windsurf, and other MCP-enabled assistants can discover and invoke these tools over the MCP protocol, enabling automated reasoning and content processing with Fabric's pattern library. The server is designed for speed, with sub-millisecond discovery and a lightweight footprint when run in Docker, making it suitable for containerized environments and CI pipelines.
To use the server, configure an MCP client (or Claude Desktop/Windsurf) to connect to the running Fabric Atelier instance. For Docker deployment, pull the image and run it as shown in the Quick Start, then use the MCP client to discover patterns and invoke tools such as fabric_find_pattern or a specific fabric_<pattern_name> tool. When a tool is invoked, Atelier executes the corresponding Fabric pattern via its internal CLI and returns the results to the MCP client.
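The invocation flow above can be sketched as the JSON-RPC 2.0 message an MCP client writes to the server's stdin. This is a hedged illustration, not Atelier's exact schema: the tool name fabric_find_pattern comes from the README, but the argument key ("query") is an assumption.

```python
import json

# Hypothetical sketch of an MCP tools/call request sent over stdio.
# The "query" argument name is assumed, not taken from Atelier's tool schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fabric_find_pattern",
        "arguments": {"query": "summarize a long article"},
    },
}

# MCP stdio transport frames messages as newline-delimited JSON.
wire_message = json.dumps(request)
print(wire_message)
```

The server would answer with a matching JSON-RPC response containing the tool's result, which the client then surfaces to the user.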
How to install
Prerequisites
- Docker installed and running (for the recommended Docker deployment), or the Rust toolchain if building from source.
- Optional: A local LLM (e.g., Ollama) or API keys for OpenAI/Anthropic if you plan to generate embeddings or run patterns that depend on external models.
Option A — Docker (Recommended)
- Install Docker on your host.
- Pull and run the Fabric Atelier image:
  docker pull copyleftdev/fabric-atelier:latest
  docker run -i --rm copyleftdev/fabric-atelier:latest
Option B — Build from Source
- Install Rust (1.90+) and Git. Example (Unix-like):
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
- Clone the repository with submodules:
  git clone --recursive https://github.com/copyleftdev/fabric-atelier.git
  cd fabric-atelier
- Build the release binary:
  cargo build --release
  The binary will be located at target/release/fabric-atelier.
- If you update Fabric patterns, you can regenerate embeddings:
export OPENAI_API_KEY=your_key_here
cargo run --bin generate-embeddings
Output: data/embeddings.parquet
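The embeddings generated above are what power fabric_find_pattern's semantic search: each pattern is stored alongside an embedding vector, and queries are ranked by vector similarity. A minimal sketch of that ranking step, using toy vectors rather than real OpenAI embeddings or the actual Parquet file:

```python
import math

# Toy pattern embeddings (assumed shape; real vectors live in data/embeddings.parquet).
patterns = {
    "fabric_summarize":      [0.9, 0.1, 0.0],
    "fabric_extract_wisdom": [0.2, 0.8, 0.1],
    "fabric_explain_code":   [0.1, 0.1, 0.9],
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_pattern(query_vec, top_k=1):
    # Rank all patterns by similarity to the (already embedded) query.
    ranked = sorted(patterns, key=lambda name: cosine(query_vec, patterns[name]), reverse=True)
    return ranked[:top_k]

print(find_pattern([0.85, 0.2, 0.05]))  # closest to fabric_summarize
```

In the real server the query text is first embedded with the same model used by generate-embeddings, then compared against the stored vectors.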
Additional notes
Environment and configuration tips:
- If using OpenAI or Anthropic APIs for embeddings, set OPENAI_API_KEY or equivalent API keys in your environment before generating embeddings.
- When updating the Fabric submodule, remember to regenerate embeddings afterward:
  git submodule update --remote data/fabric
  cargo run --bin generate-embeddings
- For Claude Desktop integration, ensure your MCP configuration file includes the fabric-atelier server with the proper command to launch the local binary or container image. Example:
  {
    "mcpServers": {
      "fabric-atelier": {
        "command": "/absolute/path/to/fabric-atelier/target/release/fabric-atelier"
      }
    }
  }
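If you deploy via Docker rather than a local binary, the same configuration shape can point at the container instead. A sketch, reusing the image tag from the Quick Start:

```json
{
  "mcpServers": {
    "fabric-atelier": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "copyleftdev/fabric-atelier:latest"]
    }
  }
}
```

The -i flag keeps stdin open, which the stdio MCP transport requires.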
- The server supports running under Docker with a non-root user setup for security; prefer the Docker deployment for ease of use and isolation.