MCP-fabric
MCP server from PipeDionisio/Mcp-server-fabric
claude mcp add --transport stdio pipedionisio-mcp-server-fabric -- docker run -i pipedionisio/mcp-server-fabric
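After registering, you can confirm the server was picked up with the standard Claude Code CLI subcommands (output format may vary by CLI version):

```shell
# List all registered MCP servers; the new entry should appear by name
claude mcp list

# Show the registration details (transport, command) for this server
claude mcp get pipedionisio-mcp-server-fabric
```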
How to use
The MCP server for Fabric acts as a centralized host that exposes Fabric patterns as tools within a Model Context Protocol (MCP) workflow. It lets Claude (or any compatible agent) load, manage, and sequentially apply Fabric patterns, treating each pattern like a plug-and-play extension. This is especially useful for guiding the model through a structured chain of thought: prompts can be filtered by tag (e.g., analysis, writing, coding, data_analysis) so only the most relevant patterns are applied to a given task. You can compose tool-enabled prompts in which the server dynamically selects and executes the matching Fabric patterns and returns precise, provenance-rich results that record which tools were used.
To use it, run the MCP server and point your MCP-enabled agent or workflow (such as n8n) at its endpoint. Within the workflow, filter prompts by tag to apply the right Fabric patterns for the context (for example, analysis patterns for risk assessment or coding patterns for feature work). Because the server treats each Fabric pattern as an interchangeable tool, you can assemble sophisticated tool-enabled pipelines without embedding tool logic in the agent itself. The result feels like handing the AI a well-organized toolbox, with a clear separation between tool invocation and reasoning.
How to install
Prerequisites:
- Docker installed and running on your machine or host
- Access to the Fabric patterns library you want to expose via MCP-Server-Fabric
Install steps:
- Pull and run the MCP server container for Fabric:
docker pull pipedionisio/mcp-server-fabric
docker run -it --rm --name mcp-server-fabric -p 8080:8080 pipedionisio/mcp-server-fabric
- Verify the server is up by checking logs or hitting the health endpoint (adjust URL/port if needed).
- Configure your MCP client (e.g., the Claude integration or n8n) to point at the MCP server URL and port (http://<host>:8080), or use the stdio registration shown above.
- (Optional) Set up authentication or environment-specific credentials as required by your deployment environment.
- In your workflow, reference Fabric patterns as tools via the MCP server and begin sending prompts that leverage tag-based filtering to activate the appropriate patterns.
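The pull/run/verify steps above can be sketched as a single script. Note that the `/health` path is an assumption for illustration; check the image's documentation for the actual endpoint your build exposes.

```shell
#!/usr/bin/env sh
set -e

# Pull the image and start the server detached
docker pull pipedionisio/mcp-server-fabric
docker run -d --rm --name mcp-server-fabric -p 8080:8080 pipedionisio/mcp-server-fabric

# Confirm the container is running
docker ps --filter name=mcp-server-fabric

# Probe a health endpoint (path is an assumption; adjust to your deployment),
# falling back to the container logs if the probe fails
curl -fsS http://localhost:8080/health || docker logs mcp-server-fabric
```

Running detached (`-d`) rather than interactive (`-it`) suits HTTP-style deployments; keep `-i` (as in the one-line registration at the top) when the client drives the server over stdio.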
Notes:
- If you encounter SSL or protocol errors, ensure the MCP server and client agree on the endpoint protocol and that intermediate proxies are not altering TLS handshakes.
- If you hit a concurrency limit (e.g., too many simultaneous calls), reduce client-side concurrency or restart the n8n instance as described in the guidance for your setup.
Additional notes
Tips and common issues:
- Tag filtering: Use the provided tag list (e.g., analysis, coding, writing, data_analysis) to steer which Fabric pattern tools are invoked.
- Performance: Pattern-heavy workflows can be resource-intensive; monitor CPU/memory and scale the host or container accordingly.
- n8n credentials: If you update the n8n API credentials, ensure the MCP server can still access them, or store credentials securely in your orchestration layer.
- SSL errors: If you see a wrong-version TLS error, restarting the n8n Docker container often resolves transient handshake issues, which can be triggered by hitting API limits.
- Updates: Keep the Fabric patterns and MCP-Server image in sync to avoid incompatibilities between tool formats and the MCP protocol.