Fabric-Analytics
A Model Context Protocol (MCP) server that enables AI assistants to securely access and analyze Microsoft Fabric Analytics data through authenticated API calls.
claude mcp add --transport stdio santhoshravindran7-fabric-analytics-mcp \
  --env PORT="5000" \
  --env LOG_LEVEL="INFO" \
  --env FABRIC_API_KEY="your-fabric-api-key" \
  --env ASSET_STORAGE_PATH="path/to/storage" \
  -- python -m fabric_analytics_mcp
How to use
Fabric-Analytics MCP Server exposes a comprehensive set of MCP tools to interact with the Microsoft Fabric platform. It offers workspace management, capacity and resource handling, authentication helpers, and a suite of analytics and monitoring commands designed for AI assistants and automation workflows. Tools cover creating and managing Fabric workspaces, assigning and listing capacities, configuring Git integrations, and provisioning environments and data pipelines. When used with an MCP-enabled assistant, you can query and operate Fabric resources in a structured, protocol-friendly manner, enabling real-time monitoring, automated provisioning, and governance across your Fabric deployments.
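With an MCP client attached over the stdio transport, tool invocations travel as JSON-RPC 2.0 messages. The sketch below shows how such a `tools/call` request is shaped; the tool name `list-workspaces` and its argument are hypothetical placeholders for illustration, not confirmed tool names from this server:

```python
import json

def make_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request, as used by MCP's stdio transport."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# Hypothetical example: ask a workspace-listing tool for full detail.
msg = make_tool_call("list-workspaces", {"detail": "full"})
```

An MCP client writes one such message per line to the server's stdin and reads the JSON-RPC response from its stdout.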
How to install
Prerequisites:
- Python 3.8+ and pip
- Access to a host with network connectivity to Fabric resources
- Optional: a compatible MCP client or controller depending on your environment
Install from PyPI (recommended):
pip install fabric-analytics-mcp
Run locally (example):
python -m fabric_analytics_mcp
Environment variables (example):
- PORT: 5000
- FABRIC_API_KEY: your-fabric-api-key
- ASSET_STORAGE_PATH: /var/lib/fabric-analytics
- LOG_LEVEL: INFO|DEBUG
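A minimal sketch of how a server process might read these variables, falling back to the documented defaults when a variable is unset (the `load_config` helper is illustrative, not part of the package's API):

```python
import os

def load_config() -> dict:
    """Read the documented environment variables, with the example defaults above."""
    return {
        "port": int(os.environ.get("PORT", "5000")),
        "api_key": os.environ.get("FABRIC_API_KEY", ""),
        "storage_path": os.environ.get("ASSET_STORAGE_PATH", "/var/lib/fabric-analytics"),
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }
```

Setting the variables in the shell before launch (`export PORT=8080`) overrides the defaults.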
If you prefer a Node.js deployment, you can install the MCP server package from npm and run it with Node. The repository supports both Python and Node deployments; choose the path that matches your environment.
Node.js route (if using Node):
npm install -g fabric-analytics-mcp
Configure startup to run as a service or in a container as needed.
Additional notes
- Ensure your FABRIC_API_KEY (or equivalent credentials) has the required permissions to manage workspaces, capacities, and environments.
- If you deploy in Kubernetes, configure readiness and liveness probes to monitor the MCP server health.
- Use the provided env vars to tune logging, storage paths, and ports according to your environment.
- When upgrading, verify compatibility with your existing MCP clients and AI tooling to avoid protocol drift.
- For debugging, run with LOG_LEVEL=DEBUG and inspect the logs for RPC/HTTP metadata and error traces.
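For the Kubernetes tip above, probes might be wired up as in this sketch; the `/healthz` path and port 5000 are assumptions that depend on how your deployment exposes a health endpoint, not endpoints confirmed by this server:

```yaml
# Illustrative probe configuration for a container running the MCP server.
livenessProbe:
  httpGet:
    path: /healthz      # assumed health endpoint
    port: 5000          # matches the PORT env var
  initialDelaySeconds: 10
  periodSeconds: 15
readinessProbe:
  httpGet:
    path: /healthz
    port: 5000
  initialDelaySeconds: 5
  periodSeconds: 10
```

If the server only speaks stdio (no HTTP port), use an `exec` probe against a lightweight health script instead.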
Related MCP Servers
mcp-for-beginners
This open-source curriculum introduces the fundamentals of Model Context Protocol (MCP) through real-world, cross-language examples in .NET, Java, TypeScript, JavaScript, Rust and Python. Designed for developers, it focuses on practical techniques for building modular, scalable, and secure AI workflows from session setup to service orchestration.
bitbucket
Bitbucket MCP - A Model Context Protocol (MCP) server for integrating with Bitbucket Cloud and Server APIs
mcp-arr
MCP server for *arr media management suite
crawlbase
Crawlbase MCP Server connects AI agents and LLMs with real-time web data. It powers Claude, Cursor, and Windsurf integrations with battle-tested web scraping, JavaScript rendering, and anti-bot protection, enabling structured, live data inside your AI workflows.
dockashell
DockaShell is an MCP server that gives AI agents isolated Docker containers to work in. MCP tools for shell access, file operations, and full audit trail.
google-knowledge-graph
MCP server for Google's free public Knowledge Graph Search API - search for structured entity information about people, places, organizations, and concepts