AIFoundry Connector-FabricGraphQL
MCP Client and Server apps demonstrating the integration of an Azure OpenAI-based AI agent with a Data Warehouse exposed through GraphQL in Microsoft Fabric.
```
claude mcp add --transport stdio lazauk-aifoundry-mcpconnector-fabricgraphql python MCP_Client_Gradio.py \
  --env AOAI_API_BASE="Base URL of the Azure OpenAI endpoint" \
  --env AOAI_DEPLOYMENT="Deployment name of the Azure OpenAI model" \
  --env AOAI_API_VERSION="API version of the Azure OpenAI endpoint" \
  --env AZURE_FABRIC_GRAPHQL_ENDPOINT="Microsoft Fabric GraphQL API endpoint"
```
How to use
This MCP server enables an Azure OpenAI-powered agent to access and manipulate data in a Microsoft Fabric Data Warehouse exposed through a GraphQL API. The integration surfaces the GraphQL data resources to the AI agent via MCP tools and prompts, enabling bidirectional data interaction (queries and mutations) with your Fabric Data Warehouse. The server is driven by a Python-based Gradio UI, which also serves as the MCP client that initialises the system and establishes connectivity to Fabric's GraphQL endpoint. Use the environment variables listed below to configure authentication, the Azure OpenAI endpoint, and the GraphQL connection, so that data access works seamlessly during agent reasoning and action execution.
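As a rough sketch of the idea, an MCP tool wrapping the Fabric GraphQL endpoint ultimately just POSTs a GraphQL query or mutation with a bearer token. The helper below is illustrative only (the function, endpoint, and schema names are not taken from MCP_Client_Gradio.py):

```python
import json
import urllib.request

def build_graphql_request(endpoint, query, variables=None, token=None):
    """Package a GraphQL query or mutation as an HTTP POST request."""
    payload = {"query": query, "variables": variables or {}}
    headers = {"Content-Type": "application/json"}
    if token:
        # Fabric's GraphQL API expects a Microsoft Entra ID bearer token.
        headers["Authorization"] = "Bearer " + token
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# Example: the kind of query an agent tool might send
# (the endpoint URL and schema fields here are hypothetical).
req = build_graphql_request(
    "https://example.fabric.microsoft.com/graphql",
    "query { customers { items { customerId name } } }",
    token="<access-token>",
)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) and unpacking the JSON response is then the body of the MCP tool the agent calls.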
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Git installed
- Access to the repository containing MCP_Client_Gradio.py and requirements.txt
Step-by-step installation:
1. Clone the repository:

   ```
   git clone https://github.com/your-org/your-repo.git
   cd your-repo
   ```

2. Create and activate a Python virtual environment (optional but recommended):

   - On Windows:

     ```
     python -m venv venv
     .\venv\Scripts\activate
     ```

   - On macOS/Linux:

     ```
     python3 -m venv venv
     source venv/bin/activate
     ```

3. Install the required Python packages:

   ```
   pip install -r requirements.txt
   ```

4. Configure the environment variables (see below) and run the client:

   ```
   python MCP_Client_Gradio.py
   ```
Additional notes
Environment variables:
- AOAI_API_BASE: Base URL of the Azure OpenAI endpoint
- AOAI_API_VERSION: API version of the Azure OpenAI endpoint
- AOAI_DEPLOYMENT: Deployment name of the Azure OpenAI model
- AZURE_FABRIC_GRAPHQL_ENDPOINT: GraphQL endpoint URL for Fabric
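A minimal pre-flight check, assuming only the four variable names above, can confirm that everything is set before the Gradio UI is launched (a sketch, not the actual client code):

```python
import os

# The four variables the client expects (names taken from this README).
REQUIRED_VARS = [
    "AOAI_API_BASE",
    "AOAI_API_VERSION",
    "AOAI_DEPLOYMENT",
    "AZURE_FABRIC_GRAPHQL_ENDPOINT",
]

def missing_env_vars(environ=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]

if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All required environment variables are set.")
```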
Common issues:
- Ensure the GraphQL endpoint is reachable from your execution environment.
- If the Azure OpenAI endpoint rejects requests, verify that AOAI_API_BASE, AOAI_DEPLOYMENT, and AOAI_API_VERSION are set correctly and that your credentials are valid.
- If using a firewall or VPN, confirm outbound access to the GraphQL and OpenAI endpoints.
Troubleshooting tips:
- If the Gradio UI fails to launch, verify Python dependencies in requirements.txt.
- Check that the MCP server script path (MCP_Client_Gradio.py) exists in the working directory.
- Validate that environment variables are exported in your shell session before starting the client.
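Parts of the checks above can be automated. The sketch below assumes only that both endpoints are HTTPS URLs; it validates the URL shape locally and, optionally, probes the host (any HTTP response, even 401/403, proves reachability):

```python
import urllib.error
import urllib.parse
import urllib.request

def check_endpoint_url(url, probe=False, timeout=5):
    """Return (ok, detail): validate the URL shape, optionally probe the host."""
    parsed = urllib.parse.urlparse(url)
    if parsed.scheme != "https" or not parsed.netloc:
        return False, "not a valid HTTPS URL: %r" % (url,)
    if probe:
        try:
            # An HTTP error response still proves the host is reachable.
            urllib.request.urlopen(url, timeout=timeout)
        except urllib.error.HTTPError:
            pass
        except OSError as exc:
            return False, "unreachable: %s" % exc
    return True, "ok"
```

For example, `check_endpoint_url(os.environ["AZURE_FABRIC_GRAPHQL_ENDPOINT"], probe=True)` run from the machine hosting the client will quickly distinguish a malformed URL from a firewall or DNS problem.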
Related MCP Servers
chunkhound
Local first codebase intelligence
multimodal-agents-course
An MCP Multimodal AI Agent with eyes and ears!
skillz
An MCP server for loading skills (shim for non-claude clients).
python-client
An MCP server for querying the technical documentation of mainstream agent frameworks (supports both stdio and SSE transport protocols); covers langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai
zerodha
Zerodha MCP Server & Client - AI Agent (w/Agno & w/Google ADK)
openai-agent-dotnet
Sample to create an AI Agent using OpenAI models with any MCP server running on Azure Container Apps