fabric-data-agent
npx machina-cli add skill PatrickGallucci/fabric-skills/fabric-data-agent --openclaw
Microsoft Fabric Data Agents
Build conversational AI experiences that let users ask questions in plain English against structured data in Microsoft Fabric. Fabric Data Agents translate natural language into SQL, DAX, or KQL queries, execute them securely under the caller's identity, and return data-driven answers.
When to Use This Skill
- Creating a new Fabric Data Agent from the portal or via REST API
- Configuring data sources (Lakehouse, Warehouse, Power BI Semantic Model, KQL Database, Ontology)
- Writing effective agent-level or data-source-level instructions
- Authoring example queries (few-shot examples) to improve NL2SQL/NL2DAX/NL2KQL accuracy
- Automating data agent provisioning with PowerShell and the Fabric REST API
- Integrating a published Fabric Data Agent with Azure AI Foundry agents
- Managing the Operations Agent definition (Configurations.json) programmatically
- Publishing, sharing, and versioning data agents
- Troubleshooting query generation, data source permissions, or tenant settings
Prerequisites
| Requirement | Details |
|---|---|
| Fabric capacity | Paid F2+ SKU, or Power BI Premium P1+ with Fabric enabled |
| Tenant settings | Fabric data agent, Cross-geo processing for AI, Cross-geo storing for AI all enabled |
| XMLA endpoints | Enabled if using Power BI Semantic Model data sources |
| Data source access | At least Read permission on target lakehouses, warehouses, semantic models, or KQL databases |
| PowerShell (automation) | PowerShell 7.4+, Az.Accounts module |
| Azure AI Foundry (integration) | Foundry Project endpoint, Fabric connection, model deployment |
Step-by-Step Workflows
Workflow 1: Create and Configure a Data Agent (Portal)
- Navigate to your workspace and select + New Item > Fabric data agent
- Provide a descriptive name for the agent
- Add up to 5 data sources from the OneLake catalog (any mix of Lakehouse, Warehouse, PBI Semantic Model, KQL DB, Ontology)
- Select/deselect tables in the Explorer pane to control what the AI can query
- Write agent instructions — see instruction-best-practices.md
- Add example queries for each data source — see example-query-guide.md
- Test the agent in the chat pane with representative questions
- Select Publish and provide a description
- Share the published version with colleagues via workspace permissions
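As a sketch of step 5, agent instructions typically pair domain definitions with routing hints. The data source names and definitions below are purely illustrative:

```text
You answer questions about retail sales data.

Definitions:
- "Net revenue" means gross sales minus returns and discounts.
- "Active customer" means a customer with at least one order in the last 90 days.

Routing:
- For questions about orders or inventory, use the SalesLakehouse data source.
- For questions about telemetry or events, use the OpsEventsKQL data source.
```

Keeping instructions short, explicit, and source-specific tends to improve query routing; see instruction-best-practices.md for the full guidance.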
Workflow 2: Automate Data Agent via REST API (PowerShell)
- Authenticate with Connect-AzAccount and obtain an access token
- Create the agent item using the Fabric Items API — run New-FabricDataAgent.ps1
- Configure the Operations Agent definition (Configurations.json) — see operations-agent-schema.md
- Update the item definition using the Update Item Definition API
- Validate deployment by listing items in the workspace
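The Update Item Definition API expects each definition part base64-encoded. A minimal Python sketch of building that payload for Configurations.json (the configuration keys shown are placeholders, not the documented schema; see operations-agent-schema.md for the real structure):

```python
import base64
import json

def build_update_definition_payload(config: dict) -> dict:
    """Wrap a Configurations.json document as an InlineBase64 definition part,
    the shape the Fabric Update Item Definition API expects."""
    raw = json.dumps(config).encode("utf-8")
    return {
        "definition": {
            "parts": [
                {
                    "path": "Configurations.json",
                    "payload": base64.b64encode(raw).decode("ascii"),
                    "payloadType": "InlineBase64",
                }
            ]
        }
    }

# Illustrative configuration only; keys are placeholders.
payload = build_update_definition_payload({"instructions": "Answer sales questions."})
```

POST this payload to the item's updateDefinition endpoint under https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}/updateDefinition, using a bearer token obtained after Connect-AzAccount.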
Workflow 3: Integrate with Azure AI Foundry
- Create and publish the Fabric Data Agent in the Fabric portal
- Create a Foundry Agent in the Azure AI Foundry portal
- Add a Microsoft Fabric connection to your Foundry project
- Create the agent with the Fabric tool enabled — see foundry-integration.md
- Create a thread, add a user question, run the thread, and retrieve the response
- Update Foundry agent instructions to describe what data the Fabric tool can access
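In the Foundry flow above, the published Fabric Data Agent is attached to the Foundry agent as a tool. A sketch of the create-agent request body in Python — the fabric_dataagent tool type string, connection shape, and connection id are assumptions to verify against foundry-integration.md:

```python
def build_agent_body(model: str, name: str, instructions: str,
                     fabric_connection_id: str) -> dict:
    """Assemble a Foundry create-agent request body with the Fabric tool
    attached (field names assumed, not authoritative)."""
    return {
        "model": model,
        "name": name,
        "instructions": instructions,
        "tools": [
            {
                "type": "fabric_dataagent",
                "fabric_dataagent": {
                    "connections": [{"connection_id": fabric_connection_id}]
                },
            }
        ],
    }

# Hypothetical values for illustration only.
body = build_agent_body(
    "gpt-4o",
    "sales-qa-agent",
    "Use the Fabric tool for questions about sales data in the lakehouse.",
    "my-fabric-connection-id",
)
```

Per step 6, the instructions string should spell out what data the Fabric tool can reach so the Foundry agent knows when to invoke it.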
Troubleshooting
| Symptom | Cause | Resolution |
|---|---|---|
| Data agent option not visible in New Item | Tenant setting disabled | Admin enables "Fabric data agent" in Tenant Settings |
| Agent can't see my tables | Tables not selected in Explorer | Check table checkboxes in the data source Explorer |
| Queries return permission errors | Insufficient data access | Grant at least Read on the underlying data source |
| Example queries show validation errors | SQL/KQL syntax invalid or schema mismatch | Validate queries against the actual table schema |
| Agent misinterprets domain terms | Missing definitions in instructions | Add term definitions in Agent Instructions (up to 15,000 chars) |
| Power BI semantic model won't add | XMLA endpoints disabled | Enable "Power BI semantic models via XMLA endpoints" tenant switch |
| Published agent not accessible to colleagues | Workspace permissions not granted | Share the workspace or agent item with appropriate roles |
References
- Instruction Best Practices — How to write effective agent and data-source instructions
- Example Query Guide — Authoring few-shot examples for NL2SQL/NL2DAX/NL2KQL
- Operations Agent Schema — REST API definition structure (Configurations.json)
- Foundry Integration — Connecting Fabric Data Agents with Azure AI Foundry
- Microsoft Learn: Create a Fabric data agent
- Microsoft Learn: Data agent configuration
Source
git clone https://github.com/PatrickGallucci/fabric-skills
Skill definition: skills/fabric-data-agent/SKILL.md
Overview
Create, configure, and manage Microsoft Fabric Data Agents that enable natural language Q&A across lakehouses, warehouses, Power BI semantic models, KQL databases, and ontologies. This skill covers building agents via portal or REST API, writing agent and data-source instructions, crafting example queries, automating provisioning with PowerShell, and integrating with Azure AI Foundry, plus troubleshooting configurations.
How This Skill Works
Fabric Data Agents translate natural language into SQL, DAX, or KQL queries and execute them securely under the caller's identity. You define data sources, write agent- and data-source instructions, and provide few-shot examples to improve NL2SQL/NL2DAX/NL2KQL accuracy. Provisioning and management are powered by REST API or PowerShell, with optional integration into Azure AI Foundry for broader AI workflows.
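For instance, a few-shot example pairs a natural-language question with the exact query the agent should emit for that data source. Table and column names below are illustrative, not from any real schema:

```sql
-- Question: "What was net revenue by region last quarter?"
SELECT region,
       SUM(gross_sales - returns - discounts) AS net_revenue
FROM sales_orders
WHERE order_date >= DATEADD(quarter, -1, GETDATE())
GROUP BY region;
```

Examples like this anchor how the model maps domain terms ("net revenue") onto columns, which is why validated example queries measurably improve NL2SQL accuracy.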
When to Use It
- Create a new Fabric Data Agent from the portal or via REST API
- Configure data sources (Lakehouse, Warehouse, PBI Semantic Model, KQL Database, Ontology)
- Write effective agent-level or data-source-level instructions and add example queries
- Automate data agent provisioning with PowerShell and the Fabric REST API
- Integrate a published Fabric Data Agent with Azure AI Foundry or troubleshoot configuration issues
Quick Start
- Step 1: Create a Fabric Data Agent in your workspace, add up to 5 data sources (Lakehouse, Warehouse, PBI Semantic Model, KQL DB, Ontology), write agent instructions, add example queries, test in chat, then Publish with a description
- Step 2: Automate provisioning with PowerShell: Connect-AzAccount, run New-FabricDataAgent.ps1, configure Configurations.json, and use Update Item Definition API to finalize
- Step 3: (Optional) Integrate with Azure AI Foundry or share versioned agents via workspace permissions; monitor and adjust data-source permissions as needed
Best Practices
- Plan data sources and ensure Read permissions are granted for each target
- Write concise, explicit agent and data-source instructions aligned with each source
- Include representative few-shot example queries to boost NL2SQL/NL2DAX/NL2KQL accuracy
- Test the agent in the chat pane with diverse, representative questions before publishing
- Version and publish agents carefully; manage Configurations.json and automate changes via REST API/PowerShell
Example Use Cases
- A multi-source agent querying a Lakehouse and a KQL database with NL2SQL/NL2KQL flows
- An agent connected to a Power BI semantic model enabling NLQ against metrics and tables
- Automated provisioning of agents using New-FabricDataAgent.ps1 and the Fabric REST API
- Azure AI Foundry integration to orchestrate Fabric Data Agents across projects
- A shared, permissions-scoped agent with multiple data sources and example queries