
fabric-data-agent

npx machina-cli add skill PatrickGallucci/fabric-skills/fabric-data-agent --openclaw

Microsoft Fabric Data Agents

Build conversational AI experiences that let users ask questions in plain English against structured data in Microsoft Fabric. Fabric Data Agents translate natural language into SQL, DAX, or KQL queries, execute them securely under the caller's identity, and return data-driven answers.

When to Use This Skill

  • Creating a new Fabric Data Agent from the portal or via REST API
  • Configuring data sources (Lakehouse, Warehouse, Power BI Semantic Model, KQL Database, Ontology)
  • Writing effective agent-level or data-source-level instructions
  • Authoring example queries (few-shot examples) to improve NL2SQL/NL2DAX/NL2KQL accuracy
  • Automating data agent provisioning with PowerShell and the Fabric REST API
  • Integrating a published Fabric Data Agent with Azure AI Foundry agents
  • Managing the Operations Agent definition (Configurations.json) programmatically
  • Publishing, sharing, and versioning data agents
  • Troubleshooting query generation, data source permissions, or tenant settings

Prerequisites

  • Fabric capacity: paid F2+ SKU, or Power BI Premium P1+ with Fabric enabled
  • Tenant settings: Fabric data agent, Cross-geo processing for AI, and Cross-geo storing for AI all enabled
  • XMLA endpoints: enabled if using Power BI Semantic Model data sources
  • Data source access: at least Read permission on target lakehouses, warehouses, semantic models, or KQL databases
  • PowerShell (automation): PowerShell 7.4+ with the Az.Accounts module
  • Azure AI Foundry (integration): Foundry project endpoint, Fabric connection, and model deployment

Step-by-Step Workflows

Workflow 1: Create and Configure a Data Agent (Portal)

  1. Navigate to your workspace and select + New Item > Fabric data agent
  2. Provide a descriptive name for the agent
  3. Add up to 5 data sources from the OneLake catalog (any mix of Lakehouse, Warehouse, PBI Semantic Model, KQL DB, Ontology)
  4. Select/deselect tables in the Explorer pane to control what the AI can query
  5. Write agent instructions — see instruction-best-practices.md
  6. Add example queries for each data source — see example-query-guide.md
  7. Test the agent in the chat pane with representative questions
  8. Select Publish and provide a description
  9. Share the published version with colleagues via workspace permissions
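
Agent instructions are plain text that steer query generation; a minimal sketch (the table names, term definitions, and data source names below are hypothetical) might look like:

```text
You answer questions about retail sales data.
- "Revenue" means SUM(SalesAmount) net of returns.
- The fiscal year starts July 1.
- When a question mentions "region", use the dim_region.RegionName column.
- Prefer the gold_sales Lakehouse for sales questions; use the Telemetry KQL
  database only for device-event questions.
```

Example queries pair a natural-language question with the exact query the agent should emit. A hypothetical few-shot pair for a Lakehouse source:

```sql
-- Question: "What was total revenue by region in the last 90 days?"
SELECT r.RegionName, SUM(s.SalesAmount) AS TotalRevenue
FROM gold_sales.fact_sales AS s
JOIN gold_sales.dim_region AS r ON s.RegionKey = r.RegionKey
WHERE s.OrderDate >= DATEADD(DAY, -90, CAST(GETDATE() AS date))
GROUP BY r.RegionName;
```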

Workflow 2: Automate Data Agent via REST API (PowerShell)

  1. Authenticate with Connect-AzAccount and obtain an access token
  2. Create the agent item using the Fabric Items API — run New-FabricDataAgent.ps1
  3. Configure the Operations Agent definition (Configurations.json) — see operations-agent-schema.md
  4. Update the item definition using the Update Item Definition API
  5. Validate deployment by listing items in the workspace
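
The steps above can be sketched in PowerShell. The workspace ID is a placeholder, and the item type name for data agents is an assumption to verify against the Fabric Items API documentation:

```powershell
# Authenticate and obtain a token for the Fabric REST API (Az.Accounts module)
Connect-AzAccount
$token = (Get-AzAccessToken -ResourceUrl 'https://api.fabric.microsoft.com').Token

$workspaceId = '<your-workspace-guid>'   # placeholder: your workspace ID
$headers = @{ Authorization = "Bearer $token" }

# Create the data agent item via the Fabric Items API
$body = @{
    displayName = 'sales-data-agent'
    type        = 'DataAgent'            # assumption: confirm the item type name
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/items" `
    -Headers $headers -ContentType 'application/json' -Body $body

# Validate the deployment by listing items in the workspace
Invoke-RestMethod -Method Get `
    -Uri "https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/items" `
    -Headers $headers
```

The same token and headers can be reused with the Update Item Definition API to push the Operations Agent definition (Configurations.json) after the item exists.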

Workflow 3: Integrate with Azure AI Foundry

  1. Create and publish the Fabric Data Agent in the Fabric portal
  2. Create a Foundry Agent in the Azure AI Foundry portal
  3. Add a Microsoft Fabric connection to your Foundry project
  4. Create the agent with the Fabric tool enabled — see foundry-integration.md
  5. Create a thread, add a user question, run the thread, and retrieve the response
  6. Update Foundry agent instructions to describe what data the Fabric tool can access

Troubleshooting

  • Symptom: Data agent option not visible in New Item. Cause: tenant setting disabled. Resolution: an admin enables "Fabric data agent" in Tenant Settings.
  • Symptom: Agent can't see my tables. Cause: tables not selected in Explorer. Resolution: check the table checkboxes in the data source Explorer pane.
  • Symptom: Queries return permission errors. Cause: insufficient data access. Resolution: grant at least Read on the underlying data source.
  • Symptom: Example queries show validation errors. Cause: invalid SQL/KQL syntax or schema mismatch. Resolution: validate queries against the actual table schema.
  • Symptom: Agent misinterprets domain terms. Cause: missing definitions in instructions. Resolution: add term definitions in Agent Instructions (up to 15,000 characters).
  • Symptom: Power BI semantic model won't add. Cause: XMLA endpoints disabled. Resolution: enable the "Power BI semantic models via XMLA endpoints" tenant switch.
  • Symptom: Published agent not accessible to colleagues. Cause: workspace permissions not granted. Resolution: share the workspace or agent item with appropriate roles.

References

Source

git clone https://github.com/PatrickGallucci/fabric-skills
The skill definition lives at skills/fabric-data-agent/SKILL.md.

Overview

Create, configure, and manage Microsoft Fabric Data Agents that enable natural language Q&A across lakehouses, warehouses, Power BI semantic models, KQL databases, and ontologies. This skill covers building agents via the portal or REST API, writing agent and data-source instructions, crafting example queries, automating provisioning with PowerShell, integrating with Azure AI Foundry, and troubleshooting common configuration issues.

How This Skill Works

Fabric Data Agents translate natural language into SQL, DAX, or KQL queries and execute them securely under the caller's identity. You define data sources, write agent- and data-source instructions, and provide few-shot examples to improve NL2SQL/NL2DAX/NL2KQL accuracy. Provisioning and management are powered by REST API or PowerShell, with optional integration into Azure AI Foundry for broader AI workflows.

When to Use It

  • Create a new Fabric Data Agent from the portal or via REST API
  • Configure data sources (Lakehouse, Warehouse, PBI Semantic Model, KQL Database, Ontology)
  • Write effective agent-level or data-source-level instructions and add example queries
  • Automate data agent provisioning with PowerShell and the Fabric REST API
  • Integrate a published Fabric Data Agent with Azure AI Foundry or troubleshoot configuration issues

Quick Start

  1. Step 1: Create a Fabric Data Agent in your workspace, add up to 5 data sources (Lakehouse, Warehouse, PBI Semantic Model, KQL DB, Ontology), write agent instructions, add example queries, test in chat, then Publish with a description
  2. Step 2: Automate provisioning with PowerShell: Connect-AzAccount, run New-FabricDataAgent.ps1, configure Configurations.json, and use Update Item Definition API to finalize
  3. Step 3: (Optional) Integrate with Azure AI Foundry or share versioned agents via workspace permissions; monitor and adjust data-source permissions as needed

Best Practices

  • Plan data sources and ensure Read permissions are granted for each target
  • Write concise, explicit agent and data-source instructions aligned with each source
  • Include representative few-shot example queries to boost NL2SQL/NL2DAX/NL2KQL accuracy
  • Test the agent in the chat pane with diverse, representative questions before publishing
  • Version and publish agents carefully; manage Configurations.json and automate changes via REST API/PowerShell

Example Use Cases

  • A multi-source agent querying a Lakehouse and a KQL database with NL2SQL/NL2KQL flows
  • An agent connected to a Power BI semantic model enabling NLQ against metrics and tables
  • Automated provisioning of agents using New-FabricDataAgent.ps1 and the Fabric REST API
  • Azure AI Foundry integration to orchestrate Fabric Data Agents across projects
  • A shared, permissions-scoped agent with multiple data sources and example queries
