fabric-data-agent-remediate
npx machina-cli add skill PatrickGallucci/fabric-skills/fabric-data-agent-remediate --openclaw

Fabric Data Agent Remediate
A structured remediation guide for diagnosing and resolving issues with the Microsoft Fabric Data Agent (preview). It covers the full lifecycle, from tenant configuration through data source setup, query tuning, publishing, and external integration.
When to Use This Skill
- Fabric data agent cannot be created or is missing from the workspace
- Data agent returns errors, empty results, or inaccurate queries
- Tenant settings are misconfigured for Copilot and Azure OpenAI
- Power BI semantic model data source fails to connect (XMLA)
- Cross-region capacity errors prevent query execution
- Example queries fail validation or are ignored by the agent
- Published data agent is inaccessible to shared users
- Azure AI Foundry or Copilot Studio integration issues
- Agent generates incorrect SQL, DAX, or KQL queries
- Conversation history is lost or not persisting
Prerequisites
- Microsoft Fabric capacity: F2 or higher (or Power BI Premium P1+ with Fabric enabled)
- Fabric Admin Portal access (for tenant settings)
- At least one data source with data: lakehouse, warehouse, Power BI semantic model, KQL database, or ontology
- Workspace Contributor (or higher) permissions
- PowerShell 7+ recommended for diagnostic scripts
Quick Diagnosis Workflow
Follow this decision tree to rapidly identify your issue category:
- Cannot create data agent? → See Tenant Settings Checklist
- Data agent created but queries fail? → See Query Remediation
- Data source not appearing or erroring? → See Data Source Connectivity
- Published agent inaccessible? → See Publishing and Sharing
- Integration with Foundry/Teams failing? → See External Integration
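The decision tree above can be sketched as a small triage helper. This is a minimal sketch, not part of the skill's tooling: the category names mirror the guide titles above, and the keyword lists are illustrative assumptions you would extend with your own symptom catalog.

```python
# Map a reported symptom to the remediation guide named in the decision tree.
# Keyword lists are illustrative assumptions; extend them as symptoms are catalogued.
TRIAGE_RULES = [
    ("Tenant Settings Checklist", ["cannot create", "item type missing", "missing from the workspace"]),
    ("Query Remediation", ["empty results", "inaccurate", "wrong results", "sql", "dax", "kql"]),
    ("Data Source Connectivity", ["data source", "xmla", "403", "connection"]),
    ("Publishing and Sharing", ["published", "inaccessible", "not visible", "share"]),
    ("External Integration", ["foundry", "copilot studio", "teams"]),
]

def triage(symptom: str) -> str:
    """Return the first matching guide, or the catch-all workflow when nothing matches."""
    text = symptom.lower()
    for guide, keywords in TRIAGE_RULES:
        if any(keyword in text for keyword in keywords):
            return guide
    return "Quick Diagnosis Workflow"
```

Rules are checked in order, so creation-blocking tenant issues win over the broader query and connectivity buckets.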
Known Limitations (Preview)
These are product limitations, not configuration errors. Do not troubleshoot these as bugs:
| Limitation | Detail |
|---|---|
| Read-only queries | Agent generates only SELECT/read queries; no INSERT, UPDATE, or DELETE |
| English only | Non-English questions, instructions, and examples are not supported |
| No unstructured data | PDF, DOCX, TXT files cannot be used as data sources |
| Lakehouse files | Agent reads lakehouse tables only, not standalone CSV/JSON files |
| Fixed LLM | The underlying LLM model cannot be changed |
| Max 5 data sources | Up to five data sources in any combination per agent |
| Max 100 example queries | Per data source limit for example query pairs |
| Cross-region blocked | Data source and agent capacities must be in the same region |
| Conversation history | May reset during backend updates or model upgrades |
| No PBI example queries | Power BI semantic models do not support sample query/question pairs |
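Several of these limits can be checked before you file a support ticket. Below is a minimal preflight sketch that encodes the limits from the table above; the limits themselves come from the table, but the `config` dictionary shape is an assumption made for illustration, not an official schema.

```python
# Check an agent configuration against the documented preview limits.
# Assumed config shape (illustrative only):
#   {"region": str, "data_sources": [
#       {"name": str, "kind": str, "region": str, "example_queries": int}]}
MAX_DATA_SOURCES = 5
MAX_EXAMPLE_QUERIES = 100

def preflight(config: dict) -> list[str]:
    """Return a list of limit violations; an empty list means the config passes."""
    problems = []
    sources = config.get("data_sources", [])
    if len(sources) > MAX_DATA_SOURCES:
        problems.append(f"too many data sources: {len(sources)} > {MAX_DATA_SOURCES}")
    for src in sources:
        # Cross-region data sources are blocked outright in preview.
        if src.get("region") != config.get("region"):
            problems.append(f"{src['name']}: cross-region (agent={config.get('region')}, source={src.get('region')})")
        if src.get("example_queries", 0) > MAX_EXAMPLE_QUERIES:
            problems.append(f"{src['name']}: over {MAX_EXAMPLE_QUERIES} example queries")
        # Power BI semantic models do not support example query pairs at all.
        if src.get("kind") == "semantic_model" and src.get("example_queries", 0) > 0:
            problems.append(f"{src['name']}: semantic models do not support example queries")
    return problems
```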
Common Error Patterns
| Symptom | Likely Cause | Quick Fix |
|---|---|---|
| "Data agent" item type missing | Tenant setting disabled | Enable "Fabric data agent" in Admin Portal |
| Agent created but no response | Copilot tenant switch off | Enable "Users can use Copilot and other features powered by Azure OpenAI" |
| Cross-geo processing error | Cross-geo settings disabled | Enable both cross-geo processing AND storing settings |
| XMLA connection failure | XMLA endpoints not enabled | Enable "Allow XMLA endpoints" in Integration settings |
| Query returns empty/wrong results | Poor table/column names or missing examples | Add descriptive names and example query pairs |
| "Cannot execute query" error | Capacity region mismatch | Move agent or data source to same region capacity |
| First queries fail after creation | Agent initialization delay | Wait 2-3 minutes after creation before querying |
| Example queries ignored | Invalid SQL/KQL syntax | Validate all example queries match schema exactly |
| Published agent not visible | Permissions not shared | Share agent with Read permission to target users |
| 403 Forbidden on data source | User lacks data access | Grant workspace Contributor or data Read permissions |
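The tenant-setting fixes in the table above amount to one checklist. Here is a minimal sketch that collects them; the labels follow the quick-fix column, but the exact Admin Portal wording may differ slightly, and the input dictionary of current values is an assumed shape for illustration.

```python
# Tenant switches the quick-fix column above says must be enabled.
# Labels follow the table; exact Admin Portal wording may differ slightly.
REQUIRED_TENANT_SETTINGS = [
    "Fabric data agent",
    "Users can use Copilot and other features powered by Azure OpenAI",
    "Cross-geo processing for Azure OpenAI",
    "Cross-geo storing for Azure OpenAI",
    "Allow XMLA endpoints",
]

def missing_settings(current: dict[str, bool]) -> list[str]:
    """Given {setting_name: enabled}, return required settings still disabled or absent."""
    return [name for name in REQUIRED_TENANT_SETTINGS if not current.get(name, False)]
```

A setting absent from the input is treated as disabled, which matches how a half-finished tenant rollout usually presents.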
Diagnostic Script
Run the Fabric Data Agent Diagnostic PowerShell script to validate tenant settings and connectivity prerequisites programmatically.
./scripts/Test-FabricDataAgentConfig.ps1 -WorkspaceId "<guid>" -Verbose
Configuration Validation Template
Use the Configuration Checklist template to document and track your data agent configuration for team handoff or support tickets.
References
- Tenant Settings Checklist - Complete tenant configuration walkthrough
- Query Remediation - NL2SQL/NL2DAX/NL2KQL diagnosis
- Data Source Connectivity - Lakehouse, warehouse, KQL, PBI, ontology issues
- Publishing and Sharing - Publish, share, and consume data agents
- External Integration - Azure AI Foundry, Copilot Studio, Teams integration
- Microsoft Learn: Create a Fabric data agent
- Microsoft Learn: Data agent concepts
- Microsoft Learn: Tenant settings
- Microsoft Learn: Data agent SDK
- Microsoft Learn: Evaluate your data agent
Source
git clone https://github.com/PatrickGallucci/fabric-skills
(skill file: skills/fabric-data-agent-remediate/SKILL.md)
Overview
Diagnose and resolve Microsoft Fabric Data Agent issues across tenant settings, data sources, query generation, publishing, and external integrations. This guide covers the full remediation lifecycle from setup to validation and sharing.
How This Skill Works
Follow a structured remediation workflow: verify prerequisites, run the Quick Diagnosis Workflow to isolate the issue category, then apply category-specific fixes from the Tenant Settings, Data Source Connectivity, Query Remediation, Publishing and Sharing, or External Integration guides. Validate changes with sample queries and re-test end-to-end.
When to Use It
- Fabric data agent cannot be created or is missing from the workspace
- Data agent returns errors, empty results, or inaccurate queries
- Tenant settings are misconfigured for Copilot and Azure OpenAI
- Power BI semantic model data source fails to connect (XMLA)
- Cross-region capacity errors prevent query execution
Quick Start
- Step 1: Verify prerequisites (capacity, admin access, data sources, permissions)
- Step 2: Run the Quick Diagnosis Workflow to identify the issue category
- Step 3: Apply fixes from the appropriate remediation guide and re-validate
Best Practices
- Confirm prerequisites before remediation: Fabric capacity, admin portal access, and at least one data source with data
- Use the Quick Diagnosis Workflow first to categorize issues quickly
- Ensure region alignment: data sources and agent must be in the same region
- Validate fixes with example queries and XMLA connectivity before publishing
- Document changes, notify stakeholders, and re-test end-to-end after fixes
Example Use Cases
- Tenant settings corrected to enable Copilot and Azure OpenAI integration
- XMLA endpoint connection restored for a Power BI semantic model data source
- Cross-region capacity error resolved by aligning the regions of the data sources and agent
- NL2SQL/NL2DAX/NL2KQL query generation fixed and validated against examples
- Published data agent made accessible to shared users after remediation