Authors, updates, deletes, and manages Fabric Dataflows Gen2 via CLI using az rest/curl with base64-encoded Power Query M mashup definitions. Supports refreshes, connections, CI/CD exports.
npx claudepluginhub microsoft/skills-for-fabric --plugin skills-for-fabric
This skill uses the workspace's default tool permissions.
> **Update Check — ONCE PER SESSION (mandatory)**
The first time this skill is used in a session, run the check-updates skill before proceeding.
- GitHub Copilot CLI / VS Code: invoke the check-updates skill.
- Claude Code / Cowork / Cursor / Windsurf / Codex: compare local vs remote package.json version.
- Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find workspace details (including its ID) from a workspace name: list all workspaces, then filter with JMESPath.
- To find item details (including its ID) from a workspace ID, item type, and item name: list all items of that type in the workspace, then filter with JMESPath.
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — READ link first [needed for finding workspace id by its name or item id by its name, item type, and workspace id] |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | |
| Environment URLs | COMMON-CORE.md § Environment URLs | |
| Authentication & Token Acquisition | COMMON-CORE.md § Authentication & Token Acquisition | Wrong audience = 401; read before any auth issue |
| Core Control-Plane REST APIs | COMMON-CORE.md § Core Control-Plane REST APIs | Includes pagination, LRO polling, and rate-limiting patterns |
| Definition Envelope | ITEM-DEFINITIONS-CORE.md § Definition Envelope | Definition payload structure |
| Per-Item-Type Definitions | ITEM-DEFINITIONS-CORE.md § Per-Item-Type Definitions | Support matrix, decoded content, part paths — REST specs, CLI recipes |
| Job Execution | COMMON-CORE.md § Job Execution | |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | az login flows and token acquisition |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § Fabric Control-Plane API via az rest | Always pass --resource; includes pagination and LRO helpers |
| Item CRUD Operations | COMMON-CLI.md § Item CRUD Operations | Create, get/update definition, delete patterns |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | az rest audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | az rest template + token audience/tool matrix |
| Authoring Capability Matrix | DATAFLOWS-AUTHORING-CORE.md § Authoring Capability Matrix | Full CRUD lifecycle, parameterized refresh, Git CI/CD |
| Dataflow Definition Structure | DATAFLOWS-AUTHORING-CORE.md § Dataflow Definition Structure | Read first — 3-part definition: queryMetadata.json, mashup.pq, .platform |
| REST API Surface (Authoring) | DATAFLOWS-AUTHORING-CORE.md § REST API Surface | Create, getDefinition, updateDefinition, delete, run job |
| Power Query M Code Structure | DATAFLOWS-AUTHORING-CORE.md § Power Query M Code Structure | Section documents, multiple queries, parameters, fast copy |
| Connection Model | DATAFLOWS-AUTHORING-CORE.md § Connection Model | Connection kinds, path patterns, connectionId requirements |
| Job Execution Patterns | DATAFLOWS-AUTHORING-CORE.md § Job Execution Patterns | Trigger refresh, LRO polling, parameterized refresh |
| ALM / Git Integration | DATAFLOWS-AUTHORING-CORE.md § ALM / Git Integration | Export/import, CI/CD pipeline, cross-environment overrides |
| Gotchas and Troubleshooting | DATAFLOWS-AUTHORING-CORE.md § Gotchas and Troubleshooting | 15-row issue/cause/resolution table |
| Quick Reference: Authoring Decision Guide | DATAFLOWS-AUTHORING-CORE.md § Quick Reference | Scenario → recommended approach lookup |
| Core Authoring via CLI | authoring-cli-quickref.md § Core Authoring via CLI | Create, get/update definition, delete, refresh az rest one-liners |
| Base64 Encoding Helpers | authoring-cli-quickref.md § Base64 Encoding Helpers | Encode/decode definition parts in bash and PowerShell |
| Definition Manipulation | authoring-cli-quickref.md § Definition Manipulation Patterns | Read-modify-write workflow for definitions |
| Bash Templates | authoring-script-templates.md § Bash Templates | Create dataflow, read-modify-write, refresh with LRO, CI/CD export/import |
| PowerShell Templates | authoring-script-templates.md § PowerShell Templates | Create dataflow, trigger refresh with polling |
| Tool Stack | SKILL.md § Tool Stack | az CLI + jq + base64; verify before first op |
| Connection | SKILL.md § Connection | Workspace/dataflow ID discovery via REST |
| Agentic Workflows | SKILL.md § Agentic Workflows | Start here — discover→formulate→execute→verify |
| Gotchas, Rules, Troubleshooting | SKILL.md § Gotchas, Rules, Troubleshooting | MUST DO / AVOID / PREFER checklists |
| Agent Integration Notes | authoring-cli-quickref.md § Agent Integration Notes | Platform-specific tips (Copilot CLI, Claude Code) |
| Tool | Role | Install |
|---|---|---|
| az CLI | Primary: auth (az login), REST API calls (az rest), token acquisition. | Pre-installed in most dev environments |
| jq | Parse and manipulate JSON responses and definition payloads. | Pre-installed or trivial |
| base64 | Encode/decode definition parts for the REST API. | Built into bash / [Convert]::ToBase64String() in PowerShell |
| curl | Alternative to az rest when raw HTTP control is needed. | Pre-installed |
Agent check — verify before first operation:
az --version 2>/dev/null || echo "INSTALL: https://aka.ms/install-azure-cli"
jq --version 2>/dev/null || echo "INSTALL: apt-get install jq OR brew install jq"
Per COMMON-CLI.md Finding Workspaces and Items in Fabric:
# List workspaces — find workspace ID by name
az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces" \
--query "value[?displayName=='MyWorkspace'].id" --output tsv
# List dataflows in workspace — find dataflow ID by name
WS_ID="<workspaceId>"
az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/dataflows" \
--query "value[?displayName=='MyDataflow'].id" --output tsv
WS_ID="<workspaceId>"
DF_ID="<dataflowId>"
API="https://api.fabric.microsoft.com/v1"
RESOURCE="https://api.fabric.microsoft.com"
Formulate: edit the M code (mashup.pq), update queriesMetadata, and prepare all 3 definition parts.
# 1. Discover — get current definition
RESULT=$(az rest --method post \
--resource "$RESOURCE" \
--url "$API/workspaces/$WS_ID/dataflows/$DF_ID/getDefinition")
# Decode mashup.pq to inspect current queries
echo "$RESULT" | jq -r '.definition.parts[] | select(.path=="mashup.pq") | .payload' | base64 -d
# 2. Formulate — edit M code, prepare new definition parts
# (modify the decoded mashup, re-encode, build JSON payload)
# 3. Execute — update definition
az rest --method post \
--resource "$RESOURCE" \
--url "$API/workspaces/$WS_ID/dataflows/$DF_ID/updateDefinition?updateMetadata=true" \
--body @definition.json
# 4. Verify — trigger refresh and poll
LOCATION=$(az rest --method post \
--resource "$RESOURCE" \
  --url "$API/workspaces/$WS_ID/items/$DF_ID/jobs/instances?jobType=Refresh" \
--headers "Content-Length=0" \
--output none --include-response-headers 2>&1 | grep -i "^location:" | awk '{print $2}' | tr -d '\r')
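The formulate step above (modify the decoded mashup, re-encode, rebuild the JSON payload) can be sketched with jq. A hedged example — the sample RESULT here only mirrors the shape getDefinition returns; the real value comes from the az rest call above:

```shell
# Sample getDefinition response shape (the real one comes from the az rest call above)
RESULT='{"definition":{"parts":[
  {"path":"mashup.pq","payload":"b2xk","payloadType":"InlineBase64"},
  {"path":"queryMetadata.json","payload":"e30=","payloadType":"InlineBase64"}]}}'

# Re-encode the edited M code (base64 -w0 avoids line wrapping on Linux)
NEW_MASHUP_B64=$(printf '%s' 'section Section1; shared Q = 1;' | base64 -w0)

# Swap only the mashup.pq payload; every other part passes through unchanged
echo "$RESULT" | jq --arg p "$NEW_MASHUP_B64" \
  '.definition.parts |= map(if .path == "mashup.pq" then .payload = $p else . end)' \
  > definition.json
```

The output keeps the `{"definition": {"parts": [...]}}` envelope, so it can be passed directly as `--body @definition.json` in the update step.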
For full authoring gotchas: DATAFLOWS-AUTHORING-CORE.md Gotchas and Troubleshooting. For CLI-specific issues: COMMON-CLI.md Gotchas & Troubleshooting (CLI-Specific).
MUST DO
- az login first — all az rest calls use the active session. No session → 401.
- --resource "https://api.fabric.microsoft.com" on every call — wrong audience = 401.
- payloadType: "InlineBase64" for every definition part.
- getDefinition, updateDefinition (with definition), and job execution return 202; poll via the Location header.
- GET /v1/workspaces/{id} and check capacityId.
- formatVersion set to "202502" in queryMetadata.json.
- loadEnabled: true for queries that should write output on refresh.

AVOID
- A connectionId that doesn't exist in the user's connection store — it will fail at refresh.
- GET on getDefinition — it's a POST endpoint; GET returns 405.
- Discarding the Location header from 202 responses.
- Duplicate displayName values — not enforced but causes confusion.

PREFER
- az rest over raw curl — handles token acquisition and refresh automatically.
- getDefinition before updateDefinition — read-modify-write prevents accidental data loss.
- jq for JSON manipulation — build definition payloads programmatically.
- "Automatic" for parameter type in job execution — lets the engine infer from the definition.
- ?updateMetadata=true on updateDefinition — ensures .platform changes (display name) are applied.
- Shell variables (WS_ID, DF_ID, API, RESOURCE) for script reuse.

| Symptom | Fix |
|---|---|
| 401 Unauthorized | Verify az login is active; check --resource "https://api.fabric.microsoft.com" |
| 405 Method Not Allowed on getDefinition | Use POST, not GET — getDefinition follows LRO pattern |
| updateDefinition silently drops queries | Send all 3 parts (queryMetadata.json, mashup.pq, .platform) |
| Refresh fails on new dataflow | Ensure connectionId values exist in user's Fabric connection store |
| formatVersion mismatch error | Set formatVersion to "202502" in queryMetadata.json |
| Query doesn't produce output | Set loadEnabled: true in queriesMetadata |
| Fast copy not engaged | Add [StagingDefinition = [Kind = "FastCopy"]] before section in mashup.pq |
| LRO polling returns 404 | Use Location header URL — don't construct operation URLs manually |
| 429 Too Many Requests | Respect Retry-After header; implement exponential backoff |
| base64 decode produces garbage | Ensure no trailing newlines; use base64 -w0 (Linux) for encoding |
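For the 429 row above, the backoff logic can be sketched as a small helper. This is only a sketch; next_delay is a hypothetical name, not part of any Fabric tooling:

```shell
# next_delay RETRY_AFTER ATTEMPT — prints the seconds to wait before retrying:
# honor the server's Retry-After when present, else exponential 2^attempt, capped at 60s.
next_delay() {
  local retry_after="$1" attempt="$2"
  if [ -n "$retry_after" ]; then
    echo "$retry_after"
  else
    local d=$(( 1 << attempt ))
    [ "$d" -gt 60 ] && d=60
    echo "$d"
  fi
}

# Typical shape around an az rest call:
#   RA=$(... capture Retry-After from the 429 response headers ...)
#   sleep "$(next_delay "$RA" "$attempt")"
```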
# 1. Prepare mashup.pq
MASHUP='section Section1;
shared Output = let
Source = #table(type table [Name = text, Value = number],
{{"Alpha", 100}, {"Beta", 200}, {"Gamma", 300}})
in Source;'
# 2. Prepare queryMetadata.json
QUERY_META='{
"formatVersion": "202502",
"queriesMetadata": {
"Output": { "queryId": "00000000-0000-0000-0000-000000000001",
"queryName": "Output", "loadEnabled": true }
}
}'
# 3. Prepare .platform
PLATFORM='{"$schema":"https://developer.microsoft.com/json-schemas/fabric/gitIntegration/platformProperties/2.0.0/schema.json","metadata":{"type":"Dataflow","displayName":"MyDataflow"},"config":{"version":"2.0","logicalId":"00000000-0000-0000-0000-000000000001"}}'
# 4. Base64-encode all parts
MASHUP_B64=$(echo -n "$MASHUP" | base64 -w0)
META_B64=$(echo -n "$QUERY_META" | base64 -w0)
PLAT_B64=$(echo -n "$PLATFORM" | base64 -w0)
# 5. Create the dataflow
az rest --method post \
--url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items" \
--resource "https://api.fabric.microsoft.com" \
--headers "Content-Type=application/json" \
--body "{
\"displayName\": \"MyDataflow\",
\"type\": \"Dataflow\",
\"definition\": {
\"parts\": [
{\"path\": \"mashup.pq\", \"payload\": \"${MASHUP_B64}\", \"payloadType\": \"InlineBase64\"},
{\"path\": \"queryMetadata.json\", \"payload\": \"${META_B64}\", \"payloadType\": \"InlineBase64\"},
{\"path\": \".platform\", \"payload\": \"${PLAT_B64}\", \"payloadType\": \"InlineBase64\"}
]
}
}"
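A quick sanity check on the encoding step above: round-trip each payload through base64 before sending, which catches the trailing-newline corruption noted in the troubleshooting table. check_roundtrip is a hypothetical helper name:

```shell
# check_roundtrip STRING — succeeds only if encode → decode reproduces STRING exactly.
# printf '%s' matters here: echo would append a newline and corrupt the comparison.
check_roundtrip() {
  printf '%s' "$1" | base64 -w0 | base64 -d | cmp -s - <(printf '%s' "$1")
}

check_roundtrip "$MASHUP" && echo "mashup.pq payload OK"
```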
# Trigger refresh (returns 202 + Location header for polling)
LOCATION=$(az rest --method post \
  --url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/jobs/instances?jobType=Refresh" \
--resource "https://api.fabric.microsoft.com" \
--headers "Content-Length=0" \
--output none --include-response-headers 2>&1 | grep -i "^location:" | awk '{print $2}' | tr -d '\r')
# Poll until complete
while true; do
STATUS=$(az rest --method get --url "$LOCATION" \
--resource "https://api.fabric.microsoft.com" --query "status" -o tsv)
echo "Status: $STATUS"
[[ "$STATUS" == "Completed" || "$STATUS" == "Failed" ]] && break
sleep 10
done
# Get current definition (POST, not GET — returns 202)
LOCATION=$(az rest --method post \
--url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/getDefinition" \
--resource "https://api.fabric.microsoft.com" \
--headers "Content-Length=0" \
--output none --include-response-headers 2>&1 | grep -i "^location:" | awk '{print $2}' | tr -d '\r')
# Poll for the definition
DEF=$(az rest --method get --url "${LOCATION}" \
--resource "https://api.fabric.microsoft.com")
# Decode, modify, re-encode, then updateDefinition with all 3 parts
az rest --method post \
--url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/updateDefinition" \
--resource "https://api.fabric.microsoft.com" \
--headers "Content-Type=application/json" \
--body "{\"definition\": {\"parts\": [ ... ]}}"
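One way to assemble the full three-part body for that call — per the gotcha that updateDefinition silently drops queries unless all parts are sent. A sketch; build_definition_body is a hypothetical helper name:

```shell
# build_definition_body MASHUP_FILE META_FILE PLATFORM_FILE
# Prints an updateDefinition body containing all three parts —
# sending fewer than three silently drops the missing queries.
build_definition_body() {
  jq -n \
    --arg m "$(base64 -w0 < "$1")" \
    --arg q "$(base64 -w0 < "$2")" \
    --arg p "$(base64 -w0 < "$3")" \
    '{definition: {parts: [
       {path: "mashup.pq",          payload: $m, payloadType: "InlineBase64"},
       {path: "queryMetadata.json", payload: $q, payloadType: "InlineBase64"},
       {path: ".platform",          payload: $p, payloadType: "InlineBase64"}]}}'
}

# Usage:
#   build_definition_body mashup.pq queryMetadata.json .platform > definition.json
#   az rest --method post ... --body @definition.json
```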