Monitors, inspects, and discovers Fabric Dataflows Gen2 via read-only CLI (az rest/curl). Lists dataflows, decodes base64 definitions for Power Query M and metadata, discovers parameters, polls refresh status, retrieves job history.
`npx claudepluginhub microsoft/skills-for-fabric --plugin skills-for-fabric`

This skill uses the workspace's default tool permissions.
> **Update Check — ONCE PER SESSION (mandatory)**
The first time this skill is used in a session, run the check-updates skill before proceeding.
- GitHub Copilot CLI / VS Code: invoke the `check-updates` skill.
- Claude Code / Cowork / Cursor / Windsurf / Codex: compare local vs remote package.json version.
- Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find workspace details (including its ID) from a workspace name: list all workspaces, then use JMESPath filtering.
- To find a dataflow by name: list all dataflows in the workspace and filter by `displayName` client-side — there is no server-side name filter.
- `getDefinition` is a POST, not a GET — even though it only reads data.
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — READ link first |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | |
| Environment URLs | COMMON-CORE.md § Environment URLs | |
| Authentication & Token Acquisition | COMMON-CORE.md § Authentication & Token Acquisition | Wrong audience = 401; read before any auth issue |
| Core Control-Plane REST APIs | COMMON-CORE.md § Core Control-Plane REST APIs | Includes pagination, LRO polling, and rate-limiting patterns |
| Job Execution | COMMON-CORE.md § Job Execution | |
| Gotchas, Best Practices & Troubleshooting | COMMON-CORE.md § Gotchas, Best Practices & Troubleshooting | |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | az login flows and token acquisition |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § Fabric Control-Plane API via az rest | Always pass --resource; includes pagination and LRO helpers |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | az rest audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | az rest template + token audience/tool matrix |
| Consumption Capability Matrix | DATAFLOWS-CONSUMPTION-CORE.md § Consumption Capability Matrix | Read first — shows what ops are available |
| REST API Surface (Consumption) | DATAFLOWS-CONSUMPTION-CORE.md § REST API Surface | List, Get, Parameters, getDefinition, Jobs |
| Dataflow Definition Exploration | DATAFLOWS-CONSUMPTION-CORE.md § Dataflow Definition Exploration | Decode mashup.pq, queryMetadata.json, .platform |
| Parameter Discovery and Analysis | DATAFLOWS-CONSUMPTION-CORE.md § Parameter Discovery and Analysis | Types, formats, M code patterns |
| Refresh and Job Monitoring | DATAFLOWS-CONSUMPTION-CORE.md § Refresh and Job Monitoring | LRO pattern, job instances, polling best practices |
| Agentic Exploration Pattern | DATAFLOWS-CONSUMPTION-CORE.md § Agentic Exploration Pattern | 6-step discovery sequence |
| Security and Permissions Model | DATAFLOWS-CONSUMPTION-CORE.md § Security and Permissions Model | Permission matrix by operation |
| Common Errors | DATAFLOWS-CONSUMPTION-CORE.md § Common Errors | Error codes and resolutions |
| Gotchas and Troubleshooting Reference | DATAFLOWS-CONSUMPTION-CORE.md § Gotchas and Troubleshooting | 12 numbered issues with cause + resolution |
| Quick Reference One-Liners | consumption-cli-quickref.md | az rest one-liners for all consumption ops |
| Discovery Patterns | discovery-queries.md | Definition decoding, parameter extraction, connection analysis |
| Script Templates | script-templates.md | Copy-paste bash and PowerShell templates |
| Tool Stack | SKILL.md § Tool Stack | |
| Connection | SKILL.md § Connection | |
| Agentic Exploration ("Chat With My Dataflows") | SKILL.md § Agentic Exploration | Start here for dataflow exploration |
| Tool | Role | Install |
|---|---|---|
| az CLI | Primary: Auth (az login), Fabric REST API via az rest | Pre-installed in most dev environments |
| curl | Alternative HTTP client for REST calls | Pre-installed |
| jq | Parse JSON responses, extract fields, format output | Pre-installed or trivial |
| base64 | Decode definition parts from base64 | Built into bash; PowerShell uses [Convert]::FromBase64String |
| bash/pwsh | Script execution | Pre-installed |
Agent check — verify before first operation:
az account show >/dev/null 2>&1 || echo "RUN: az login"
command -v jq >/dev/null 2>&1 || echo "INSTALL: apt-get install jq OR brew install jq"
Per COMMON-CLI.md Finding Workspaces and Items in Fabric:
# Find workspace ID by name
WS_ID=$(az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces" \
--query "value[?displayName=='My Workspace'].id" --output tsv)
# Find dataflow ID by name within workspace
DF_ID=$(az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/dataflows" \
--query "value[?displayName=='Sales Data Pipeline'].id" --output tsv)
# Set once at script top
WS_ID="<workspaceId>"
DF_ID="<dataflowId>"
API="https://api.fabric.microsoft.com/v1"
AZ="az rest --resource https://api.fabric.microsoft.com"
Run these in order to fully explore a workspace's dataflows. See references/discovery-queries.md for extended patterns.
# 1. List workspaces → find target
az rest --method get --resource "https://api.fabric.microsoft.com" \
--url "$API/workspaces" --query "value[].{name:displayName, id:id}" -o table
# 2. List dataflows → enumerate all
az rest --method get --resource "https://api.fabric.microsoft.com" \
--url "$API/workspaces/$WS_ID/dataflows" \
--query "value[].{name:displayName, id:id, desc:description}" -o table
# 3. Get dataflow properties
az rest --method get --resource "https://api.fabric.microsoft.com" \
--url "$API/workspaces/$WS_ID/dataflows/$DF_ID"
# 4. Discover parameters
az rest --method get --resource "https://api.fabric.microsoft.com" \
--url "$API/workspaces/$WS_ID/dataflows/$DF_ID/parameters" \
--query "value[].{name:name, type:type, required:isRequired, default:defaultValue}" -o table
# 5. Get definition → decode mashup.pq
RESPONSE=$(az rest --method post --resource "https://api.fabric.microsoft.com" \
--url "$API/workspaces/$WS_ID/dataflows/$DF_ID/getDefinition")
echo "$RESPONSE" | jq -r '.definition.parts[] | select(.path=="mashup.pq") | .payload' | base64 --decode
# 6. Check job history
az rest --method get --resource "https://api.fabric.microsoft.com" \
--url "$API/workspaces/$WS_ID/items/$DF_ID/jobs/instances" \
--query "value[].{status:status, type:invokeType, start:startTimeUtc, end:endTimeUtc, error:failureReason}" -o table
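Each list call above returns at most one page; Fabric list APIs paginate with a `continuationToken`. The loop can be sketched in Python, where `fetch` is a stand-in for whatever HTTP call you actually use (az rest, curl):

```python
def list_all(fetch):
    """Collect every page of a Fabric list API by following continuationToken.

    `fetch` is any callable(token) -> dict shaped like a Fabric list response:
    {"value": [...], "continuationToken": "..."} (token absent on the last page).
    """
    items, token = [], None
    while True:
        page = fetch(token)
        items.extend(page.get("value", []))
        token = page.get("continuationToken")
        if not token:  # absent or null means no more pages
            return items

# Demo with a fake two-page response
pages = {None: {"value": [1, 2], "continuationToken": "t1"},
         "t1": {"value": [3]}}
print(list_all(lambda t: pages[t]))  # → [1, 2, 3]
```

The same shape works in bash by re-issuing the list URL with `?continuationToken=<token>` until the field disappears from the response.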
For full platform gotchas: DATAFLOWS-CONSUMPTION-CORE.md Gotchas and Troubleshooting Reference and COMMON-CLI.md Gotchas & Troubleshooting (CLI-Specific).
- Run az login first — az rest uses the active session. No session → cryptic failure.
- Always pass `--resource "https://api.fabric.microsoft.com"` — wrong audience = 401.
- Paginate by following `continuationToken` until absent/null.
- `getDefinition` may return 202 Accepted with a Location header; poll until complete.
- `getDefinition` is NOT a GET endpoint — it is POST (common mistake).
- Respect `Retry-After` headers on 429s.
- You cannot call `getDefinition` with Viewer role — it requires Read+Write (Contributor+).
- Prefer az rest over raw curl — it handles auth automatically.
- Use jq for response parsing — cleaner than shell string manipulation.
- Use `--query` for simple field extraction directly in az rest.
- Set variables (`WS_ID`, `DF_ID`, `API`) for script reuse.

| Symptom | Cause | Fix |
|---|---|---|
| 401 Unauthorized | Token expired or wrong audience | az login; ensure `--resource "https://api.fabric.microsoft.com"` |
| 403 Forbidden on getDefinition | Viewer role (Read-only) | Requires Contributor role or higher (Read+Write) |
| 404 Not Found | Wrong workspace or dataflow ID | Re-discover via List Dataflows API |
| getDefinition returns 202 | Large definition or server load | Poll the Location header URL until operation completes |
| Empty parameters array | Dataflow has no parameters | Expected behavior — check mashup.pq for IsParameterQuery |
| Base64 decode shows garbled text | BOM in encoded content | Strip UTF-8 BOM (\xEF\xBB\xBF) when decoding |
| 429 TooManyRequests | Rate limited | Respect Retry-After header; implement exponential backoff |
| Duplicate results in list | Re-using stale continuationToken | Always use the token from the most recent response |
| OperationNotSupportedForItem | Wrong item type | Verify item is type Dataflow via Get Item |
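The BOM row above is worth a concrete sketch: decoding with Python's `utf-8-sig` codec strips a leading UTF-8 BOM when present and is harmless when absent. The payload below is a synthetic stand-in for a real getDefinition part:

```python
import base64

def decode_part(payload: str) -> str:
    """Decode a base64 definition part, stripping a UTF-8 BOM if present."""
    # utf-8-sig removes a leading \xef\xbb\xbf; plain utf-8 would keep it as \ufeff
    return base64.b64decode(payload).decode("utf-8-sig")

# Simulate a part exported with a BOM
payload = base64.b64encode(b"\xef\xbb\xbflet x = 1").decode()
print(decode_part(payload))  # → let x = 1
```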
az rest --method get \
--url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items?type=Dataflow" \
--resource "https://api.fabric.microsoft.com" \
--query "value[].{Name:displayName, Id:id, Type:type}" -o table
# Step 1: Request definition (POST — returns 202 with Location header).
# az rest does not expose response headers, so use curl with an az-issued token.
TOKEN=$(az account get-access-token --resource "https://api.fabric.microsoft.com" \
  --query accessToken -o tsv)
LOCATION=$(curl -s -D - -o /dev/null -X POST \
  -H "Authorization: Bearer ${TOKEN}" -H "Content-Length: 0" \
  "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/getDefinition" \
  | grep -i "^location:" | awk '{print $2}' | tr -d '\r')
# Step 2: Poll until definition is ready (re-run if the response is still 202)
DEF=$(az rest --method get --url "${LOCATION}" \
  --resource "https://api.fabric.microsoft.com")
# Step 3: Decode mashup.pq to see the Power Query M code
echo "$DEF" | python3 -c "
import json, base64, sys
parts = json.load(sys.stdin)['definition']['parts']
for p in parts:
if p['path'] == 'mashup.pq':
print(base64.b64decode(p['payload']).decode('utf-8'))
"
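When getDefinition returns 202, the poll in Step 2 may need repeating. A sketch of that loop in Python, honoring Retry-After; the `get` callable here is a stand-in for the real HTTP GET against the Location URL:

```python
import time

def poll_operation(get, location, max_attempts=30):
    """Poll an LRO Location URL until it stops returning 202 Accepted.

    `get` is any callable(url) -> (status, headers, body) tuple.
    """
    for _ in range(max_attempts):
        status, headers, body = get(location)
        if status != 202:
            return body
        # Respect the server's Retry-After hint; default to 2 seconds
        time.sleep(float(headers.get("Retry-After", 2)))
    raise TimeoutError("operation did not complete in time")

# Demo: first call still running (Retry-After 0), then done
responses = iter([(202, {"Retry-After": "0"}, None),
                  (200, {}, {"status": "Succeeded"})])
print(poll_operation(lambda url: next(responses), "https://example/op"))
# → {'status': 'Succeeded'}
```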
# Get recent job instances for a dataflow
az rest --method get \
--url "https://api.fabric.microsoft.com/v1/workspaces/${WS_ID}/items/${DF_ID}/jobs/instances?limit=5" \
--resource "https://api.fabric.microsoft.com" \
--query "value[].{Status:status, Start:startTimeUtc, End:endTimeUtc, Id:id}" -o table
# After decoding the definition (see Example 2), extract parameters:
echo "$DEF" | python3 -c "
import json, base64, sys
parts = json.load(sys.stdin)['definition']['parts']
for p in parts:
if p['path'] == 'queryMetadata.json':
meta = json.loads(base64.b64decode(p['payload']).decode('utf-8'))
for qname, qmeta in meta.get('queriesMetadata', {}).items():
if qmeta.get('queryGroupId') == 'parameters' or 'IsParameterQuery' in str(qmeta):
print(f'Parameter: {qname}')
"