Scans Power BI/Fabric workspaces or tenants for Gen1 dataflows, assesses save-as readiness via seven risk signals, generates snapshots, and executes Gen2.1 upgrades via CLI REST APIs.
npx claudepluginhub microsoft/skills-for-fabric --plugin skills-for-fabric

This skill uses the workspace's default tool permissions.
> **Update Check — ONCE PER SESSION (mandatory)**
The first time this skill is used in a session, run the check-updates skill before proceeding.
- GitHub Copilot CLI / VS Code: invoke the check-updates skill.
- Claude Code / Cowork / Cursor / Windsurf / Codex: compare the local vs remote package.json version.
- Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find workspace details (including its ID) from a workspace name: list all workspaces, then use JMESPath filtering
- To find item details (including its ID) from a workspace ID, item type, and item name: list all items of that type in the workspace, then use JMESPath filtering
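The select-by-name pattern behind both notes can be sketched locally with jq (the workspace list, IDs, and names below are invented for illustration; against the live API the same filter runs as a JMESPath `--query` on `az rest`):

```shell
# Hypothetical workspace list in the shape the Fabric API returns
WORKSPACES='{"value":[
  {"id":"aaa-111","displayName":"Sales Analytics"},
  {"id":"bbb-222","displayName":"Finance"}
]}'

# Same select-by-name logic as the JMESPath filter, expressed in jq
WS_ID=$(echo "$WORKSPACES" | jq -r '.value[] | select(.displayName=="Sales Analytics") | .id')
echo "$WS_ID"
```

The live equivalent filters with `--query "value[?displayName=='Sales Analytics'].id" -o tsv`; both approaches apply one exact-match predicate over the listing response.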
A save-as companion for creating upgraded Gen2.1 copies from Power BI Gen1 dataflows using readiness assessment and guarded execution.
We currently cannot perform an in-place migration of your dataflow. We can use save-as to create an upgraded Gen2.1 copy while preserving the original Gen1 dataflow.
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items | Mandatory — READ link first |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology | |
| Environment URLs | COMMON-CORE.md § Environment URLs | |
| Authentication & Token Acquisition | COMMON-CORE.md § Authentication | Wrong audience = 401 |
| Core Control-Plane REST APIs | COMMON-CORE.md § Core REST APIs | Pagination, LRO polling, rate limits |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection | |
| Authentication Recipes | COMMON-CLI.md § Auth Recipes | az login flows and token acquisition |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § az rest | Always pass --resource |
| Gotchas & Troubleshooting (CLI) | COMMON-CLI.md § Gotchas | az rest audience, shell escaping |
| Quick Reference | COMMON-CLI.md § Quick Ref | Token audience / tool matrix |
| Dataflow Definition Structure | DATAFLOWS-AUTHORING-CORE.md § Definition | 3-part format for Gen2 CI/CD |
| Consumption Capability Matrix | DATAFLOWS-CONSUMPTION-CORE.md § Capabilities | Read-only discovery patterns |
| Upgrade CLI Quick Reference | upgrade-cli-quickref.md | All az rest one-liners for scanning & save-as |
| Risk Assessment Guide | risk-assessment-guide.md | Risk signal detection logic & API calls |
| Tool | Role | Install |
|---|---|---|
| az CLI | Primary: auth (az login) and REST API calls (az rest) against both the Fabric and Power BI APIs. | Pre-installed in most dev environments |
| jq | Parse and filter JSON responses (dataflow lists, risk signal extraction). | Pre-installed or trivial |
| base64 | Decode dataflow definitions for inspection. | Built into bash / [Convert]::FromBase64String() in PowerShell |
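As a quick sketch of the base64 inspection step (the payload below is a made-up minimal Power Query M section document, not a real dataflow definition):

```shell
# "section Section1;" encoded — a minimal stand-in for a real mashup payload
ENCODED="c2VjdGlvbiBTZWN0aW9uMTs="
DECODED=$(echo "$ENCODED" | base64 --decode)
echo "$DECODED"
```

Real Gen2 definition parts are returned base64-encoded inside the getDefinition response; decode each part the same way before inspecting its M code.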
Agent check — verify before first operation:
az --version 2>/dev/null || echo "INSTALL: https://aka.ms/install-azure-cli"
jq --version 2>/dev/null || echo "INSTALL: apt-get install jq OR brew install jq"
This skill uses two distinct API audiences. Using the wrong audience returns 401.
| API | Audience (--resource) | Use For |
|---|---|---|
| Fabric Items API | https://api.fabric.microsoft.com | List Gen2 dataflows (Fabric-native), workspace discovery |
| Power BI REST API | https://analysis.windows.net/powerbi/api | Gen1 dataflow discovery, saveAsNativeArtifact, data sources, upstream dataflows, Admin API scanning |
# Fabric Items API — list Gen2 dataflows in a workspace
az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/dataflows"
# Power BI REST API — list all dataflows (Gen1 + Gen2) in a workspace
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/dataflows"
# Power BI Admin API — list all dataflows tenant-wide (requires admin role)
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/admin/dataflows"
Goal: "Should I use save-as, and what will happen when I create a Gen2.1 copy?"
Follow this sequence for every save-as assessment:
The Power BI REST API returns a generation property (value 1 or 2) on each dataflow. This is the preferred detection method — a single API call per workspace.
WS_ID="<workspaceId>"
RESOURCE_PBI="https://analysis.windows.net/powerbi/api"
# List all dataflows — the `generation` property distinguishes Gen1 from Gen2
ALL_DATAFLOWS=$(az rest --method get \
--resource "$RESOURCE_PBI" \
--url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/dataflows" \
--query "value[].{id:objectId, name:name, generation:generation, modelUrl:modelUrl, configuredBy:configuredBy}" -o json)
# Filter Gen1 dataflows
echo "$ALL_DATAFLOWS" | jq '[.[] | select(.generation == 1)]'
# Filter Gen2 dataflows
echo "$ALL_DATAFLOWS" | jq '[.[] | select(.generation == 2)]'
Tip: A modelUrl pointing to dfs.core.windows.net additionally indicates BYOSA (customer-managed storage) — a save-as blocker.
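A minimal sketch of that check, using jq on a hypothetical dataflow entry (the dataflow name and storage account are invented):

```shell
# Made-up Gen1 dataflow entry with customer-managed (BYOSA) storage
DF='{"name":"FinanceDaily","modelUrl":"https://examplestore.dfs.core.windows.net/powerbi/model.json"}'

# Flag BYOSA when modelUrl points at an ADLS Gen2 endpoint
VERDICT=$(echo "$DF" | jq -r 'if (.modelUrl // "") | test("dfs\\.core\\.windows\\.net") then "BYOSA: save-as blocked" else "managed storage" end')
echo "$VERDICT"
```

The `// ""` default keeps the test from erroring on dataflows where modelUrl is null.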
Requires Fabric administrator role or service principal with Tenant.Read.All scope. Rate limited to 200 requests/hour.
RESOURCE_PBI="https://analysis.windows.net/powerbi/api"
# List ALL dataflows in the tenant
ADMIN_DATAFLOWS=$(az rest --method get \
--resource "$RESOURCE_PBI" \
--url "https://api.powerbi.com/v1.0/myorg/admin/dataflows" \
--query "value[].{id:objectId, name:name, workspaceId:workspaceId, modelUrl:modelUrl, configuredBy:configuredBy}" \
-o json)
# Filter Gen1 dataflows — those with a modelUrl indicate CDM/Gen1 storage
# Note: Admin API may not expose the `generation` property; use modelUrl as fallback
echo "$ADMIN_DATAFLOWS" | jq '[.[] | select(.modelUrl != null and .modelUrl != "")]'
Note: The Admin API supports $filter, $top, and $skip for pagination on large tenants.
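A paging loop over $top/$skip can be sketched as follows; fetch_page is a canned stand-in for the real az rest call to /admin/dataflows, and the page size and data are invented:

```shell
TOP=2  # page size passed as $top

# Stand-in for: az rest ... --url ".../admin/dataflows?\$top=$TOP&\$skip=<skip>"
fetch_page() {
  case "$1" in
    0) echo '{"value":[{"name":"A"},{"name":"B"}]}' ;;
    2) echo '{"value":[{"name":"C"}]}' ;;
    *) echo '{"value":[]}' ;;
  esac
}

SKIP=0
ALL='[]'
while :; do
  PAGE=$(fetch_page "$SKIP" | jq '.value')
  COUNT=$(echo "$PAGE" | jq 'length')
  ALL=$(echo "$ALL $PAGE" | jq -s 'add')   # concatenate pages into one array
  if [ "$COUNT" -lt "$TOP" ]; then break; fi   # short page = last page
  SKIP=$((SKIP + TOP))
done
echo "$ALL" | jq 'length'
```

Stopping on the first short page avoids one extra empty-page request when the total is not a multiple of $top.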
For each Gen1 dataflow found, evaluate seven risk signals. See risk-assessment-guide.md for detailed API calls.
| # | Risk Signal | Detection Method | Impact |
|---|---|---|---|
| 1 | Incremental refresh | Check dataflow definition for incremental refresh policy configuration | ⚠️ Schedule migrates in disabled state; must re-enable and validate |
| 2 | BYOSA / Custom ADLS Gen2 storage | Check modelUrl — if points to customer storage account (not Power BI managed) | ❌ Data stays in old storage; Gen2 CI/CD uses Fabric-managed storage |
| 3 | Power Automate / API triggers | Check for external orchestration referencing the Gen1 dataflow ID | ⚠️ All integrations must update to new Gen2 artifact ID |
| 4 | Downstream pipeline dependencies | Check Fabric pipelines for dataflow activity references | ⚠️ Pipeline activities reference dataflow by ID; must re-bind |
| 5 | Linked / computed entities | Inspect dataflow definition for entity references to other dataflows | ⚠️/❌ Cross-dataflow references may break if source dataflows are not saved first |
| 6 | DirectQuery connections | Inspect data source types in definition | ❌ DirectQuery not supported in Gen2 CI/CD dataflows |
| 7 | Caller is not owner / insufficient role | Compare configuredBy against az account show --query user.name -o tsv — or attempt call and catch DataflowUnauthorizedError | ❌ saveAsNativeArtifact requires the caller to be the dataflow owner or have Contributor/Admin in the source workspace; Viewer/Member without ownership cannot execute save-as |
| Category | Criteria | Action |
|---|---|---|
| ✅ Safe | No risk signals detected | Create a Gen2.1 save-as copy with saveAsNativeArtifact |
| ⚠️ Manual followups | Risk signals 1, 3, 4, or 5 (non-blocking) | Execute save-as, then remediate flagged issues |
| ❌ Blocked | Risk signals 2, 6, or 7 (blocking) | Cannot execute save-as until blocker is resolved |
Tip — detect ownership before save-as: The configuredBy field in the dataflow list response contains the owner's email. Compare it against the currently logged-in user (az account show --query user.name -o tsv). If they don't match and your workspace role is below Contributor, flag the dataflow as ❌ Blocked (signal 7) and escalate to the owner.
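The comparison itself is a one-liner; the values below are stand-ins (in practice ME comes from az account show --query user.name -o tsv and OWNER from the dataflow's configuredBy field):

```shell
OWNER="alice@contoso.com"   # stand-in for the dataflow's configuredBy
ME="bob@contoso.com"        # stand-in for the logged-in user's UPN

if [ "$OWNER" != "$ME" ]; then
  STATUS="signal 7: caller is not owner ($OWNER); verify Contributor/Admin role"
else
  STATUS="owner match: save-as permitted"
fi
echo "$STATUS"
```

Note that a mismatch alone is not a blocker: only flag signal 7 when the caller also lacks Contributor/Admin in the source workspace.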
## Save-As Readiness Snapshot
| Workspace | Dataflow | Type | Readiness | Risk Signals | Recommendation |
|---|---|---|---|---|---|
| Sales Analytics | SalesETL | Gen1 | ✅ Safe | None | Save as Gen2.1 copy now |
| Sales Analytics | CustomerLoad | Gen1 | ⚠️ Manual | Incremental refresh, Pipeline dep | Save as Gen2.1 copy, then re-enable schedule & update pipeline |
| Finance | FinanceDaily | Gen1 | ❌ Blocked | BYOSA storage | Resolve storage dependency first |
{
"snapshotDate": "2025-04-13T10:00:00Z",
"summary": { "total": 3, "safe": 1, "manual": 1, "blocked": 1 },
"dataflows": [
{
"workspaceName": "Sales Analytics",
"workspaceId": "...",
"dataflowName": "SalesETL",
"dataflowId": "...",
"type": "Gen1",
"readiness": "safe",
"riskSignals": [],
"recommendation": "Save as Gen2.1 copy now",
"saveAsPath": "saveAsNativeArtifact"
}
]
}
Save the JSON to a file: pipe to jq '.' > readiness-snapshot.json
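The summary block can be derived from the dataflows array rather than maintained by hand; a sketch using the readiness values from the snapshot shape above (the three-entry array is a minimal example):

```shell
# Minimal snapshot carrying only the field the summary needs
SNAPSHOT='{"dataflows":[{"readiness":"safe"},{"readiness":"manual"},{"readiness":"blocked"}]}'

SUMMARY=$(echo "$SNAPSHOT" | jq '{
  total:   (.dataflows | length),
  safe:    ([.dataflows[] | select(.readiness == "safe")]    | length),
  manual:  ([.dataflows[] | select(.readiness == "manual")]  | length),
  blocked: ([.dataflows[] | select(.readiness == "blocked")] | length)
}')
echo "$SUMMARY"
```

Computing the counts keeps the summary consistent with the dataflows array even after entries are added or re-assessed.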
Goal: Invoke save-as and capture outcomes safely.
saveAsNativeArtifact API

POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/dataflows/{gen1DataflowId}/saveAsNativeArtifact
This is a Preview API. It creates a new Gen2.1 CI/CD artifact copy while preserving the original Gen1 dataflow.
WS_ID="<workspaceId>"
GEN1_ID="<gen1DataflowId>"
# Write body to a temp file — az rest wraps inline --body in an envelope
# on some platforms, causing "saveAsRequest is a required parameter" errors.
cat > /tmp/save-as-body.json <<'EOF'
{
"displayName": "MyDataflow_Gen2CICD",
"description": "Saved as Gen2.1 copy from Gen1",
"includeSchedule": true,
"targetWorkspaceId": "<targetWorkspaceId>"
}
EOF
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/dataflows/$GEN1_ID/saveAsNativeArtifact" \
--headers "Content-Type=application/json" \
--body @/tmp/save-as-body.json
Gotcha — inline body: Passing JSON inline via --body '{...}' can cause az rest to wrap the payload in an extra envelope, resulting in "saveAsRequest is a required parameter" errors. Always use a file-based body (--body @file.json) for this endpoint.
Gotcha — Windows az.cmd: On Windows, omit -o json from saveAsNativeArtifact calls — the flag produces "A value that is not valid (json) was specified for the outputFormat parameter" when routed through az.cmd. Capture the output without -o json and parse it with ConvertFrom-Json in PowerShell, or pipe to jq in bash.
Gotcha — not idempotent (duplicate artifacts on retry): saveAsNativeArtifact creates a new artifact every time it is called. If a batch is interrupted and re-run, you will end up with multiple copies in the target workspace. To make retries safe: (1) check whether a Gen2 artifact with the intended name already exists before calling, or (2) include a timestamp in displayName and treat each run as a distinct artifact.
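Option (2) above can be sketched by baking a UTC timestamp into displayName when generating the request body (the base name here is an example):

```shell
# Unique display name per run, so a retried batch never collides
STAMP=$(date -u +%Y%m%dT%H%M%SZ)
jq -n --arg dn "SalesETL_Gen2CICD_${STAMP}" \
  '{displayName: $dn, includeSchedule: true}' > /tmp/save-as-body.json
cat /tmp/save-as-body.json
```

Using jq -n to build the body also sidesteps shell-quoting mistakes when the name contains user-supplied text.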
Gotcha — owner permissions: You must be the dataflow owner or have Contributor/Admin in the source workspace to call saveAsNativeArtifact. If you are only a Viewer, or a Workspace Member who does not own the dataflow, the API returns DataflowUnauthorizedError. Ask the dataflow owner or a workspace admin to run the save-as operation for those dataflows.
Request parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| displayName | string (max 200) | No | Name for the new artifact. Auto-generated with a _copy1 suffix if omitted |
| description | string (max 4000) | No | Description. Copied from the source if omitted |
| includeSchedule | boolean | No | Copy the refresh schedule in a disabled state |
| targetWorkspaceId | string (uuid) | No | Target workspace. Same workspace if omitted |
Response: 200 OK with SaveAsNativeDataflowResponse:
- artifactMetadata — full metadata of the new Gen2 CI/CD artifact (including objectId and provisionState)
- errors[] — non-fatal warning codes (save-as succeeds even if these occur):
  - FailedToCopySchedule — schedule could not be copied
  - SetDataflowOriginFailed — origin tracking not set
  - ConnectionsUpdateFailed — connection strings could not be updated to Fabric format

NOT YET AVAILABLE — This API is not available in the current public surface. This skill will be updated when the endpoint is published. Do not attempt to call a non-existent endpoint.

Run these checks after save-as before any Gen1 cleanup:
- Confirm artifactMetadata.provisionState reaches Active.
- Review errors[] in SaveAsNativeDataflowResponse and create follow-up tasks for each warning.

Agent best practices:
- Always pass --resource to az rest — use the correct audience per the API table above. Wrong audience = silent 401.
- Always pass --headers "Content-Type=application/json" on POST calls to the Power BI REST API.
- For saveAsNativeArtifact, pass --body @file.json instead of inline JSON. Inline --body '{...}' can cause az rest to wrap the payload in an extra envelope, producing "saveAsRequest is a required parameter" errors.
- On Windows, omit -o json on saveAsNativeArtifact calls — use ConvertFrom-Json in PowerShell or pipe to jq instead. The -o json flag fails with an "A value that is not valid (json)" error when routed through az.cmd.
- saveAsNativeArtifact returns DataflowUnauthorizedError for non-owners who are only Workspace Members or Viewers.
- saveAsNativeArtifact is not idempotent; interrupted batch runs create duplicate copies on retry. Either verify the target name is absent before calling, or use a unique timestamped displayName per run.
- Check errors[] in the saveAsNativeArtifact response — save-as may succeed with non-fatal warnings.
- Verify provisionState is Active after save-as — poll the artifact metadata until a terminal state.
- saveAsNativeArtifact leaves the Gen1 intact. Do not delete it until post-save-as validation passes.
- Prefer the generation property for Gen1 detection — the Power BI REST API returns generation: 1 or generation: 2 on each dataflow. Use this single-API-call approach over cross-API comparison.
- Use the modelUrl check as a secondary signal — dataflows with a modelUrl pointing to dfs.core.windows.net are Gen1 using customer storage (BYOSA blocker).
- Non-fatal errors such as FailedToCopySchedule and ConnectionsUpdateFailed require manual follow-up.
- Never use -o json with saveAsNativeArtifact on Windows — omit the flag; capture the raw output and parse it explicitly.

Prompt: "Scan my Sales workspace for Gen1 dataflows and show me a readiness snapshot"
Agent response:
# Step 1 — Resolve workspace ID
WS_ID=$(az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces" \
--query "value[?displayName=='Sales'].id" -o tsv)
# Step 2 — List all dataflows with generation property
ALL_DFS=$(az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/dataflows" -o json)
# Step 3 — Filter Gen1 dataflows by generation property
GEN1_DFS=$(echo "$ALL_DFS" | jq '[.value[] | select(.generation == 1) |
{id: .objectId, name: .name, generation: .generation, modelUrl: .modelUrl}]')
echo "Gen1 dataflows found: $(echo $GEN1_DFS | jq length)"
echo "$GEN1_DFS" | jq '.'
Prompt: "Migrate or upgrade the SalesETL Gen1 dataflow in my Sales workspace"
Agent response:
WS_ID="<workspaceId>"
GEN1_ID="<gen1DataflowId>"
# Write body to file (avoids az rest envelope wrapping)
cat > /tmp/save-as-body.json <<'EOF'
{"displayName": "SalesETL_Gen2CICD", "includeSchedule": true}
EOF
# Execute save-as
RESPONSE=$(az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/dataflows/$GEN1_ID/saveAsNativeArtifact" \
--headers "Content-Type=application/json" \
--body @/tmp/save-as-body.json)
# Check result
echo "$RESPONSE" | jq '{
newId: .artifactMetadata.objectId,
state: .artifactMetadata.provisionState,
errors: .errors
}'
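Polling until provisionState reaches Active can be sketched as below; get_state is a stand-in for re-fetching the artifact metadata via the API, and the canned three-poll schedule is invented:

```shell
# Stand-in for the metadata re-fetch; reports Active on the third poll
get_state() {
  if [ "$1" -ge 2 ]; then echo "Active"; else echo "Provisioning"; fi
}

ATTEMPT=0
while :; do
  STATE=$(get_state "$ATTEMPT")
  if [ "$STATE" = "Active" ]; then break; fi
  ATTEMPT=$((ATTEMPT + 1))
  # in practice: sleep between polls and enforce a max-attempts guard
done
echo "provisionState: $STATE after $ATTEMPT polls"
```

In a real run, add a delay between polls and abort after a bounded number of attempts so a stuck artifact does not hang the batch.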
Prompt: "Show me all Gen1 dataflows across the entire tenant"
Agent response:
# Requires Fabric administrator role
ALL_DFS=$(az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/admin/dataflows" -o json)
# Gen1 dataflows have a modelUrl
echo "$ALL_DFS" | jq '[.value[] | select(.modelUrl != null and .modelUrl != "") |
{id: .objectId, name: .name, workspace: .workspaceId,
owner: .configuredBy, storage: .modelUrl}]'