Manages Power BI semantic models in Fabric workspaces via az rest CLI: create from TMDL, download/update definitions, refresh datasets, configure permissions, deploy across pipelines.
npx claudepluginhub microsoft/skills-for-fabric --plugin fabric-consumption

This skill uses the workspace's default tool permissions.
> **Update Check — ONCE PER SESSION (mandatory)**
> The first time this skill is used in a session, run the check-updates skill before proceeding.
> - GitHub Copilot CLI / VS Code: invoke the check-updates skill.
> - Claude Code / Cowork / Cursor / Windsurf / Codex: compare local vs remote package.json version.
> - Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find workspace details (including its ID) from a workspace name: list all workspaces, then filter with JMESPath.
- To find item details (including its ID) from a workspace ID, item type, and item name: list all items of that type in the workspace, then filter with JMESPath.
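The list-then-filter pattern can be sketched with jq against a canned response; the workspace names and IDs below are invented for illustration, and in practice the JSON comes from the az rest call in COMMON-CLI.md:

```shell
# Canned response matching the GET /v1/workspaces shape. In real use:
#   az rest --method get --resource "https://api.fabric.microsoft.com" \
#     --url "https://api.fabric.microsoft.com/v1/workspaces"
cat > /tmp/workspaces.json << 'EOF'
{"value": [
  {"id": "11111111-aaaa-4bbb-8ccc-000000000001", "displayName": "Sales Analytics"},
  {"id": "11111111-aaaa-4bbb-8ccc-000000000002", "displayName": "Finance"}
]}
EOF
# Resolve workspace name -> id
WS_ID=$(jq -r --arg name "Sales Analytics" \
  '.value[] | select(.displayName == $name) | .id' /tmp/workspaces.json)
echo "$WS_ID"
```

With az rest itself, the global --query flag takes the equivalent JMESPath filter: `--query "value[?displayName=='Sales Analytics'].id | [0]" --output tsv`.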
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — READ link first [needed for finding workspace id by its name or item id by its name, item type, and workspace id] |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | Hierarchy; Finding Things in Fabric |
| Environment URLs | COMMON-CORE.md § Environment URLs | Production (Public Cloud) |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | az login flows, environment detection, token acquisition, and debugging |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § Fabric Control-Plane API via az rest | Always pass --resource; includes workspace/item operations, pagination, and LRO patterns |
| OneLake Data Access via curl | COMMON-CLI.md § OneLake Data Access via curl | Use curl not az rest (different token audience) |
| Job Execution (CLI) | COMMON-CLI.md § Job Execution | Run notebooks/pipelines, refresh semantic models, check/cancel jobs |
| OneLake Shortcuts | COMMON-CLI.md § OneLake Shortcuts | Create a Shortcut; List Shortcuts; Delete a Shortcut |
| Capacity Management (CLI) | COMMON-CLI.md § Capacity Management | List Capacities; Assign Workspace to Capacity |
| Composite Recipes | COMMON-CLI.md § Composite Recipes | End-to-end workspace→lakehouse→file, SQL endpoint→query, and notebook execution recipes |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | az rest audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | az rest Template; Token Audience ↔ CLI Tool Matrix |
| DAX Queries & Metadata Discovery | powerbi-consumption-cli | Read-only DAX queries; use for post-creation validation |
| Tool Stack | SKILL.md § Tool Stack | az rest (primary), jq (JSON parsing), base64 encoding |
| Authentication & API Audiences | SKILL.md § Authentication & API Audiences | Two audiences: Fabric API vs Power BI Datasets API |
| Must/Prefer/Avoid | SKILL.md § Must/Prefer/Avoid | Guardrails for semantic model authoring |
| SemanticModel Definition & Envelope | ITEM-DEFINITIONS-CORE.md § SemanticModel | TMDL format; required parts, envelope structure, support matrix |
| TMDL File Structure & Examples | SKILL.md § TMDL File Structure | Required parts, minimal content examples |
| TMDL CRUD (Create / Get / Update) | SKILL.md § Create Semantic Model | Create → Get/Download → Update; full lifecycle with LRO |
| Authoring Scope Matrix | SKILL.md § Authoring Scope Matrix | What Fabric API supports vs what to avoid |
| Refresh Operations | SKILL.md § Refresh Operations | Trigger, cancel, history, schedule (Power BI API) |
| Data Sources & Parameters | SKILL.md § Data Sources & Parameters | Get/update data sources and parameters |
| Permissions | SKILL.md § Permissions | Grant/update dataset user permissions |
| Deployment Pipelines | SKILL.md § Deployment Pipelines | List, get stages, deploy between stages |
| Agentic Workflow | SKILL.md § Agentic Workflow | Step-by-step: discover → create → verify → refresh → validate |
| Troubleshooting | SKILL.md § Troubleshooting | Common errors table: LRO, auth, TMDL encoding, refresh |
| Examples | SKILL.md § Examples | Create model, download definition, refresh, deploy |
| Property-to-API Mapping | semantic-model-properties-guide.md § Property-to-API Mapping | Maps each property category to the correct API surface |
| Owner, Storage Mode & Operational Metadata | semantic-model-properties-guide.md § Owner, Storage Mode | Power BI Datasets API properties |
| Refresh History Response Properties | semantic-model-properties-guide.md § Refresh History | Refresh detail response fields |
| Data Source Response Properties | semantic-model-properties-guide.md § Data Sources | Connection and gateway properties |
| DirectQuery / LiveConnection Refresh Schedule | semantic-model-properties-guide.md § DQ Refresh Schedule | DirectQuery/LiveConnection schedule settings |
| Upstream Dataflow Links | semantic-model-properties-guide.md § Upstream Dataflows | Dataflow dependency properties |
| Per-Table Storage Mode | semantic-model-properties-guide.md § Per-Table Storage | Table-level storage mode via TMDL |
| TMDL Syntax Rules | tmdl-authoring-guide.md § TMDL Syntax Rules | Tab indentation, object declaration, quoting rules |
| Modeling Best Practices | tmdl-authoring-guide.md § Modeling Best Practices | Naming conventions, column rules, measure & DAX rules, format strings |
| Relationships | tmdl-authoring-guide.md § Relationships | Relationship declarations, key rules |
| Hierarchies | tmdl-authoring-guide.md § Hierarchies | Hierarchy declarations and key rules |
| Direct Lake Guidelines | tmdl-authoring-guide.md § Direct Lake Guidelines | Direct Lake mode configuration and constraints |
| Calculated Tables | tmdl-authoring-guide.md § Calculated Tables | DAX-based calculated table definitions |
| Date/Calendar Table | tmdl-authoring-guide.md § Date/Calendar Table | Calendar table setup and marking |
| Parameters | tmdl-authoring-guide.md § Parameters | Expression-based parameter declarations |
| Annotations | tmdl-authoring-guide.md § Annotations | Model and object-level annotations |
| TMDL File Layout & Core Files | tmdl-advanced-features-guide.md § File Layout | Directory structure, database.tmdl, model.tmdl |
| Calculation Groups | tmdl-advanced-features-guide.md § Calculation Groups | Calculation group tables and items |
| Security Roles | tmdl-advanced-features-guide.md § Security Roles | RLS/OLS role definitions |
| Security Role Memberships | SKILL.md § Security Role Memberships | Add/list/delete users & groups in RLS roles (Power BI API) |
| Translations / Cultures | tmdl-advanced-features-guide.md § Translations / Cultures | Localization via culture files |
| Perspectives | tmdl-advanced-features-guide.md § Perspectives | Perspective definitions for subset views |
| Functions | tmdl-advanced-features-guide.md § Functions | User-defined DAX functions in the model |
| Calendar Objects | tmdl-advanced-features-guide.md § Calendar Objects | Auto date/time calendar table objects |
| Tool | Role | Install |
|---|---|---|
| az CLI | Primary: az rest for Fabric and Power BI REST API calls, az login for auth | Pre-installed in most dev environments |
| jq | Parse JSON from az rest responses | Pre-installed or trivial |
| base64 (Linux/macOS) / [Convert]::ToBase64String (PowerShell) | Encode TMDL file content for definition payloads | Built-in |
Agent check — verify before first operation:
az version 2>/dev/null || echo "INSTALL: https://learn.microsoft.com/cli/azure/install-azure-cli"
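Since every definition part travels base64-encoded, a quick local round-trip check catches wrapping problems before any API call; a minimal sketch:

```shell
# Sanity-check the payload encoding before building a definition body.
# base64 -w 0 (no line wrapping) is GNU coreutils; macOS base64 emits a
# single line by default and has no -w flag.
printf 'database\n\tcompatibilityLevel: 1702\n' > /tmp/sample.tmdl
PAYLOAD=$(base64 -w 0 < /tmp/sample.tmdl)
echo "$PAYLOAD" | base64 -d | diff -q - /tmp/sample.tmdl && echo "round-trip OK"
```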
This skill uses two distinct API audiences. Using the wrong audience returns a 401.
| API | Audience (--resource) | Use For |
|---|---|---|
| Fabric Items API | https://api.fabric.microsoft.com | Create/get/update/delete semantic model definitions, list items, LRO polling |
| Power BI Datasets API | https://analysis.windows.net/powerbi/api | Refresh, data sources, parameters, permissions, deployment pipelines |
# Fabric Items API — semantic model definition operations
az rest --method post \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
...
# Power BI Datasets API — refresh, data sources, permissions
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "https://api.powerbi.com/v1.0/myorg/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
...
**Must**
- Always pass `--resource` to `az rest` — omitting it causes silent auth failures. Use the correct audience per the table above.
- Pass `--headers "Content-Type=application/json"` on POST/PATCH/PUT calls with a `--body` to the Power BI Datasets API — omitting it causes Unsupported Media Type errors.
- Include ALL parts in `updateDefinition` — modified + unmodified. The API replaces the entire definition; omitting parts deletes them.
- Never include `.platform` in `updateDefinition` payloads — it is Git integration metadata and causes errors.
- Treat `createItemWithDefinition`, `getDefinition`, and `updateDefinition` as long-running: they return 202 Accepted with an Operation-Id header. Poll until terminal state.
- `payload` values in definition parts must be base64-encoded.
- Object names containing `.`, `=`, `:`, or `'` must be wrapped in single quotes in TMDL.
- Confirm the workspace is on a capacity: `GET /v1/workspaces/{id}` and check `capacityId`.

**Prefer**
- `createItemWithDefinition` (single POST) over create-then-update for new semantic models.
- `powerbi-modeling-mcp` — for adding/modifying individual measures, columns, or relationships, the MCP server is more efficient than full definition round-trips.
- `powerbi-consumption-cli` for post-creation validation — run DAX queries to verify measures, relationships, and data.

**Avoid**
- `updateDefinition` for small changes — a full definition round-trip is heavy; route to powerbi-modeling-mcp for individual object edits.
- Setting `lineageTag` on new objects — TMDL auto-generates lineage tags; adding them manually causes conflicts.
- `//` comments in TMDL — not supported. Use `///` descriptions instead.
- The `description` property in TMDL — use `///` syntax above the object instead.
- Partial-part `updateDefinition` — the API replaces the full definition; missing parts are deleted.

For the full definition envelope and part paths, see ITEM-DEFINITIONS-CORE.md § SemanticModel.
Required TMDL parts for createItemWithDefinition and updateDefinition:
| Part Path | Content | Required |
|---|---|---|
| definition.pbism | Semantic model connection settings | Yes |
| definition/database.tmdl | Database properties (compatibility level) | Yes |
| definition/model.tmdl | Model properties (culture, default summarization) | Yes |
| definition/tables/<TableName>.tmdl | Per-table: columns, measures, partitions | Yes (≥1) |
> **Critical:** `updateDefinition` must include ALL parts — modified and unmodified. The API replaces the entire definition. Never include `.platform` in update payloads.
For TMDL syntax rules, naming conventions, and modeling best practices, see tmdl-authoring-guide.md.
{
"version": "4.2",
"settings": {
"qnaEnabled": true
}
}
database
compatibilityLevel: 1702
compatibilityMode: powerBI
model Model
culture: en-US
defaultPowerBIDataSourceVersion: powerBI_V3
discourageImplicitMeasures
> **Note:** `defaultPowerBIDataSourceVersion: powerBI_V3` is required for Import-mode models. Without it, the API returns `Import from JSON supported for V3 models only`.
table Customer
/// Total number of customers
measure '# Customers' = COUNTROWS(Customer)
formatString: #,##0
column CustomerId
dataType: int64
isHidden
isKey
summarizeBy: none
sourceColumn: CustomerId
column 'Customer Name'
dataType: string
sourceColumn: CustomerName
partition Customer = m
mode: import
source =
let
Source = Sql.Database(#"Server", #"Database"),
Customer = Source{[Schema="dbo", Item="Customer"]}[Data]
in
Customer
expression DL_Lakehouse =
let
Source = AzureStorage.DataLake("https://onelake.dfs.fabric.microsoft.com/<WorkspaceId>/<LakehouseId>", [HierarchicalNavigation=true])
in
Source
table Sales
/// Total revenue
measure 'Total Sales' = ```
SUMX(
Sales,
Sales[Quantity] * Sales[UnitPrice]
)
```
formatString: \$#,##0.00
column SalesKey
dataType: int64
isHidden
isKey
summarizeBy: none
sourceColumn: sales_key
column Quantity
dataType: int64
sourceColumn: quantity
column UnitPrice
dataType: decimal
summarizeBy: none
sourceColumn: unit_price
partition Sales = entity
mode: directLake
source
entityName: Sales
schemaName: dbo
expressionSource: DL_Lakehouse
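The part layout above can be scaffolded on disk before encoding. A minimal sketch — the file contents are abbreviated from the examples above (a real table part also needs its partition), and TMDL indentation must be literal tabs, hence printf '\t':

```shell
# Scaffold the required definition parts on disk.
mkdir -p definition/tables
cat > definition.pbism << 'EOF'
{
  "version": "4.2",
  "settings": {}
}
EOF
# TMDL requires tab indentation; printf '\t' writes literal tabs
printf 'database\n\tcompatibilityLevel: 1702\n\tcompatibilityMode: powerBI\n' \
  > definition/database.tmdl
printf 'model Model\n\tculture: en-US\n\tdefaultPowerBIDataSourceVersion: powerBI_V3\n' \
  > definition/model.tmdl
printf 'table Customer\n\tcolumn CustomerId\n\t\tdataType: int64\n\t\tsourceColumn: CustomerId\n' \
  > definition/tables/Customer.tmdl
ls definition.pbism definition/database.tmdl definition/model.tmdl definition/tables/Customer.tmdl
```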
Full lifecycle: author TMDL → base64-encode → construct payload → POST → poll LRO.
Per COMMON-CLI.md § Item CRUD Operations and ITEM-DEFINITIONS-CORE.md § Definition Envelope:
WS_ID="<workspaceId>"
# 1. Base64-encode each TMDL file
PBISM=$(base64 -w 0 < definition.pbism)
DB=$(base64 -w 0 < definition/database.tmdl)
MODEL=$(base64 -w 0 < definition/model.tmdl)
TABLE=$(base64 -w 0 < definition/tables/Customer.tmdl)
# 2. Construct payload and create — use --verbose to capture HTTP status and LRO headers
cat > /tmp/body.json << EOF
{
"displayName": "MySalesModel",
"definition": {
"format": "TMDL",
"parts": [
{"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
{"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
{"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
{"path": "definition/tables/Customer.tmdl", "payload": "$TABLE", "payloadType": "InlineBase64"}
]
}
}
EOF
az rest --method post --verbose \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
> **PowerShell** — use `[Convert]::ToBase64String([System.IO.File]::ReadAllBytes("file"))` instead of `base64 -w 0`.
If the response is 202 Accepted, poll using the LRO pattern from COMMON-CLI.md § Long-Running Operations.
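The poll loop can be sketched as a small function. The status-fetching command is parameterized here (an assumption made for testability); in real use it is an az rest GET against the operation URL, as in the comment:

```shell
# Poll an LRO until it reaches a terminal state.
# $1: a command that prints the current status; in real use something like:
#   az rest --method get --resource "https://api.fabric.microsoft.com" \
#     --url "https://api.fabric.microsoft.com/v1/operations/$OPERATION_ID" \
#     --query status --output tsv
poll_lro() {
  local fetch="$1" status
  while :; do
    status=$($fetch)
    case "$status" in
      Succeeded) echo "Succeeded"; return 0 ;;
      Failed)    echo "Failed";    return 1 ;;
      *)         sleep 1 ;;  # use the Retry-After header value in real use
    esac
  done
}
```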
Retrieve TMDL definition for backup, migration, or inspection. getDefinition is a POST (not GET).
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"
# 1. Request definition — may return 200 (inline) or 202 (LRO)
RESPONSE=$(az rest --method post --verbose \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/getDefinition?format=TMDL" \
--body '{}' \
--output json 2>/dev/null)
# 2. If 202, poll the Location header URL until Succeeded, then GET /result
# 3. Decode each part
echo "$RESPONSE" | jq -r '.definition.parts[] | .path + " " + .payload' | \
while read -r path payload; do
mkdir -p "$(dirname "$path")"
echo "$payload" | base64 -d > "$path"
done
> **Critical rules:** Must include ALL parts (modified + unmodified). Never include `.platform`. The API replaces the entire definition — omitted parts are deleted.
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"
# 1. Get current definition (see Get/Download Definition above)
# 2. Modify the relevant TMDL files
# 3. Re-encode ALL parts and POST
cat > /tmp/body.json << EOF
{
"definition": {
"format": "TMDL",
"parts": [
{"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
{"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
{"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
{"path": "definition/tables/Customer.tmdl", "payload": "$TABLE", "payloadType": "InlineBase64"}
]
}
}
EOF
az rest --method post \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/updateDefinition" \
--body @/tmp/body.json
Use the `?updateMetadata=true` query parameter only when the `.platform` file must be included to update display name or description via the definition.
| Operation | Supported | Method |
|---|---|---|
| Create semantic model with TMDL | ✅ | POST /v1/workspaces/{id}/semanticModels with definition |
| Get/download TMDL definition | ✅ | POST .../semanticModels/{id}/getDefinition?format=TMDL |
| Update full TMDL definition | ✅ | POST .../semanticModels/{id}/updateDefinition |
| Delete semantic model | ✅ | DELETE /v1/workspaces/{id}/semanticModels/{id} |
| Refresh dataset | ✅ | Power BI Datasets API (Phase 4) |
| Add/modify single measure or column | ⚠️ Route to powerbi-modeling-mcp | Full definition round-trip is inefficient |
| Create reports | ❌ | Not in scope — separate definition format (PBIR) |
All refresh operations use the Power BI Datasets API audience (https://analysis.windows.net/powerbi/api).
WS_ID="<workspaceId>"
DATASET_ID="<semanticModelId>"
PBI="https://api.powerbi.com/v1.0/myorg"
# Trigger full refresh
cat > /tmp/body.json << 'EOF'
{"notifyOption": "NoNotification"}
EOF
az rest --method post --verbose \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
# Get refresh history (latest first)
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes?\$top=5"
# Cancel an in-progress refresh
az rest --method delete \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes/<refreshId>"
# Get refresh schedule
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshSchedule"
# Update refresh schedule
cat > /tmp/body.json << 'EOF'
{
"value": {
"enabled": true,
"days": ["Monday", "Wednesday", "Friday"],
"times": ["02:00", "14:00"],
"localTimeZoneId": "UTC"
}
}
EOF
az rest --method patch \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshSchedule" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
# Get data sources for a dataset
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/datasources"
# Get parameters
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/parameters"
# Update parameters
cat > /tmp/body.json << 'EOF'
{
"updateDetails": [
{"name": "Server", "newValue": "newserver.database.windows.net"},
{"name": "Database", "newValue": "ProductionDB"}
]
}
EOF
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/Default.UpdateParameters" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
After updating parameters or data source credentials, trigger a refresh for changes to take effect.
# List dataset users
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users"
# Grant dataset permissions to a user
cat > /tmp/body.json << 'EOF'
{
"identifier": "user@contoso.com",
"principalType": "User",
"datasetUserAccessRight": "Read"
}
EOF
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
# Update existing user permissions
cat > /tmp/body.json << 'EOF'
{
"identifier": "user@contoso.com",
"principalType": "User",
"datasetUserAccessRight": "ReadReshare"
}
EOF
az rest --method put \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
Permission levels: Read, ReadReshare, ReadExplore, ReadReshareExplore.
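Filtering the /users response can be sketched with jq against a canned payload (the identifiers below are invented; the real payload comes from the list-users GET above):

```shell
# Canned response matching the GET .../datasets/{id}/users shape.
cat > /tmp/dataset_users.json << 'EOF'
{"value": [
  {"identifier": "user@contoso.com",  "principalType": "User", "datasetUserAccessRight": "Read"},
  {"identifier": "admin@contoso.com", "principalType": "User", "datasetUserAccessRight": "ReadReshareExplore"}
]}
EOF
# Who can reshare the dataset? Any access right containing "Reshare".
CAN_RESHARE=$(jq -r '.value[]
  | select(.datasetUserAccessRight | contains("Reshare"))
  | .identifier' /tmp/dataset_users.json)
echo "$CAN_RESHARE"
```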
After defining RLS/OLS roles in TMDL (see Security Roles), use the Power BI Datasets API to assign users and groups to those roles.
PBI="https://api.powerbi.com/v1.0/myorg"
# List members of the SalesRegion security role
az rest --method get \
  --resource "https://analysis.windows.net/powerbi/api" \
  --url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
  | jq '[.value[] | select((.roles // []) | index("SalesRegion"))]'
# Add a user to a security role
cat > /tmp/body.json << 'EOF'
{
"identifier": "user@contoso.com",
"principalType": "User",
"datasetUserAccessRight": "Read",
"roles": ["SalesRegion"]
}
EOF
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
# Add a security group to a role
cat > /tmp/body.json << 'EOF'
{
"identifier": "<group-object-id>",
"principalType": "Group",
"datasetUserAccessRight": "Read",
"roles": ["SalesRegion"]
}
EOF
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
# Update role membership (e.g., move user to a different role)
cat > /tmp/body.json << 'EOF'
{
"identifier": "user@contoso.com",
"principalType": "User",
"datasetUserAccessRight": "Read",
"roles": ["EuropeOnly"]
}
EOF
az rest --method put \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/users" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
> The `roles` array accepts one or more role names that must match roles defined in the semantic model's TMDL. The user/group must also have at least `Read` permission on the dataset. `principalType` can be `User`, `Group`, or `App`.
Deployment pipelines use the Fabric API audience (https://api.fabric.microsoft.com).
FABRIC="https://api.fabric.microsoft.com/v1"
# List deployment pipelines
az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "$FABRIC/deploymentPipelines"
# Get pipeline stages
az rest --method get \
--resource "https://api.fabric.microsoft.com" \
--url "$FABRIC/deploymentPipelines/<pipelineId>/stages"
# Deploy from one stage to the next (e.g., Dev → Test)
cat > /tmp/body.json << 'EOF'
{
"sourceStageOrder": 0,
"targetStageOrder": 1,
"items": [
{
"sourceItemId": "<semanticModelId>",
"itemType": "SemanticModel"
}
],
"options": {
"allowCreateArtifact": true,
"allowOverwriteArtifact": true
}
}
EOF
az rest --method post \
--resource "https://api.fabric.microsoft.com" \
--url "$FABRIC/deploymentPipelines/<pipelineId>/deploy" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
> Omit the `items` array to deploy all items in the stage. The deploy call returns `202 Accepted` — poll using the LRO pattern.
- If `powerbi-modeling-mcp` is available → use MCP tools for fine-grained object changes (measures, columns, relationships); otherwise fall back to `az rest` `updateDefinition`.
- Discover: `GET /v1/workspaces/{id}/semanticModels` to find existing models or confirm name availability.
- Author `definition.pbism`, `database.tmdl`, `model.tmdl`, and table files per the Minimal TMDL Content Examples and tmdl-authoring-guide.md.
- Set `formatString` for all aggregatable values.
- Validate: run `EVALUATE { [Measure Name] }` for each measure via DAX.
- Check relationships declare matching `dataType` on both sides.
- Verify `sourceColumn` mappings and `dataType` match the source schema.

> **Early-abort rule:** If both `getDefinition` returns `404 EntityNotFound` (on an item you can list/GET) and the Power BI refresh API returns `403 Forbidden` with `"identity None"`, stop retrying immediately — the user almost certainly has only Viewer role on the workspace. Verify by calling `GET /v1/workspaces/{id}/roleAssignments`; if that also returns `403 InsufficientWorkspaceRole`, confirm to the user they need Contributor or higher role. Do not retry with different URL formats, endpoints, or parameters — the issue is permissions, not API usage.
| Symptom | Cause | Fix |
|---|---|---|
| 403 Forbidden with "identity None" on Power BI API | User has Viewer role — refresh, data sources, and permissions APIs require Contributor+ | Stop immediately. Ask user to request Contributor/Member/Admin role on the workspace |
| 404 EntityNotFound on getDefinition but item exists in list | Insufficient permissions masquerading as 404 — getDefinition requires Contributor+ | Check workspace role first; do not retry with different URL formats |
| 403 InsufficientWorkspaceRole on roleAssignments | User is Viewer on the workspace | Confirms Viewer role — all authoring and most read operations are blocked |
| 401 Unauthorized on Fabric API | Wrong or missing --resource | Use --resource "https://api.fabric.microsoft.com" |
| 401 Unauthorized on Power BI API | Wrong audience | Use --resource "https://analysis.windows.net/powerbi/api" |
| 411 Length Required on getDefinition | Missing request body | Pass --body '{}' — getDefinition is a POST |
| LRO poll never completes | Token expired during long operation | Re-acquire token in poll loop; increase Retry-After interval |
| 202 Accepted but no result | Didn't follow LRO to completion | Poll Location header URL until Succeeded, then GET /result |
| TMDL validation error on create/update | Syntax error in TMDL content | Check TMDL rules in tmdl-authoring-guide.md; validate before encoding |
| Parts missing after updateDefinition | Only modified parts were sent | Must include ALL parts (modified + unmodified) in every update |
| Error including .platform in update | .platform not accepted by default | Remove .platform from parts, or use ?updateMetadata=true |
| Base64 decode produces garbled content | Wrong encoding or line wrapping | Use base64 -w 0 (no line wrap) or [Convert]::ToBase64String() |
| Refresh fails with data source error | Credentials expired or parameters wrong | Check data sources and parameters; update credentials if needed |
| Deployment pipeline fails | Workspace not assigned to stage | Assign workspace to pipeline stage before deploying |
| lineageTag conflict on new objects | Manually added lineageTag | Remove lineageTag from new objects — it is auto-generated |
| DAX error testing measures | Measure name case mismatch | DAX measure names are case-sensitive; match exactly |
| Attempting INFO.ROLES() / INFO.ROLEMEMBERSHIPS() via DAX to retrieve role members | DAX INFO functions do not reliably return role membership data and may return empty or incomplete results | Use the Power BI REST API instead: GET /v1.0/myorg/groups/{workspaceId}/datasets/{datasetId}/users and filter by the roles field (see Security Role Memberships) |
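The REST-based role lookup from the last row can be sketched locally against a canned /users response (names invented); the real payload comes from the GET call shown under Security Role Memberships:

```shell
# Canned /users response: one member with roles, one without.
cat > /tmp/rls_users.json << 'EOF'
{"value": [
  {"identifier": "user@contoso.com",   "principalType": "User", "datasetUserAccessRight": "Read", "roles": ["SalesRegion"]},
  {"identifier": "viewer@contoso.com", "principalType": "User", "datasetUserAccessRight": "Read"}
]}
EOF
# Members of the SalesRegion RLS role; `.roles // []` guards entries
# that have no roles field at all.
MEMBERS=$(jq -r '.value[]
  | select((.roles // []) | index("SalesRegion"))
  | .identifier' /tmp/rls_users.json)
echo "$MEMBERS"
```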
WS_ID="<workspaceId>"
# Encode all TMDL files
PBISM=$(base64 -w 0 < definition.pbism)
DB=$(base64 -w 0 < definition/database.tmdl)
MODEL=$(base64 -w 0 < definition/model.tmdl)
CUSTOMER=$(base64 -w 0 < definition/tables/Customer.tmdl)
SALES=$(base64 -w 0 < definition/tables/Sales.tmdl)
cat > /tmp/body.json << EOF
{
"displayName": "SalesModel",
"definition": {
"parts": [
{"path": "definition.pbism", "payload": "$PBISM", "payloadType": "InlineBase64"},
{"path": "definition/database.tmdl", "payload": "$DB", "payloadType": "InlineBase64"},
{"path": "definition/model.tmdl", "payload": "$MODEL", "payloadType": "InlineBase64"},
{"path": "definition/tables/Customer.tmdl", "payload": "$CUSTOMER", "payloadType": "InlineBase64"},
{"path": "definition/tables/Sales.tmdl", "payload": "$SALES", "payloadType": "InlineBase64"}
]
}
}
EOF
az rest --method post --verbose \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels" \
--headers "Content-Type=application/json" \
--body @/tmp/body.json
WS_ID="<workspaceId>"
MODEL_ID="<semanticModelId>"
# Get definition (may return 202 — follow LRO)
RESULT=$(az rest --method post \
--resource "https://api.fabric.microsoft.com" \
--url "https://api.fabric.microsoft.com/v1/workspaces/$WS_ID/semanticModels/$MODEL_ID/getDefinition?format=TMDL" \
--body '{}' --output json)
# Decode and save all parts
echo "$RESULT" | jq -r '.definition.parts[] | .path + "\t" + .payload' | \
while IFS=$'\t' read -r path payload; do
mkdir -p "$(dirname "$path")"
echo "$payload" | base64 -d > "$path"
echo "Saved: $path"
done
WS_ID="<workspaceId>"
DATASET_ID="<semanticModelId>"
PBI="https://api.powerbi.com/v1.0/myorg"
# Trigger refresh
cat > /tmp/body.json << 'EOF'
{"type": "Full"}
EOF
az rest --method post \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes" \
--body @/tmp/body.json
# Check latest refresh status
az rest --method get \
--resource "https://analysis.windows.net/powerbi/api" \
--url "$PBI/groups/$WS_ID/datasets/$DATASET_ID/refreshes?\$top=1"
FABRIC="https://api.fabric.microsoft.com/v1"
PIPELINE_ID="<pipelineId>"
# Deploy from Test (stage 1) to Production (stage 2)
cat > /tmp/body.json << 'EOF'
{
"sourceStageOrder": 1,
"targetStageOrder": 2,
"items": [
{"sourceItemId": "<semanticModelId>", "itemType": "SemanticModel"}
],
"options": {
"allowCreateArtifact": true,
"allowOverwriteArtifact": true
}
}
EOF
az rest --method post \
--resource "https://api.fabric.microsoft.com" \
--url "$FABRIC/deploymentPipelines/$PIPELINE_ID/deploy" \
--body @/tmp/body.json