From the pbi-desktop plugin
Connects to Power BI Desktop's local Analysis Services instance via PowerShell using TOM and ADOMD.NET for model enumeration, DAX queries, metadata changes, and query tracing.
npx claudepluginhub data-goblin/power-bi-agentic-development --plugin pbi-desktop
This skill uses the workspace's default tool permissions.
> **CRITICAL:** Record mistakes, surprises, and model-specific nuances encountered while using this skill in `.claude/rules/connect-pbid.md`. This file must begin with "Learnings from Claude about connecting to semantic models via the connect-pbid skill". Write only active reference notes (e.g. "QueryGroup property returns an object; access .Folder for the name string"); do not log a changelog or history of events. Omit anything already documented in the skill or its references. Keep the file under 1500 characters at all times; prune stale entries when adding new ones. Do not over-attend to this file; update it only when something genuinely unexpected is discovered.
References: references/annotations.md, references/calendar-column-groups.md, references/dax-expressions.md, references/dax-pitfalls.md, references/export-model.md, references/load-tmdl-files.md, references/parallels-macos.md, references/query-listener.md, references/refresh-model.md, references/tom-object-types.md, references/vertipaq-stats.md
Scripts: scripts/connect-and-enumerate.ps1, scripts/connect-from-mac.sh, scripts/create-field-parameter.ps1, scripts/explore-model.ps1, scripts/load-tmdl.ps1, scripts/modify-tom-objects.ps1, scripts/query-dax.ps1, scripts/refresh-table.ps1
Note: No MCP server required; do not use this skill with MCP servers or CLI tools. Use this skill to execute PowerShell commands directly via Bash to connect to Power BI Desktop's local Analysis Services instance.
Expert guidance for connecting to Power BI Desktop's local tabular model via the Tabular Object Model (TOM) and ADOMD.NET in PowerShell. Covers connection, enumeration, DAX queries, query traces, and full model modification.
Activate only when the Tabular Editor CLI or a Power BI MCP server is unavailable. TOM is more reliable than direct TMDL editing because it validates changes against the engine and applies them atomically.
WARNING: This skill does NOT support remote models via the XMLA endpoint.
The model is hosted on localhost; each open Power BI Desktop report runs its own msmdsrv.exe process on a different port. Connect to each port, read $server.Databases[0].Name, and ask the user which model to work with if more than one is found.
Use single quotes around the -Command argument so Bash does not interpret $env:TEMP, $server, etc. as shell variables. Double quotes cause $ variables to be eaten by Bash before PowerShell sees them:
# Wrong -- Bash eats $env:TEMP, PowerShell gets empty string
powershell -Command "$pkgDir = $env:TEMP\tom_nuget"
# Correct -- single quotes pass $env:TEMP literally to PowerShell
powershell -Command '$pkgDir = "$env:TEMP\tom_nuget"'
For complex scripts, write to a .ps1 file and execute with -File instead of -Command to avoid escaping issues entirely. Only create script files for repeated operations; for one-off queries or modifications, use powershell -ExecutionPolicy Bypass -Command '...' directly.
Always pass -ExecutionPolicy Bypass when running PowerShell commands or scripts; Windows blocks unsigned scripts by default.
Write reusable scripts to the harness's script directory (.claude/scripts/, .github/scripts/, .cursor/scripts/, .gemini/scripts/, etc., depending on the harness). Ephemeral or throwaway scripts should go in a project tmp/ directory (which should be .gitignored). Do not write scripts to ./ root or /tmp/.
Call $model.SaveChanges() to persist modifications; without it, changes are discarded.
| Requirement | Description |
|---|---|
| Power BI Desktop | Open with a model loaded (.pbix or .pbip) |
| PowerShell | Available on the machine running PBI Desktop |
| NuGet CLI | For package installation (winget install Microsoft.NuGet) |
| TOM NuGet Package | Microsoft.AnalysisServices.retail.amd64 -- model metadata |
| ADOMD.NET Package | Microsoft.AnalysisServices.AdomdClient.retail.amd64 -- DAX queries |
Install both packages only if not already present:
$pkgDir = "$env:TEMP\tom_nuget"
if (-not (Test-Path "$pkgDir\Microsoft.AnalysisServices.retail.amd64")) {
nuget install Microsoft.AnalysisServices.retail.amd64 -OutputDirectory $pkgDir -ExcludeVersion
}
if (-not (Test-Path "$pkgDir\Microsoft.AnalysisServices.AdomdClient.retail.amd64")) {
nuget install Microsoft.AnalysisServices.AdomdClient.retail.amd64 -OutputDirectory $pkgDir -ExcludeVersion
}
Packages install DLLs under lib\net45\. Load with Add-Type -Path.
Find the port, load TOM, connect, enumerate -- in one script:
# Find port
$pids = (Get-Process msmdsrv -ErrorAction SilentlyContinue).Id
$ports = netstat -ano | Select-String "LISTENING" |
Where-Object { $pids -contains ($_ -split "\s+")[-1] } |
ForEach-Object { ($_ -split "\s+")[2] -replace ".*:" }
# Load TOM
$basePath = "$env:TEMP\tom_nuget\Microsoft.AnalysisServices.retail.amd64\lib\net45"
Add-Type -Path "$basePath\Microsoft.AnalysisServices.Core.dll"
Add-Type -Path "$basePath\Microsoft.AnalysisServices.Tabular.dll"
# Connect to first port
$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect("Data Source=localhost:$($ports[0])")
$model = $server.Databases[0].Model
# Enumerate
foreach ($table in $model.Tables) {
Write-Output "TABLE: [$($table.Name)] ($($table.Columns.Count) cols, $($table.Measures.Count) measures)"
}
Write-Output "Relationships: $($model.Relationships.Count)"
$server.Disconnect()
Port discovery methods:
| Method | Install Type | Command |
|---|---|---|
| Port file | Non-Store PBI Desktop | Get-Content "$env:LOCALAPPDATA\Microsoft\Power BI Desktop\AnalysisServicesWorkspaces\*\Data\msmdsrv.port.txt" |
| Port file | Store PBI Desktop | Get-Content "$env:LOCALAPPDATA\Packages\Microsoft.MicrosoftPowerBIDesktop_*\LocalState\AnalysisServicesWorkspaces\*\Data\msmdsrv.port.txt" |
| netstat | Any | netstat -ano \| findstr LISTENING \| findstr <PID> |
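As a sketch, the port-file and netstat methods from the table above can be combined into one fallback chain. The wildcard paths are taken verbatim from the table; everything else is a plain composition of the commands already shown, not a tested utility:

```powershell
# Try the non-Store and Store port files first, then fall back to netstat.
$portFiles = @(
    "$env:LOCALAPPDATA\Microsoft\Power BI Desktop\AnalysisServicesWorkspaces\*\Data\msmdsrv.port.txt",
    "$env:LOCALAPPDATA\Packages\Microsoft.MicrosoftPowerBIDesktop_*\LocalState\AnalysisServicesWorkspaces\*\Data\msmdsrv.port.txt"
)
$ports = foreach ($pattern in $portFiles) {
    Get-ChildItem $pattern -ErrorAction SilentlyContinue |
        ForEach-Object { (Get-Content $_.FullName).Trim() }  # file contains only the port number
}
if (-not $ports) {
    # Fallback: match LISTENING sockets against msmdsrv.exe PIDs (same parse as section 2)
    $pids = (Get-Process msmdsrv -ErrorAction SilentlyContinue).Id
    $ports = netstat -ano | Select-String "LISTENING" |
        Where-Object { $pids -contains ($_ -split "\s+")[-1] } |
        ForEach-Object { ($_ -split "\s+")[2] -replace ".*:" }
}
$ports | Sort-Object -Unique
```

If this returns more than one port, connect to each and read $server.Databases[0].Name to let the user pick the model.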
$basePath = "$env:TEMP\tom_nuget\Microsoft.AnalysisServices.retail.amd64\lib\net45"
Add-Type -Path "$basePath\Microsoft.AnalysisServices.Core.dll"
Add-Type -Path "$basePath\Microsoft.AnalysisServices.Tabular.dll"
Add-Type -Path "$basePath\Microsoft.AnalysisServices.Tabular.Json.dll"
$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect("Data Source=localhost:<PORT>")
# PBI Desktop always has exactly one database
$db = $server.Databases[0]
$model = $db.Model
Only save after all changes are made. After modifications, persist with:
$model.SaveChanges()
Changes appear immediately in PBI Desktop. The user cannot undo with Ctrl+Z in Power BI, which is a disadvantage of this approach.
IMPORTANT: Always disconnect when modifications are done. NEVER leave a connection open; lingering connections can lead to orphaned processes.
$server.Disconnect()
Write-Output "Server: $($server.Name)"
Write-Output "Version: $($server.Version)"
Write-Output "Database: $($db.Name)"
Write-Output "Compatibility: $($db.CompatibilityLevel)"
Trigger a data refresh via TMSL (Tabular Model Scripting Language) or TOM's RequestRefresh API. This re-executes Power Query/M expressions and reloads data into the VertiPaq engine.
# Full refresh of a single table via TMSL
$dbName = $server.Databases[0].Name
$tmsl = '{ "refresh": { "type": "full", "objects": [{ "database": "' + $dbName + '", "table": "Sales" }] } }'
$server.Execute($tmsl)
# Or via TOM RequestRefresh API
$model.Tables["Sales"].RequestRefresh([Microsoft.AnalysisServices.Tabular.RefreshType]::Full)
$model.SaveChanges()
| Refresh Type | Behaviour |
|---|---|
| full | Drop data, re-query source, recalculate DAX |
| calculate | Recalculate DAX only (no source query) |
| automatic | Engine decides per-partition what's needed |
| dataOnly | Re-query source but skip DAX recalculation |
For detailed examples and all refresh methods, see refresh-model.md.
Add-Type -Path "$env:TEMP\tom_nuget\Microsoft.AnalysisServices.AdomdClient.retail.amd64\lib\net45\Microsoft.AnalysisServices.AdomdClient.dll"
$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection
$conn.ConnectionString = "Data Source=localhost:<PORT>"
$conn.Open()
All queries should preferably use SUMMARIZECOLUMNS.
Check dax.guide online for information about DAX functions, if necessary.
Important: ADOMD.NET returns fully-qualified column names (e.g., 'Monsters'[Name] not Name). Do not access columns by short name ($reader["Name"]) -- it fails silently and returns blank. Use $reader.GetName($i) to discover column names, then access by index:
$cmd = $conn.CreateCommand()
$cmd.CommandText = "EVALUATE SUMMARIZECOLUMNS('Table'[Column], ""@MeasureName"", [Measure])"
$reader = $cmd.ExecuteReader()
# Always iterate by index and use GetName() to map columns
while ($reader.Read()) {
for ($i = 0; $i -lt $reader.FieldCount; $i++) {
Write-Output "$($reader.GetName($i)): $($reader.GetValue($i))"
}
Write-Output "---"
}
$reader.Close()
Always fully qualify column references: 'Sales'[Amount], not [Amount]. This applies everywhere -- measures, calculated columns, queries. Unqualified columns cause ambiguity errors.
Always quote table names: 'Sales'[Amount], 'D&D 5E Monsters'[CR]. Even simple names like Sales should be quoted as 'Sales' for consistency.
Escape embedded double quotes in DAX by doubling them (e.g. ""[Total Revenue]"") inside PowerShell here-strings.
# Full table scan
$cmd.CommandText = "EVALUATE 'Sales'"
# Filtered with CALCULATETABLE
$cmd.CommandText = "EVALUATE CALCULATETABLE('Sales', 'Sales'[Region] = ""West"")"
# Aggregation
$cmd.CommandText = "EVALUATE SUMMARIZECOLUMNS('Date'[Year], ""@Total"", SUM('Sales'[Amount]))"
# Scalar via ROW
$cmd.CommandText = "EVALUATE ROW(""Result"", COUNTROWS('Sales'))"
# DMV queries (model metadata via SQL-like syntax)
$cmd.CommandText = "SELECT * FROM `$SYSTEM.TMSCHEMA_TABLES"
$cmd.CommandText = "SELECT * FROM `$SYSTEM.TMSCHEMA_MEASURES"
$cmd.CommandText = "SELECT * FROM `$SYSTEM.TMSCHEMA_COLUMNS"
$cmd.CommandText = "SELECT * FROM `$SYSTEM.TMSCHEMA_RELATIONSHIPS"
$conn.Close()
All modifications require a TOM connection (section 3). Call $model.SaveChanges() after each batch of changes.
For full CRUD examples of every object type, see tom-object-types.md.
Common object types and their TOM collections (not exhaustive -- see Microsoft TOM API docs for the full namespace):
| Object | Collection | Create | Read | Update | Delete |
|---|---|---|---|---|---|
| Table | $model.Tables | New-Object ...Table | $model.Tables["Name"] | Set properties | .Remove($obj) |
| Column | $table.Columns | New-Object ...DataColumn | $table.Columns["Name"] | Set properties | .Remove($obj) |
| Measure | $table.Measures | New-Object ...Measure | $table.Measures["Name"] | Set properties | .Remove($obj) |
| Calculated Column | $table.Columns | New-Object ...CalculatedColumn | Filter by type | Set .Expression | .Remove($obj) |
| Calculated Table | $model.Tables | Table + calculated partition | Check partition type | Set partition expr | .Remove($obj) |
| Relationship | $model.Relationships | New-Object ...SingleColumnRelationship | Index or filter | Set properties | .Remove($obj) |
| Hierarchy | $table.Hierarchies | New-Object ...Hierarchy | $table.Hierarchies["Name"] | Modify levels | .Remove($obj) |
| Role | $model.Roles | New-Object ...ModelRole | $model.Roles["Name"] | Set permissions | .Remove($obj) |
| Perspective | $model.Perspectives | New-Object ...Perspective | $model.Perspectives["Name"] | Toggle membership | .Remove($obj) |
| Culture | $model.Cultures | New-Object ...Culture | $model.Cultures["en-US"] | Set translations | .Remove($obj) |
| Partition | $table.Partitions | New-Object ...Partition | $table.Partitions["Name"] | Set source/expression | .Remove($obj) |
| Annotation | Any object | $obj.Annotations.Add(...) | $obj.Annotations["Key"] | Set .Value | .Remove($obj) |
| Expression | $model.Expressions | New-Object ...NamedExpression | $model.Expressions["Name"] | Set .Expression | .Remove($obj) |
| Data Source | $model.DataSources | New-Object ...StructuredDataSource | $model.DataSources["Name"] | Set connection | .Remove($obj) |
| Calculation Group | $model.Tables | Table with CalculationGroup | Filter by type | Add/remove items | .Remove($obj) |
Quick examples (inline):
# Add a measure
$m = New-Object Microsoft.AnalysisServices.Tabular.Measure
$m.Name = "Total Revenue"
$m.Expression = "SUM(Sales[Amount])"
$m.FormatString = "`$#,0"
$m.Description = "Sum of all sales amounts"
$model.Tables["Sales"].Measures.Add($m)
# Add a relationship
$rel = New-Object Microsoft.AnalysisServices.Tabular.SingleColumnRelationship
$rel.Name = "Sales_to_Date"
$rel.FromColumn = $model.Tables["Sales"].Columns["DateKey"]
$rel.ToColumn = $model.Tables["Date"].Columns["DateKey"]
$rel.FromCardinality = [Microsoft.AnalysisServices.Tabular.RelationshipEndCardinality]::Many
$rel.ToCardinality = [Microsoft.AnalysisServices.Tabular.RelationshipEndCardinality]::One
$model.Relationships.Add($rel)
# Rename a column
$model.Tables["Sales"].Columns["amt"].Name = "Amount"
# Hide a table
$model.Tables["Bridge"].IsHidden = $true
# Delete a measure
$m = $model.Tables["Sales"].Measures["Old Measure"]
$model.Tables["Sales"].Measures.Remove($m)
# Add a role with RLS
$role = New-Object Microsoft.AnalysisServices.Tabular.ModelRole
$role.Name = "Region Filter"
$role.ModelPermission = [Microsoft.AnalysisServices.Tabular.ModelPermission]::Read
$model.Roles.Add($role)
$tp = New-Object Microsoft.AnalysisServices.Tabular.TablePermission
$tp.Table = $model.Tables["Sales"]
$tp.FilterExpression = "[Region] = USERNAME()"
$role.TablePermissions.Add($tp)
$model.SaveChanges()
For complete TOM object type tables, PowerShell reflection patterns for discovering properties and enum values, and reading/setting property examples, see references/tom-object-types.md.
Before saving measure/column expressions, validate them by test-executing against the live model. This catches syntax errors, missing column references, and circular dependencies without persisting bad metadata.
# Validate a DAX expression before adding it as a measure
$testExpr = "SUM('Sales'[Amount]) / COUNTROWS('Sales')"
$cmd = $conn.CreateCommand()
$cmd.CommandText = "EVALUATE ROW(`"@Test`", $testExpr)"
try {
$reader = $cmd.ExecuteReader()
$reader.Close()
Write-Output "VALID"
} catch {
Write-Output "INVALID: $($_.Exception.Message)"
}
For calculated table expressions, wrap in COUNTROWS:
$tableExpr = "CALENDAR(DATE(2020,1,1), DATE(2030,12,31))"
$cmd.CommandText = "EVALUATE ROW(`"@Count`", COUNTROWS($tableExpr))"
For filter expressions (RLS), test with CALCULATETABLE:
$filterExpr = "'Sales'[Region] = `"West`""
$cmd.CommandText = "EVALUATE CALCULATETABLE(ROW(`"@OK`", 1), $filterExpr)"
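The three validation patterns above (scalar, calculated table, RLS filter) can be folded into one convenience function. Test-DaxExpression is a hypothetical local helper, not part of TOM or ADOMD.NET; it assumes $conn is an open AdomdConnection from the DAX query section:

```powershell
# Hypothetical helper: test-execute a DAX expression against the live model
# without persisting anything. Assumes $conn is an open AdomdConnection.
function Test-DaxExpression {
    param(
        [Parameter(Mandatory)] [string]$Expression,
        [ValidateSet('Scalar', 'Table', 'Filter')] [string]$Kind = 'Scalar'
    )
    $query = switch ($Kind) {
        'Scalar' { "EVALUATE ROW(`"@Test`", $Expression)" }
        'Table'  { "EVALUATE ROW(`"@Count`", COUNTROWS($Expression))" }
        'Filter' { "EVALUATE CALCULATETABLE(ROW(`"@OK`", 1), $Expression)" }
    }
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = $query
    try {
        $reader = $cmd.ExecuteReader()
        $reader.Close()
        [pscustomobject]@{ Valid = $true;  Error = $null }
    } catch {
        [pscustomobject]@{ Valid = $false; Error = $_.Exception.Message }
    }
}

# Usage:
Test-DaxExpression "SUM('Sales'[Amount]) / COUNTROWS('Sales')"
Test-DaxExpression "CALENDAR(DATE(2020,1,1), DATE(2030,12,31))" -Kind Table
Test-DaxExpression "'Sales'[Region] = `"West`"" -Kind Filter
```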
SaveChanges() applies all pending modifications in a single implicit transaction. If any object fails validation, the entire batch rolls back automatically.
For multi-step workflows where inspection or rollback is needed before committing:
try {
# Make changes (not yet persisted)
$model.Tables["Sales"].Measures["Revenue"].Name = "Total Revenue"
$model.Tables["Sales"].Measures["Cost"].Name = "Total Cost"
# Inspect before committing (changes are local to this connection)
foreach ($m in $model.Tables["Sales"].Measures) {
Write-Output " [$($m.Name)]"
}
# Commit all changes atomically
$model.SaveChanges()
Write-Output "Committed"
} catch {
# Discard all uncommitted changes
$model.UndoLocalChanges()
Write-Output "Rolled back: $($_.Exception.Message)"
}
UndoLocalChanges() discards all modifications made since the last SaveChanges(). This is the rollback mechanism for PBI Desktop; there is no explicit begin/commit transaction API on the local Analysis Services instance.
The TOM API does not expose a public Validate() method. Validation happens implicitly during SaveChanges() (which rolls back the entire batch on failure). For pre-save validation, inspect objects manually:
# Check measures have valid expressions (non-empty)
foreach ($m in ($model.Tables | ForEach-Object { $_.Measures }) ) {
if ([string]::IsNullOrWhiteSpace($m.Expression)) {
Write-Output "WARNING: Measure [$($m.Name)] in [$($m.Table.Name)] has no expression"
}
}
# Check relationships reference valid columns
foreach ($rel in $model.Relationships) {
$sr = [Microsoft.AnalysisServices.Tabular.SingleColumnRelationship]$rel
if ($sr.FromColumn -eq $null -or $sr.ToColumn -eq $null) {
Write-Output "WARNING: Relationship [$($sr.Name)] has null column references"
}
}
# Check for duplicate measure names across tables
$names = @{}
foreach ($m in ($model.Tables | ForEach-Object { $_.Measures })) {
if ($names.ContainsKey($m.Name)) {
Write-Output "WARNING: Duplicate measure name [$($m.Name)] in [$($m.Table.Name)] and [$($names[$m.Name])]"
}
$names[$m.Name] = $m.Table.Name
}
TOM does not expose the .pbix/.pbip file path directly. The most reliable method across all PBI Desktop install types is reading the FileHistory from User.zip in the PBI Desktop app data folder.
Primary method — FileHistory in User.zip (works for Store and non-Store):
# Read the most recently opened file from PBI Desktop's settings
$userZip = "$env:USERPROFILE\Microsoft\Power BI Desktop Store App\User.zip"
# For non-Store installs: "$env:LOCALAPPDATA\Microsoft\Power BI Desktop\User.zip"
Add-Type -Assembly System.IO.Compression.FileSystem
$z = [System.IO.Compression.ZipFile]::OpenRead($userZip)
$entry = $z.Entries | Where-Object { $_.Name -eq 'Settings.xml' }
$reader = New-Object System.IO.StreamReader($entry.Open())
$content = $reader.ReadToEnd()
$reader.Close()
$z.Dispose()
# Extract FileHistory entries (ordered by lastAccessedDate, most recent first)
$history = ($content -split '(?=<Entry)' | Where-Object { $_ -match 'FileHistory' })[0]
$json = [regex]::Match($history, 'Value="s\[(.*?)\]"').Groups[1].Value -replace '&quot;', '"'
$files = $json | ConvertFrom-Json
$files | Select-Object filePath, lastAccessedDate | Format-Table -AutoSize
The first entry is the most recently opened file. Files on the Mac (via Parallels) appear as \\Mac\Home\... paths.
Limitation: This is an imperfect method — it reads recent file history, not the currently open file. If multiple PBI Desktop instances are open, or the most recently accessed file in history isn't the one currently open, the result may be wrong. Confirm with the user if there is any ambiguity.
Fallback — window title (non-Store PBI Desktop only):
Get-Process PBIDesktop -ErrorAction SilentlyContinue | Select-Object Id, MainWindowTitle
Note: Store PBI Desktop (from Microsoft Store / WindowsApps) does not expose the file path in the window title — use the User.zip method above instead.
Fallback — msmdsrv command line (gives workspace path, not file path):
# Useful for finding the port; does NOT reveal the source file path
(Get-WmiObject Win32_Process -Filter "Name='msmdsrv.exe'").CommandLine
For .pbip projects, metadata files are human-readable JSON/TMDL on disk and can be read and modified directly.
Common targets:
| File | Purpose | Skill |
|---|---|---|
<Name>.Report/definition.pbir | Report-to-model connection (byPath or byConnection) | pbip |
<Name>.Report/definition/report.json | Report-level settings, theme, filters | pbir-format |
<Name>.SemanticModel/definition/*.tmdl | Model metadata (tables, measures, relationships) | tmdl |
<Name>.SemanticModel/definition/expressions.tmdl | M/Power Query shared expressions and parameters | tmdl |
For syntax, structure, and editing patterns for these files, load the relevant skill from the pbip plugin:
pbip -- project structure, file types, .pbir connection, forking
pbir-format -- report.json, visual.json, themes, filters, PBIR JSON schemas
tmdl -- TMDL syntax, measures, columns, roles, relationships
IMPORTANT: Power BI Desktop does not watch for external file changes. If you edit metadata files on disk while the report is open, the changes will be silently ignored or overwritten when PBI Desktop next saves.
To apply external file edits:
Close Power BI Desktop, make the edits on disk, then reopen the .pbix or .pbip file.
This is different from TOM modifications via $model.SaveChanges(), which apply immediately to the running instance without requiring a restart.
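As a quick sketch of reading those files directly: measure declarations in a .pbip semantic model can be grepped from its TMDL files. The $projectDir path is a hypothetical example (substitute the real <Name>.SemanticModel folder), and the regex only approximates TMDL's measure syntax:

```powershell
# List measure names declared in a .pbip semantic model's TMDL files.
# $projectDir is a placeholder -- point it at the actual <Name>.SemanticModel folder.
$projectDir = "C:\Repos\MyReport\MyReport.SemanticModel"
Get-ChildItem "$projectDir\definition" -Filter *.tmdl -Recurse |
    Select-String -Pattern '^\s*measure\s+(.+?)\s*=' |
    ForEach-Object { $_.Matches[0].Groups[1].Value }
```

Remember that edits made this way are only picked up after PBI Desktop is restarted, per the note above.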
| Topic | URL |
|---|---|
| TOM API Reference | learn.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.tabular |
| TOM Overview | learn.microsoft.com/en-us/analysis-services/tom/introduction-to-the-tabular-object-model-tom-in-analysis-services-amo |
| ADOMD.NET Reference | learn.microsoft.com/en-us/dotnet/api/microsoft.analysisservices.adomdclient |
| Client Libraries | learn.microsoft.com/en-us/analysis-services/client-libraries |
| DMV Reference | learn.microsoft.com/en-us/analysis-services/instances/use-dynamic-management-views-dmvs-to-monitor-analysis-services |
| DAX Reference | dax.guide |
| Compatibility Levels | learn.microsoft.com/en-us/analysis-services/tabular-models/compatibility-level-for-tabular-models-in-analysis-services |
To retrieve current TOM/ADOMD.NET reference docs, use microsoft_docs_search + microsoft_docs_fetch (MCP) if available, otherwise mslearn search + mslearn fetch (CLI). Search based on the user's request and run multiple searches as needed to ensure sufficient context before proceeding.
Skill references:
Agents:
query-listener -- Dispatch to capture live visual DAX queries in real time; polls DISCOVER_SESSIONS and reports query text + timings
Example scripts in scripts/:
connect-and-enumerate.ps1 - Connect to PBI Desktop and list all tables, columns, measures, relationships
explore-model.ps1 - Hierarchical metadata enumeration (tables, columns, measures, hierarchies, partitions, relationships, roles, perspectives, cultures, expressions, data sources)
query-dax.ps1 - Execute DAX queries via ADOMD.NET with formatted output
refresh-table.ps1 - Refresh a table or entire model via TMSL with configurable refresh type
modify-tom-objects.ps1 - Create table, rename measures, set folders/formats, hide columns, create relationship (with undo)
create-field-parameter.ps1 - Create a field parameter table from a list of measures with all required metadata
connect-from-mac.sh - macOS wrapper that runs PowerShell scripts in a Parallels VM via prlctl exec
External references:
dax.guide/<function>/ for individual function reference