Use the Microsoft Fabric CLI (fab) to manage workspaces, semantic models, reports, notebooks, lakehouses, and other Fabric resources through a file-system metaphor and familiar commands. Use when deploying Fabric items, running jobs, querying data, managing OneLake files, or automating Fabric operations. Invoke this skill automatically whenever a user mentions the Fabric CLI, fab, or Fabric.
```
/plugin marketplace add data-goblin/fabric-cli-plugin
/plugin install fabric-cli-plugin@fabric-cli-plugin
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
Skill contents:

- agent-templates/CLAUDE.md
- references/admin.md
- references/create-workspaces.md
- references/fab-api.md
- references/notebooks.md
- references/querying-data.md
- references/quickstart.md
- references/reference.md
- references/reports.md
- references/semantic-models.md
- references/workspaces.md
- scripts/README.md
- scripts/create_direct_lake_model.py
- scripts/download_workspace.py
- scripts/execute_dax.py
- scripts/export_semantic_model_as_pbip.py
- scripts/search_across_workspaces.py
- servers/win32/AnalysisServices.AppSettings.json
- servers/win32/LICENSE
- servers/win32/Resources/calendar_instructions_and_examples.md

Note: If you have access to a Bash tool (e.g., Claude Code), execute `fab` commands directly via Bash rather than using an MCP server.
Expert guidance for using the fab CLI to programmatically manage Microsoft Fabric.
Activate automatically when tasks involve:
- Microsoft Fabric, `fab`, or `fab` commands

Guidelines:

- Verify paths with `fab ls` or `fab exists` before proceeding
- Before running `fab` commands, check `fab auth status` to make sure the user is authenticated. If not, ask the user to run `fab auth login` to log in
- Run `fab --help` and `fab <command> --help` the first time you use a command, to understand its syntax first
- Run a `fab` command alone first, before piping it
- Pass `-f` when executing a command if the flag is available, to run non-interactively
- Use `fab` in non-interactive mode; interactive mode doesn't work with coding agents

Quick start:

```
fab auth login          # Authenticate (opens browser)
fab auth status         # Verify authentication
fab ls                  # List your workspaces
fab ls "Name.Workspace" # List items in a workspace
```
Most workflows need IDs. Extract them like this:
WS_ID=$(fab get "ws.Workspace" -q "id" | tr -d '"')
MODEL_ID=$(fab get "ws.Workspace/Model.SemanticModel" -q "id" | tr -d '"')
# Then use in API calls
fab api -A powerbi "groups/$WS_ID/datasets/$MODEL_ID/refreshes" -X post -i '{"type":"Full"}'
New to the Fabric CLI? Start with the guides under references/ (quickstart.md first). The table below is a quick reference for the most common commands.
| Command | Purpose | Example |
|---|---|---|
| **Finding Items** | | |
| `fab ls` | List items | `fab ls "Sales.Workspace"` |
| `fab ls -l` | List with details | `fab ls "Sales.Workspace" -l` |
| `fab exists` | Check if exists | `fab exists "Sales.Workspace/Model.SemanticModel"` |
| `fab get` | Get item details | `fab get "Sales.Workspace/Model.SemanticModel"` |
| `fab get -q` | Query specific field | `fab get "Sales.Workspace" -q "id"` |
| **Definitions** | | |
| `fab get -q "definition"` | Get full definition | `fab get "ws.Workspace/Model.SemanticModel" -q "definition"` |
| `fab export` | Export to local | `fab export "ws.Workspace/Nb.Notebook" -o ./backup` |
| `fab import` | Import from local | `fab import "ws.Workspace/Nb.Notebook" -i ./backup/Nb.Notebook` |
| **Running Jobs** | | |
| `fab job run` | Run synchronously | `fab job run "ws.Workspace/ETL.Notebook"` |
| `fab job start` | Run asynchronously | `fab job start "ws.Workspace/ETL.Notebook"` |
| `fab job run -P` | Run with params | `fab job run "ws.Workspace/Nb.Notebook" -P date:string=2025-01-01` |
| `fab job run-list` | List executions | `fab job run-list "ws.Workspace/Nb.Notebook"` |
| `fab job run-status` | Check status | `fab job run-status "ws.Workspace/Nb.Notebook" --id <job-id>` |
| **Refreshing Models** | | |
| `fab api -A powerbi` | Trigger refresh | `fab api -A powerbi "groups/<ws-id>/datasets/<model-id>/refreshes" -X post -i '{"type":"Full"}'` |
| `fab api -A powerbi` | Check refresh status | `fab api -A powerbi "groups/<ws-id>/datasets/<model-id>/refreshes?\$top=1"` |
| **DAX Queries** | | |
| `fab get -q "definition"` | Get model schema first | `fab get "ws.Workspace/Model.SemanticModel" -q "definition"` |
| `fab api -A powerbi` | Execute DAX | `fab api -A powerbi "groups/<ws-id>/datasets/<model-id>/executeQueries" -X post -i '{"queries":[{"query":"EVALUATE..."}]}'` |
| **Lakehouse** | | |
| `fab ls` | Browse files/tables | `fab ls "ws.Workspace/LH.Lakehouse/Files"` |
| `fab table schema` | Get table schema | `fab table schema "ws.Workspace/LH.Lakehouse/Tables/sales"` |
| `fab cp` | Upload/download | `fab cp ./local.csv "ws.Workspace/LH.Lakehouse/Files/"` |
| **Management** | | |
| `fab cp` | Copy items | `fab cp "dev.Workspace/Item.Type" "prod.Workspace" -f` |
| `fab set` | Update properties | `fab set "ws.Workspace/Item.Type" -q displayName -i "New Name"` |
| `fab rm` | Delete item | `fab rm "ws.Workspace/Item.Type" -f` |
Fabric uses filesystem-like paths with type extensions:
/WorkspaceName.Workspace/ItemName.ItemType
For lakehouses, paths extend into Files and Tables:
/WorkspaceName.Workspace/LakehouseName.Lakehouse/Files/FileName.extension or /WorkspaceName.Workspace/LakehouseName.Lakehouse/Tables/TableName
For Fabric capacities, use `fab ls .capacities`.
Examples:
"/Production.Workspace/Sales Model.SemanticModel"/Data.Workspace/MainLH.Lakehouse/Files/data.csv/Data.Workspace/MainLH.Lakehouse/Tables/dbo/customers.Workspace - Workspaces.SemanticModel - Power BI datasets.Report - Power BI reports.Notebook - Fabric notebooks.DataPipeline - Data pipelines.Lakehouse / .Warehouse - Data stores.SparkJobDefinition - Spark jobsFull list: 35+ types. Use fab desc .<ItemType> to explore.
# List resources
fab ls # List workspaces
fab ls "Production.Workspace" # List items in workspace
fab ls "Production.Workspace" -l # Detailed listing
fab ls "Data.Workspace/LH.Lakehouse" # List lakehouse contents
# Check existence
fab exists "Production.Workspace/Sales.SemanticModel"
# Get details
fab get "Production.Workspace/Sales.Report"
fab get "Production.Workspace" -q "id" # Query with JMESPath
# Create workspace after using `fab ls .capacities` to check capacities
fab mkdir "NewWorkspace.Workspace" -P capacityname=MyCapacity
# Create items
fab mkdir "Production.Workspace/NewLakehouse.Lakehouse"
fab mkdir "Production.Workspace/Pipeline.DataPipeline"
# Update properties
fab set "Production.Workspace/Item.Notebook" -q displayName -i "New Name"
fab set "Production.Workspace" -q description -i "Production environment"
# Copy between workspaces
fab cp "Dev.Workspace/Pipeline.DataPipeline" "Production.Workspace"
fab cp "Dev.Workspace/Report.Report" "Production.Workspace/ProdReport.Report"
# Export to local
fab export "Production.Workspace/Model.SemanticModel" -o /tmp/exports
fab export "Production.Workspace" -o /tmp/backup -a # Export all items
# Import from local
fab import "Production.Workspace/Pipeline.DataPipeline" -i /tmp/exports/Pipeline.DataPipeline -f
# IMPORTANT: Use -f flag for non-interactive execution
# Without -f, import/export operations expect an interactive terminal for confirmation
# This will fail in scripts, automation, or when stdin is not a terminal
fab import "ws.Workspace/Item.Type" -i ./Item.Type -f # Required for scripts
Direct REST API access with automatic authentication.
Audiences:
- `fabric` (default) - Fabric REST API
- `powerbi` - Power BI REST API
- `storage` - OneLake Storage API
- `azure` - Azure Resource Manager

# Fabric API (default)
fab api workspaces
fab api workspaces -q "value[?type=='Workspace']"
fab api "workspaces/<workspace-id>/items"
# Power BI API (for DAX queries, dataset operations)
fab api -A powerbi groups
fab api -A powerbi "datasets/<model-id>/executeQueries" -X post -i '{"queries": [{"query": "EVALUATE VALUES(Date[Year])"}]}'
# POST/PUT/DELETE
fab api -X post "workspaces/<ws-id>/items" -i '{"displayName": "New Item", "type": "Lakehouse"}'
fab api -X put "workspaces/<ws-id>/items/<item-id>" -i /tmp/config.json
fab api -X delete "workspaces/<ws-id>/items/<item-id>"
# OneLake Storage API
fab api -A storage "WorkspaceName.Workspace/LH.Lakehouse/Files" -P resource=filesystem,recursive=false
# Run synchronously (wait for completion)
fab job run "Production.Workspace/ETL.Notebook"
fab job run "Production.Workspace/Pipeline.DataPipeline" --timeout 300
# Run with parameters
fab job run "Production.Workspace/ETL.Notebook" -P date:string=2024-01-01,batch:int=1000,debug:bool=false
# Start asynchronously
fab job start "Production.Workspace/LongProcess.Notebook"
# Monitor
fab job run-list "Production.Workspace/ETL.Notebook"
fab job run-status "Production.Workspace/ETL.Notebook" --id <job-id>
# Schedule
fab job run-sch "Production.Workspace/Pipeline.DataPipeline" --type daily --interval 10:00,16:00 --start 2024-11-15T09:00:00 --enable
fab job run-sch "Production.Workspace/Pipeline.DataPipeline" --type weekly --interval 10:00 --days Monday,Friday --enable
# Cancel
fab job run-cancel "Production.Workspace/ETL.Notebook" --id <job-id>
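For long-running jobs, a common pattern is to start asynchronously and poll. This is a sketch under two assumptions worth verifying with `--help`: that `fab job start` prints the new job instance id, and that `fab job run-status` output contains a terminal state string such as Completed or Failed:

```bash
# Assumption: the job id appears in fab job start's output as a GUID.
JOB_ID=$(fab job start "Production.Workspace/ETL.Notebook" \
  | grep -oiE '[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}' | head -n 1)

# Poll until a terminal state shows up in the status output.
while true; do
  STATUS=$(fab job run-status "Production.Workspace/ETL.Notebook" --id "$JOB_ID")
  echo "$STATUS"
  echo "$STATUS" | grep -qiE 'Completed|Failed|Cancelled' && break
  sleep 30
done
```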
# View schema
fab table schema "Data.Workspace/LH.Lakehouse/Tables/dbo/customers"
# Load data (non-schema lakehouses only)
fab table load "Data.Workspace/LH.Lakehouse/Tables/sales" --file "Data.Workspace/LH.Lakehouse/Files/daily_sales.csv" --mode append
# Optimize (lakehouses only)
fab table optimize "Data.Workspace/LH.Lakehouse/Tables/transactions" --vorder --zorder customer_id,region
# Vacuum (lakehouses only)
fab table vacuum "Data.Workspace/LH.Lakehouse/Tables/temp_data" --retain_n_hours 48
# Find models
fab ls "ws.Workspace" | grep ".SemanticModel"
# Get definition
fab get "ws.Workspace/Model.SemanticModel" -q definition
# Trigger refresh
fab api -A powerbi "groups/$WS_ID/datasets/$MODEL_ID/refreshes" -X post -i '{"type":"Full"}'
# Check refresh status
fab api -A powerbi "groups/$WS_ID/datasets/$MODEL_ID/refreshes?\$top=1"
Execute DAX:
fab api -A powerbi "groups/$WS_ID/datasets/$MODEL_ID/executeQueries" -X post \
-i '{"queries":[{"query":"EVALUATE TOPN(5, '\''TableName'\'')"}]}'
DAX rules: EVALUATE required, single quotes around tables ('Sales'), qualify columns ('Sales'[Amount]).
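For anything beyond a one-liner, shell-escaping DAX inline gets painful. One way around it, following the file-input pattern from the API section (`-i /tmp/config.json`), is to write the request body to a file first; the table and column names below are hypothetical:

```bash
# Write the request body once; \" escapes the quotes DAX needs for
# measure names inside the JSON string.
cat > /tmp/dax_query.json <<'EOF'
{
  "queries": [
    {"query": "EVALUATE SUMMARIZECOLUMNS('Date'[Year], \"Total Sales\", SUM('Sales'[Amount]))"}
  ]
}
EOF

fab api -A powerbi "groups/$WS_ID/datasets/$MODEL_ID/executeQueries" \
  -X post -i /tmp/dax_query.json
```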
For full details: semantic-models.md | querying-data.md
# Get report definition
fab get "ws.Workspace/Report.Report" -q definition
# Export to local
fab export "ws.Workspace/Report.Report" -o /tmp/exports -f
# Import from local
fab import "ws.Workspace/Report.Report" -i /tmp/exports/Report.Report -f
# Rebind to different model
fab set "ws.Workspace/Report.Report" -q semanticModelId -i "<new-model-id>"
For full details: reports.md
# Browse contents
fab ls "Data.Workspace/LH.Lakehouse/Files"
fab ls "Data.Workspace/LH.Lakehouse/Tables/dbo"
# Upload/download files
fab cp ./local-data.csv "Data.Workspace/LH.Lakehouse/Files/data.csv"
fab cp "Data.Workspace/LH.Lakehouse/Files/data.csv" ~/Downloads/
# Load and optimize tables
fab table load "Data.Workspace/LH.Lakehouse/Tables/sales" --file "Data.Workspace/LH.Lakehouse/Files/sales.csv"
fab table optimize "Data.Workspace/LH.Lakehouse/Tables/sales" --vorder --zorder customer_id
# Export from dev
fab export "Dev.Workspace" -o /tmp/migration -a
# Import to production (item by item)
fab import "Production.Workspace/Pipeline.DataPipeline" -i /tmp/migration/Pipeline.DataPipeline
fab import "Production.Workspace/Report.Report" -i /tmp/migration/Report.Report
Use scripts/search_across_workspaces.py for cross-workspace search with rich metadata not available elsewhere:
# Find all semantic models (use "Model" not "SemanticModel")
python3 scripts/search_across_workspaces.py --type Model
# Find models by name
python3 scripts/search_across_workspaces.py --type Model --filter "Sales"
# Find stale items (not visited in 6+ months)
python3 scripts/search_across_workspaces.py --type Model --not-visited-since 2024-06-01
# Find items by owner
python3 scripts/search_across_workspaces.py --type PowerBIReport --owner "kurt"
# Find Direct Lake models only
python3 scripts/search_across_workspaces.py --type Model --storage-mode directlake
# Find items in workspace
python3 scripts/search_across_workspaces.py --type Lakehouse --workspace "fit-data"
# Get JSON output
python3 scripts/search_across_workspaces.py --type Model --output json
# Sort by last visited (oldest first)
python3 scripts/search_across_workspaces.py --type Model --sort last-visited --sort-order asc
# List all available types
python3 scripts/search_across_workspaces.py --list-types
Unique DataHub fields (not available via fab api or admin APIs):
- `lastVisitedTimeUTC` - When the item was last opened/used
- `storageMode` - Import, DirectQuery, or DirectLake
- `ownerUser` - Full owner details (name, email)
- `capacitySku` - F2, F64, PP, etc.
- `isDiscoverable` - Whether the item appears in search

Important type mappings:

- `--type Model` (not SemanticModel)
- `--type DataFlow` (capital F)
- `--type SynapseNotebook`

If you have Fabric/Power BI admin access:
# Find semantic models by name (cross-workspace)
fab api "admin/items" -P "type=SemanticModel" -q "itemEntities[?contains(name, 'Sales')]"
# Find all notebooks
fab api "admin/items" -P "type=Notebook" -q "itemEntities[].{name:name,workspace:workspaceId}"
# Find all lakehouses
fab api "admin/items" -P "type=Lakehouse"
# Common types: SemanticModel, Report, Notebook, Lakehouse, Warehouse, DataPipeline, Ontology
For full admin API reference: admin.md
Filter and transform JSON responses with -q:
# Get single field
-q "id"
-q "displayName"
# Get nested field
-q "properties.sqlEndpointProperties"
-q "definition.parts[0]"
# Filter arrays
-q "value[?type=='Lakehouse']"
-q "value[?contains(name, 'prod')]"
# Get first element
-q "value[0]"
-q "definition.parts[?path=='model.tmdl'] | [0]"
# Show response headers
fab api workspaces --show_headers
# Verbose output
fab get "Production.Workspace/Item" -v
# Save responses for debugging
fab api workspaces -o /tmp/workspaces.json
Performance tips:

- Use `ls` for fast listing - much faster than `get`
- Use `exists` before operations - check before get/modify
- Use `-q` - get only what you need

Common flags:

- `-f, --force` - Skip confirmation prompts
- `-v, --verbose` - Verbose output
- `-l` - Long format listing
- `-a` - Show hidden items
- `-o, --output` - Output file path
- `-i, --input` - Input file or JSON string
- `-q, --query` - JMESPath query
- `-P, --params` - Parameters (key=value)
- `-X, --method` - HTTP method (get/post/put/delete/patch)
- `-A, --audience` - API audience (fabric/powerbi/storage/azure)
- `--show_headers` - Show response headers
- `--timeout` - Timeout in seconds

Checklist:

- Verify `fab` is installed and authenticated
- Use type extensions (`.Workspace`, `.SemanticModel`, etc.)
- Quote paths containing spaces: `"My Workspace.Workspace"`
- Use `-f` for non-interactive scripts (skips prompts)
- Use the Power BI audience (`-A powerbi`) for DAX queries and dataset operations

For specific item type help:
fab desc .<ItemType>
For command help:
fab --help
fab <command> --help
Skill references: references/quickstart.md, references/reference.md, references/workspaces.md, references/create-workspaces.md, references/semantic-models.md, references/reports.md, references/notebooks.md, references/querying-data.md, references/fab-api.md, references/admin.md
External references (request markdown when possible):
- `dax.guide/<function>/` e.g. `dax.guide/addcolumns/`
- `powerquery.guide/function/<function>`
This skill should be used when the user asks to "create a slash command", "add a command", "write a custom command", "define command arguments", "use command frontmatter", "organize commands", "create command with file references", "interactive command", "use AskUserQuestion in command", or needs guidance on slash command structure, YAML frontmatter fields, dynamic arguments, bash execution in commands, user interaction patterns, or command development best practices for Claude Code.
This skill should be used when the user asks to "create a hook", "add a PreToolUse/PostToolUse/Stop hook", "validate tool use", "implement prompt-based hooks", "use ${CLAUDE_PLUGIN_ROOT}", "set up event-driven automation", "block dangerous commands", or mentions hook events (PreToolUse, PostToolUse, Stop, SubagentStop, SessionStart, SessionEnd, UserPromptSubmit, PreCompact, Notification). Provides comprehensive guidance for creating and implementing Claude Code plugin hooks with focus on advanced prompt-based hooks API.