Microsoft Fabric Lakehouse, OneLake, and Fabric Warehouse connectors for Azure Data Factory (2025)
/plugin marketplace add JosiahSiegel/claude-code-marketplace
/plugin install adf-master@claude-plugin-marketplace
This skill inherits all available tools. When active, it can use any tool Claude has access to.
MANDATORY: Always Use Backslashes on Windows for File Paths
When using Edit or Write tools on Windows, you MUST use backslashes (\) in file paths, NOT forward slashes (/).
Examples:
Incorrect: D:/repos/project/file.tsx
Correct: D:\repos\project\file.tsx
This applies to: Edit tool file_path arguments, Write tool file_path arguments, and any other tool parameter that takes a Windows path.
NEVER create new documentation files unless explicitly requested by the user.
Microsoft Fabric represents a unified SaaS analytics platform that combines Power BI, Azure Synapse Analytics, and Azure Data Factory capabilities. Azure Data Factory now provides native connectors for Fabric Lakehouse and Fabric Warehouse, enabling seamless data movement between ADF and Fabric workspaces.
The Fabric Lakehouse connector enables both read and write operations to Microsoft Fabric Lakehouse for tables and files.
Using Service Principal Authentication (Recommended):
{
"name": "FabricLakehouseLinkedService",
"type": "Microsoft.DataFactory/factories/linkedservices",
"properties": {
"type": "Lakehouse",
"typeProperties": {
"workspaceId": "12345678-1234-1234-1234-123456789abc",
"artifactId": "87654321-4321-4321-4321-cba987654321",
"servicePrincipalId": "<app-registration-client-id>",
"servicePrincipalKey": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "AzureKeyVault",
"type": "LinkedServiceReference"
},
"secretName": "fabric-service-principal-key"
},
"tenant": "<tenant-id>"
}
}
}
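Before this linked service can connect, the app registration must be granted access to the target Fabric workspace, and the tenant setting that allows service principals to use Fabric APIs must be enabled (see the permissions setup later in this guide).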
Using Managed Identity Authentication (Preferred 2025):
{
"name": "FabricLakehouseLinkedService_ManagedIdentity",
"type": "Microsoft.DataFactory/factories/linkedservices",
"properties": {
"type": "Lakehouse",
"typeProperties": {
"workspaceId": "12345678-1234-1234-1234-123456789abc",
"artifactId": "87654321-4321-4321-4321-cba987654321"
// Managed identity used automatically - no credentials needed!
}
}
}
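No credentials are stored here: grant the data factory's system-assigned managed identity access to the Fabric workspace (the identity appears under the data factory's name), and the connection works without secrets or Key Vault references.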
Finding Workspace and Artifact IDs:
Open the Lakehouse in the Fabric portal and read both GUIDs from the browser URL: https://app.powerbi.com/groups/<workspaceId>/... The workspaceId follows /groups/, and the artifactId is the ID of the Lakehouse item itself.
For Lakehouse Files:
{
  "name": "FabricLakehouseFiles",
  "properties": {
    "type": "Parquet", // Files use a format dataset (Parquet here) with a Lakehouse location
    "linkedServiceName": {
      "referenceName": "FabricLakehouseLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "LakehouseLocation",
        "folderPath": "raw/sales/2025"
      }
    }
  }
}
For Lakehouse Tables:
{
"name": "FabricLakehouseTables",
"properties": {
"type": "LakehouseTable",
"linkedServiceName": {
"referenceName": "FabricLakehouseLinkedService",
"type": "LinkedServiceReference"
},
"typeProperties": {
"table": "SalesData" // Table name in Lakehouse
}
}
}
Copy from Azure SQL to Fabric Lakehouse:
{
"name": "CopyToFabricLakehouse",
"type": "Copy",
"inputs": [
{
"referenceName": "AzureSqlSource",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "FabricLakehouseTables",
"type": "DatasetReference",
"parameters": {
"tableName": "DimCustomer"
}
}
],
"typeProperties": {
"source": {
"type": "AzureSqlSource",
"sqlReaderQuery": "SELECT * FROM dbo.Customers WHERE ModifiedDate > '@{pipeline().parameters.LastRunTime}'"
},
"sink": {
"type": "LakehouseTableSink",
"tableActionOption": "append" // or "overwrite"
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": { "name": "CustomerID" },
"sink": { "name": "customer_id", "type": "Int32" }
},
{
"source": { "name": "CustomerName" },
"sink": { "name": "customer_name", "type": "String" }
}
]
}
}
}
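The TabularTranslator above renames columns to snake_case and casts types during the copy; when source and sink column names already match, the translator block can be omitted and ADF maps columns by name.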
Copy Parquet Files to Fabric Lakehouse:
{
"name": "CopyParquetToLakehouse",
"type": "Copy",
"inputs": [
{
"referenceName": "AzureBlobParquetFiles",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "FabricLakehouseFiles",
"type": "DatasetReference"
}
],
"typeProperties": {
"source": {
"type": "ParquetSource",
"storeSettings": {
"type": "AzureBlobStorageReadSettings",
"recursive": true,
"wildcardFolderPath": "raw/sales/2025",
"wildcardFileName": "*.parquet"
}
},
"sink": {
"type": "LakehouseFileSink",
"storeSettings": {
"type": "LakehouseWriteSettings",
"copyBehavior": "PreserveHierarchy"
}
}
}
}
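PreserveHierarchy keeps each file's folder structure relative to the source path when writing into the Lakehouse Files area; FlattenHierarchy and MergeFiles are the alternatives when a flat layout or a single output file is wanted.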
Lookup Against a Lakehouse Table:
{
"name": "LookupFabricLakehouseTable",
"type": "Lookup",
"typeProperties": {
"source": {
"type": "LakehouseTableSource",
"query": "SELECT MAX(LastUpdated) as MaxDate FROM SalesData"
},
"dataset": {
"referenceName": "FabricLakehouseTables",
"type": "DatasetReference"
}
}
}
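Downstream activities read the result with @{activity('LookupFabricLakehouseTable').output.firstRow.MaxDate}, the standard pattern for feeding a Lookup value into a source query filter.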
The Fabric Warehouse connector provides T-SQL based data warehousing capabilities within the Fabric ecosystem.
Using Service Principal:
{
"name": "FabricWarehouseLinkedService",
"type": "Microsoft.DataFactory/factories/linkedservices",
"properties": {
"type": "Warehouse",
"typeProperties": {
"endpoint": "myworkspace.datawarehouse.fabric.microsoft.com",
"warehouse": "MyWarehouse",
"authenticationType": "ServicePrincipal",
"servicePrincipalId": "<app-registration-id>",
"servicePrincipalKey": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "AzureKeyVault",
"type": "LinkedServiceReference"
},
"secretName": "fabric-warehouse-sp-key"
},
"tenant": "<tenant-id>"
}
}
}
Using System-Assigned Managed Identity (Recommended):
{
"name": "FabricWarehouseLinkedService_SystemMI",
"type": "Microsoft.DataFactory/factories/linkedservices",
"properties": {
"type": "Warehouse",
"typeProperties": {
"endpoint": "myworkspace.datawarehouse.fabric.microsoft.com",
"warehouse": "MyWarehouse",
"authenticationType": "SystemAssignedManagedIdentity"
}
}
}
Using User-Assigned Managed Identity:
{
"name": "FabricWarehouseLinkedService_UserMI",
"type": "Microsoft.DataFactory/factories/linkedservices",
"properties": {
"type": "Warehouse",
"typeProperties": {
"endpoint": "myworkspace.datawarehouse.fabric.microsoft.com",
"warehouse": "MyWarehouse",
"authenticationType": "UserAssignedManagedIdentity",
"credential": {
"referenceName": "UserAssignedManagedIdentityCredential",
"type": "CredentialReference"
}
}
}
}
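The credential block assumes a user-assigned managed identity credential has already been registered in the data factory (Manage > Credentials) and that the identity itself has been granted access to the Fabric workspace.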
Bulk Insert Pattern:
{
"name": "CopyToFabricWarehouse",
"type": "Copy",
"inputs": [
{
"referenceName": "AzureSqlSource",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "FabricWarehouseSink",
"type": "DatasetReference"
}
],
"typeProperties": {
"source": {
"type": "AzureSqlSource",
"sqlReaderQuery": "SELECT * FROM dbo.FactSales WHERE OrderDate >= '@{pipeline().parameters.StartDate}'"
},
"sink": {
"type": "WarehouseSink",
"preCopyScript": "TRUNCATE TABLE staging.FactSales",
"writeBehavior": "insert",
"writeBatchSize": 10000,
"tableOption": "autoCreate", // Auto-create table if doesn't exist
"disableMetricsCollection": false
},
"enableStaging": true,
"stagingSettings": {
"linkedServiceName": {
"referenceName": "AzureBlobStorage",
"type": "LinkedServiceReference"
},
"path": "staging/fabric-warehouse",
"enableCompression": true
},
"parallelCopies": 4,
"dataIntegrationUnits": 8
}
}
Upsert Pattern:
{
"sink": {
"type": "WarehouseSink",
"writeBehavior": "upsert",
"upsertSettings": {
"useTempDB": true,
"keys": ["customer_id"],
"interimSchemaName": "staging"
},
"writeBatchSize": 10000
}
}
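As with the SQL sink upsert settings this mirrors, rows are first landed in an interim table and then merged into the target on the customer_id key, making reruns of the same slice idempotent; useTempDB places the interim table in tempdb, while interimSchemaName names the schema used when useTempDB is false.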
Stored Procedure Activity:
{
"name": "ExecuteFabricWarehouseStoredProcedure",
"type": "SqlServerStoredProcedure",
"linkedServiceName": {
"referenceName": "FabricWarehouseLinkedService",
"type": "LinkedServiceReference"
},
"typeProperties": {
"storedProcedureName": "dbo.usp_ProcessSalesData",
"storedProcedureParameters": {
"StartDate": {
"value": "@pipeline().parameters.StartDate",
"type": "DateTime"
},
"EndDate": {
"value": "@pipeline().parameters.EndDate",
"type": "DateTime"
}
}
}
}
Script Activity:
{
"name": "ExecuteFabricWarehouseScript",
"type": "Script",
"linkedServiceName": {
"referenceName": "FabricWarehouseLinkedService",
"type": "LinkedServiceReference"
},
"typeProperties": {
"scripts": [
{
"type": "Query",
"text": "DELETE FROM staging.FactSales WHERE LoadDate < DATEADD(day, -30, GETDATE())"
},
{
"type": "Query",
"text": "UPDATE dbo.FactSales SET ProcessedFlag = 1 WHERE OrderDate = '@{pipeline().parameters.ProcessDate}'"
}
],
"scriptBlockExecutionTimeout": "02:00:00"
}
}
Concept: Use OneLake shortcuts instead of copying data
OneLake shortcuts allow you to reference data in Azure Data Lake Storage Gen2 without physically copying it (a shortcut-creation sketch follows the benefits list):
Benefits:
- No data duplication: the data stays in ADLS Gen2 and appears in the Lakehouse as if it were local
- No copy latency: new source files are visible immediately
- Lower storage cost: only one physical copy exists
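Shortcuts are typically created in the Fabric portal, but they can also be scripted. The sketch below assumes the OneLake shortcuts REST endpoint and an adlsGen2 target payload; the exact field names and the connectionId (a Fabric connection to the storage account) should be verified against the current Fabric REST API reference.
POST https://api.fabric.microsoft.com/v1/workspaces/<workspaceId>/items/<lakehouseId>/shortcuts
{
  "path": "Files",
  "name": "raw_sales",
  "target": {
    "adlsGen2": {
      "location": "https://<storage-account>.dfs.core.windows.net",
      "subpath": "/<container>/raw/sales",
      "connectionId": "<fabric-connection-id>"
    }
  }
}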
ADF Pipeline Pattern:
{
"name": "PL_Process_Shortcut_Data",
"activities": [
{
"name": "TransformShortcutData",
"type": "ExecuteDataFlow",
"typeProperties": {
"dataFlow": {
"referenceName": "DF_Transform",
"type": "DataFlowReference"
},
"compute": {
"coreCount": 8,
"computeType": "General"
}
}
},
{
"name": "WriteToCuratedZone",
"type": "Copy",
"typeProperties": {
"source": {
"type": "ParquetSource"
},
"sink": {
"type": "LakehouseTableSink",
"tableActionOption": "overwrite"
}
}
}
]
}
Incremental Load with Watermark Pattern:
{
"name": "PL_Incremental_Load_To_Fabric",
"activities": [
{
"name": "GetLastWatermark",
"type": "Lookup",
"typeProperties": {
"source": {
"type": "LakehouseTableSource",
"query": "SELECT MAX(LoadTimestamp) as LastLoad FROM ControlTable"
}
}
},
{
"name": "CopyIncrementalData",
"type": "Copy",
"dependsOn": [
{
"activity": "GetLastWatermark",
"dependencyConditions": ["Succeeded"]
}
],
"typeProperties": {
"source": {
"type": "AzureSqlSource",
"sqlReaderQuery": "SELECT * FROM dbo.Orders WHERE ModifiedDate > '@{activity('GetLastWatermark').output.firstRow.LastLoad}'"
},
"sink": {
"type": "LakehouseTableSink",
"tableActionOption": "append"
}
}
},
{
"name": "UpdateWatermark",
"type": "Script",
"dependsOn": [
{
"activity": "CopyIncrementalData",
"dependencyConditions": ["Succeeded"]
}
],
"linkedServiceName": {
"referenceName": "FabricLakehouseLinkedService",
"type": "LinkedServiceReference"
},
"typeProperties": {
"scripts": [
{
"type": "Query",
"text": "INSERT INTO ControlTable VALUES ('@{utcnow()}')"
}
]
}
}
]
}
NEW 2025: Invoke Pipeline Activity for Cross-Platform Calls
{
"name": "PL_ADF_Orchestrates_Fabric_Pipeline",
"activities": [
{
"name": "PrepareDataInADF",
"type": "Copy",
"typeProperties": {
"source": {
"type": "AzureSqlSource"
},
"sink": {
"type": "LakehouseTableSink"
}
}
},
{
"name": "InvokeFabricPipeline",
"type": "InvokePipeline",
"dependsOn": [
{
"activity": "PrepareDataInADF",
"dependencyConditions": ["Succeeded"]
}
],
"typeProperties": {
"workspaceId": "12345678-1234-1234-1234-123456789abc",
"pipelineId": "87654321-4321-4321-4321-cba987654321",
"waitOnCompletion": true,
"parameters": {
"processDate": "@pipeline().parameters.RunDate",
"environment": "production"
}
}
}
]
}
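With waitOnCompletion set to true, the ADF run blocks until the Fabric pipeline finishes and fails if it fails, mirroring the classic Execute Pipeline activity; set it to false for fire-and-forget orchestration.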
For Fabric Lakehouse:
- Add the data factory's managed identity or the service principal to the Fabric workspace (Workspace settings > Manage access) with at least Contributor access
- For service principals, ensure the tenant setting that allows service principals to use Fabric APIs is enabled
For Fabric Warehouse (T-SQL, run against the Warehouse after granting workspace access):
CREATE USER [your-adf-name] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [your-adf-name];
ALTER ROLE db_datawriter ADD MEMBER [your-adf-name];
App Registration Setup:
- Register an application in Microsoft Entra ID and create a client secret
- Store the secret in Azure Key Vault and reference it from the linked service, as shown earlier
- Grant the app registration access to the Fabric workspace
Staged Copy Configuration:
{
"enableStaging": true,
"stagingSettings": {
"linkedServiceName": {
"referenceName": "AzureBlobStorage",
"type": "LinkedServiceReference"
},
"path": "staging/fabric-loads",
"enableCompression": true
}
}
When to Stage:
- Large loads, where compressed staged files reduce bandwidth and load time
- Sources whose data is not directly compatible with the Warehouse's bulk load path
- Networks where the source cannot reach Fabric directly, so staging via Blob storage bridges the hop
Instead of:
ADLS Gen2 → [Copy Activity] → Fabric Lakehouse
Use:
ADLS Gen2 → [OneLake Shortcut] → Direct Access in Fabric
Benefits:
- Zero copy: no duplicated storage and no copy jobs to maintain
- Always current: Fabric reads the live ADLS Gen2 data
- Lower cost: no Copy activity execution or staging charges for that hop
Fabric uses capacity-based pricing. Monitor:
- Capacity Unit (CU) consumption, via the Microsoft Fabric Capacity Metrics app
- Throttling or queueing when the capacity is saturated, which slows ADF copies into Fabric
- Which pipelines and items consume the most CU, so heavy loads can be scheduled off-peak
Auto-Create Tables:
{
"sink": {
"type": "WarehouseSink",
"tableOption": "autoCreate" // Creates table if missing
}
}
Benefits:
- No manual DDL: the table is created on first load with a schema inferred from the source
- Simpler bootstrap for new environments and staging tables
Error Handling and Retry Pattern:
{
"activities": [
{
"name": "CopyToFabric",
"type": "Copy",
"policy": {
"retry": 2,
"retryIntervalInSeconds": 30,
"timeout": "0.12:00:00"
}
},
{
"name": "LogFailure",
"type": "WebActivity",
"dependsOn": [
{
"activity": "CopyToFabric",
"dependencyConditions": ["Failed"]
}
],
"typeProperties": {
"url": "@pipeline().parameters.LoggingEndpoint",
"method": "POST",
"body": {
"error": "@activity('CopyToFabric').error.message",
"pipeline": "@pipeline().Pipeline"
}
}
}
]
}
Error: "User does not have permission to access Fabric workspace"
Solution:
Error: "Unable to connect to endpoint"
Solution:
workspaceId and artifactId are correctError: "Column types do not match"
Solution:
tableOption: "autoCreate" for initial loadSymptoms: Slow copy performance to Fabric
Solutions:
parallelCopies (try 4-8)dataIntegrationUnits (8-32)This comprehensive guide enables seamless integration between Azure Data Factory and Microsoft Fabric's modern data platform capabilities.
This skill should be used when the user asks to "create an agent", "add an agent", "write a subagent", "agent frontmatter", "when to use description", "agent examples", "agent tools", "agent colors", "autonomous agent", or needs guidance on agent structure, system prompts, triggering conditions, or agent development best practices for Claude Code plugins.
This skill should be used when the user asks to "create a slash command", "add a command", "write a custom command", "define command arguments", "use command frontmatter", "organize commands", "create command with file references", "interactive command", "use AskUserQuestion in command", or needs guidance on slash command structure, YAML frontmatter fields, dynamic arguments, bash execution in commands, user interaction patterns, or command development best practices for Claude Code.
This skill should be used when the user asks to "create a hook", "add a PreToolUse/PostToolUse/Stop hook", "validate tool use", "implement prompt-based hooks", "use ${CLAUDE_PLUGIN_ROOT}", "set up event-driven automation", "block dangerous commands", or mentions hook events (PreToolUse, PostToolUse, Stop, SubagentStop, SessionStart, SessionEnd, UserPromptSubmit, PreCompact, Notification). Provides comprehensive guidance for creating and implementing Claude Code plugin hooks with focus on advanced prompt-based hooks API.