Authors, wires, and publishes Microsoft Fabric Eventstream real-time streaming topologies via Fabric Items REST API. Supports 25 sources (Event Hubs, IoT Hub, CDC, Kafka), 8 operators (Filter, Aggregate, Join, SQL), 4 destinations.
npx claudepluginhub microsoft/skills-for-fabric --plugin skills-for-fabric

This skill uses the workspace's default tool permissions.
> **Update Check — ONCE PER SESSION (mandatory)**
Lists, inspects, and monitors Microsoft Fabric Eventstream real-time event ingestion pipelines via Fabric Items REST API. Decodes base64 graph topologies to trace event flow, validates source/destination configs, retention (1-90 days), and throughput.
Manages MongoDB Atlas Stream Processing (ASP) workflows: provisions workspaces, configures connections (Kafka, Atlas clusters, S3, HTTPS, Lambda), operates processors, debugs diagnostics, and sizes tiers for streaming data.
The first time this skill is used in a session, run the check-updates skill before proceeding.
- GitHub Copilot CLI / VS Code: invoke the `check-updates` skill.
- Claude Code / Cowork / Cursor / Windsurf / Codex: compare the local vs remote package.json versions.
- Skip if the check was already performed earlier in this session.
CRITICAL NOTES
- To find workspace details (including its ID) from a workspace name: list all workspaces, then filter with JMESPath.
- To find item details (including its ID) from a workspace ID, item type, and item name: list all items of that type in the workspace, then filter with JMESPath.
- Eventstream ≠ Eventhouse. Eventstream is a real-time event ingestion and routing pipeline. For KQL database operations, use `eventhouse-authoring-cli` or `eventhouse-consumption-cli`.
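With `az rest`, the list-then-filter lookups above can run through the CLI's global JMESPath `--query` flag (e.g. `--query "value[?displayName=='My Workspace'].id | [0]" -o tsv` against the list-workspaces response). A minimal offline sketch of the same pattern, where `WORKSPACES` is a synthetic stand-in for the API response and the IDs and names are placeholders:

```shell
# Synthetic stand-in for the GET /v1/workspaces response body; in real use this
# comes from az rest as shown in COMMON-CLI.md. IDs/names are placeholders.
WORKSPACES='{"value":[{"id":"aaa-111","displayName":"Sales Analytics"},{"id":"bbb-222","displayName":"HR"}]}'

# Filter the list down to the single workspace whose displayName matches.
WORKSPACE_ID=$(echo "$WORKSPACES" | python3 -c '
import json, sys
workspaces = json.load(sys.stdin)["value"]
print(next(w["id"] for w in workspaces if w["displayName"] == "Sales Analytics"))
')
echo "$WORKSPACE_ID"   # aaa-111
```

The same filter expressed server-side is `value[?displayName=='Sales Analytics'].id | [0]` passed to `--query`.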
| Task | Reference | Notes |
|---|---|---|
| Finding Workspaces and Items in Fabric | COMMON-CLI.md § Finding Workspaces and Items in Fabric | Mandatory — READ link first [needed for finding workspace id by its name or item id by its name, item type, and workspace id] |
| Fabric Topology & Key Concepts | COMMON-CORE.md § Fabric Topology & Key Concepts | |
| Environment URLs | COMMON-CORE.md § Environment URLs | |
| Authentication & Token Acquisition | COMMON-CORE.md § Authentication & Token Acquisition | Wrong audience = 401; read before any auth issue |
| Core Control-Plane REST APIs | COMMON-CORE.md § Core Control-Plane REST APIs | Includes pagination, LRO polling, and rate-limiting patterns |
| Gotchas, Best Practices & Troubleshooting | COMMON-CORE.md § Gotchas, Best Practices & Troubleshooting | |
| Tool Selection Rationale | COMMON-CLI.md § Tool Selection Rationale | |
| Authentication Recipes | COMMON-CLI.md § Authentication Recipes | az login flows and token acquisition |
| Fabric Control-Plane API via az rest | COMMON-CLI.md § Fabric Control-Plane API via az rest | Always pass --resource; includes pagination and LRO helpers |
| Gotchas & Troubleshooting (CLI-Specific) | COMMON-CLI.md § Gotchas & Troubleshooting (CLI-Specific) | az rest audience, shell escaping, token expiry |
| Quick Reference | COMMON-CLI.md § Quick Reference | az rest template + token audience/tool matrix |
| Eventstream Resource Model | EVENTSTREAM-AUTHORING-CORE.md § Eventstream Resource Model | Read first — graph-based topology with sources, operators, streams, destinations |
| Source Configuration | EVENTSTREAM-AUTHORING-CORE.md § Source Configuration | 25 API-supported source types with per-source properties |
| Transformation Operators | EVENTSTREAM-AUTHORING-CORE.md § Transformation Operators | 8 operator types: Filter, Aggregate, GroupBy, Join, ManageFields, Union, Expand, SQL |
| Destination Configuration | EVENTSTREAM-AUTHORING-CORE.md § Destination Configuration | 4 API-supported destination types with node schema |
| Stream Types | EVENTSTREAM-AUTHORING-CORE.md § Stream Types | DefaultStream (auto) and DerivedStream (from operators) |
| Eventstream Lifecycle (REST API) | EVENTSTREAM-AUTHORING-CORE.md § Eventstream Lifecycle (REST API) | CRUD + Definition endpoints |
| Item Definitions and Deployment | EVENTSTREAM-AUTHORING-CORE.md § Item Definitions and Deployment | Base64 encoding pattern for eventstream.json |
| Gotchas and Limitations | EVENTSTREAM-AUTHORING-CORE.md § Gotchas and Limitations | Max 11 custom endpoints, base64 encoding, naming constraints |
| Create an Eventstream | SKILL.md § Create an Eventstream | |
| Deploy Full Topology | SKILL.md § Deploy Full Topology | End-to-end: build topology JSON → base64 encode → submit definition |
| Update Eventstream Topology | SKILL.md § Update Eventstream Topology | |
| Delete an Eventstream | SKILL.md § Delete an Eventstream | |
| Gotchas, Rules, Troubleshooting | SKILL.md § Gotchas, Rules, Troubleshooting | MUST DO / AVOID / PREFER checklists |
Create an empty Eventstream item, then configure it with sources, destinations, and operators via the definition API.
```shell
az rest --method POST \
  --url "https://api.fabric.microsoft.com/v1/workspaces/${WORKSPACE_ID}/eventstreams" \
  --resource "https://api.fabric.microsoft.com" \
  --headers "Content-Type=application/json" \
  --body '{"displayName": "my-eventstream", "description": "IoT sensor pipeline"}'
```
Save the returned `id` as `EVENTSTREAM_ID`.
Construct the eventstream.json topology with sources, streams, operators, and destinations. Each node references its upstream via inputNodes.
Prefer building the JSON programmatically to avoid serialization errors. Key rules:
- Each downstream node references its upstream nodes via `inputNodes[].name`.
- Sources declare their `inputSerialization` in properties.

Base64-encode the topology JSON and submit it via the definition API. See Item Definitions and Deployment for the full payload structure.
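As a sketch of the programmatic approach, assuming python3 is available: the node names below are illustrative placeholders, and the per-node property schemas come from EVENTSTREAM-AUTHORING-CORE.md, not from this example.

```shell
# Build eventstream.json with python3 so quoting and escaping are handled for you.
# "SampleSource" and its empty properties are placeholders; consult
# EVENTSTREAM-AUTHORING-CORE.md for the exact schema of each node type.
TOPOLOGY_JSON=$(python3 - <<'EOF'
import json
topology = {
    "compatibilityLevel": "1.0",
    "sources": [
        {"name": "SampleSource", "type": "SampleData", "properties": {}}
    ],
    "streams": [
        {"name": "SampleSource-stream", "type": "DefaultStream",
         "properties": {}, "inputNodes": [{"name": "SampleSource"}]}
    ],
    "operators": [],
    "destinations": [],
}
print(json.dumps(topology))
EOF
)

# Fail fast if the JSON is malformed before it ever reaches the API.
echo "$TOPOLOGY_JSON" | python3 -m json.tool >/dev/null && echo "topology OK"
```

Building the dict in code guarantees valid serialization; hand-assembled shell strings are where quoting errors creep in.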
For deploying a complete Eventstream with topology in a single API call, use the Create Item with Definition endpoint:
```shell
# 1. Build eventstream.json content (topology)
TOPOLOGY_JSON='{"compatibilityLevel":"1.0","sources":[...],"streams":[...],"operators":[...],"destinations":[...]}'

# 2. Build eventstreamProperties.json (optional — controls retention and throughput)
PROPERTIES_JSON='{"retentionTimeInDays":1,"eventThroughputLevel":"Low","schemaMode":"Inferred"}'

# 3. Base64-encode both (no line wraps)
TOPOLOGY_B64=$(echo -n "$TOPOLOGY_JSON" | base64 -w 0)
PROPERTIES_B64=$(echo -n "$PROPERTIES_JSON" | base64 -w 0)

# 4. Submit via Items API
az rest --method POST \
  --url "https://api.fabric.microsoft.com/v1/workspaces/${WORKSPACE_ID}/items" \
  --resource "https://api.fabric.microsoft.com" \
  --headers "Content-Type=application/json" \
  --body "{
    \"displayName\": \"my-eventstream\",
    \"type\": \"Eventstream\",
    \"definition\": {
      \"parts\": [
        {
          \"path\": \"eventstream.json\",
          \"payload\": \"${TOPOLOGY_B64}\",
          \"payloadType\": \"InlineBase64\"
        },
        {
          \"path\": \"eventstreamProperties.json\",
          \"payload\": \"${PROPERTIES_B64}\",
          \"payloadType\": \"InlineBase64\"
        }
      ]
    }
  }"
```
Note: If `eventstreamProperties.json` is omitted, the API applies defaults: `retentionTimeInDays: 1`, `eventThroughputLevel: "Low"`, `schemaMode: "Inferred"`. Include it explicitly to control retention (1–90 days) and throughput.
On Windows (PowerShell), use `[Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($json))` for base64 encoding.
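Before submitting, it can be worth confirming the payload survives the encode/decode round trip. A quick local check (`printf '%s'` sidesteps `echo -n` portability quirks; `base64 -w 0` is the GNU form — on macOS, plain `base64` already emits a single line):

```shell
# Round-trip check: encode, decode, and compare against the original JSON.
PROPERTIES_JSON='{"retentionTimeInDays":30,"eventThroughputLevel":"Low","schemaMode":"Inferred"}'
PROPERTIES_B64=$(printf '%s' "$PROPERTIES_JSON" | base64 -w 0)
DECODED=$(printf '%s' "$PROPERTIES_B64" | base64 -d)
[ "$DECODED" = "$PROPERTIES_JSON" ] && echo "round-trip OK"
```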
To update an existing topology:
- Fetch the current definition: `GET /v1/workspaces/{wsId}/eventstreams/{esId}/definition`
- Decode the `eventstream.json` payload from base64 and edit the topology
- Submit the updated definition: `PUT /v1/workspaces/{wsId}/eventstreams/{esId}/definition`

The Update Definition API returns 202 Accepted for long-running operations. Poll the Location header URL until completion.
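A local sketch of the decode step: `RESPONSE` below is a synthetic stand-in for a GetDefinition body (built in-place so the example is self-contained, mirroring the `definition.parts` shape used in the create call above).

```shell
# Build a synthetic GetDefinition response containing a tiny topology payload.
PAYLOAD=$(printf '%s' '{"compatibilityLevel":"1.0"}' | base64 -w 0)
RESPONSE="{\"definition\":{\"parts\":[{\"path\":\"eventstream.json\",\"payload\":\"$PAYLOAD\",\"payloadType\":\"InlineBase64\"}]}}"

# Pick out the eventstream.json part and decode it back to editable JSON.
TOPOLOGY=$(echo "$RESPONSE" | python3 -c '
import base64, json, sys
parts = json.load(sys.stdin)["definition"]["parts"]
payload = next(p["payload"] for p in parts if p["path"] == "eventstream.json")
print(base64.b64decode(payload).decode())
')
echo "$TOPOLOGY"   # {"compatibilityLevel":"1.0"}
```

After editing `$TOPOLOGY`, re-encode it and submit as in the deploy example.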
```shell
az rest --method DELETE \
  --url "https://api.fabric.microsoft.com/v1/workspaces/${WORKSPACE_ID}/eventstreams/${EVENTSTREAM_ID}" \
  --resource "https://api.fabric.microsoft.com"
```
Returns 200 OK on success.
- MUST DO: validate the `eventstream.json` payload before submitting definitions
- MUST DO: pass `--resource https://api.fabric.microsoft.com` with `az rest` calls
- MUST DO: poll the Location header URL when a call returns `202 Accepted`
- PREFER: the `SampleData` source type for testing and prototyping
- PREFER: setting `retentionTimeInDays` explicitly rather than relying on defaults