From magic-powers
Use when building Apache Beam pipelines on Google Cloud Dataflow — batch ETL, streaming, windowing, triggers, or Dataflow vs Dataproc decisions. Covers GCP-PDE domain: Ingest and process data (~25-30%).
```shell
npx claudepluginhub kienbui1995/magic-powers --plugin magic-powers
```

This skill uses the workspace's default tool permissions.
- Building ETL pipelines (batch or streaming) on GCP
| Factor | Choose Dataflow | Choose Dataproc |
|---|---|---|
| Runtime | Apache Beam pipelines | Spark/Hadoop ecosystem |
| Management | Fully managed, serverless | Cluster to manage (or autoscaling) |
| Streaming | Native (Pub/Sub → BQ) | Spark Streaming (more complex) |
| Existing code | Greenfield | Migrating existing Spark jobs |
| Cost model | Per vCPU/memory/hour | Cluster uptime |
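The "Native (Pub/Sub → BQ)" streaming row above can be sketched as a minimal Beam pipeline in the Java SDK. This is a sketch under assumptions, not a definitive implementation: the project, topic, and table names are hypothetical placeholders, and it assumes the BigQuery table already exists with a single `payload` column. Pass `--runner=DataflowRunner` (plus `--project` and `--region`) to run on Dataflow, or omit it to test locally with the `DirectRunner`.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubsubToBigQuery {
  public static void main(String[] args) {
    // Runner is chosen via flags, e.g. --runner=DataflowRunner --project=... --region=...
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("ReadFromPubsub", PubsubIO.readStrings()
            .fromTopic("projects/my-project/topics/events"))      // hypothetical topic
        .apply("ToTableRow", MapElements
            .into(TypeDescriptor.of(TableRow.class))
            .via(msg -> new TableRow().set("payload", msg)))      // assumes a "payload" column
        .apply("WriteToBQ", BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.events")                   // hypothetical table
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));

    p.run();
  }
}
```

The same code runs unchanged on either runner, which is the core of the Dataflow column in the table: the pipeline is portable Beam, and only the runner flag changes between local testing and managed execution.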
Core abstractions:
- Allowed lateness: `.withAllowedLateness(Duration.standardMinutes(10))` to capture late-arriving data
- Trigger combinators: `.repeatedly()`, `.orFinally()`
- Runners: `DirectRunner` = local testing; `DataflowRunner` = GCP execution
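The lateness and trigger pieces above combine inside a single `Window` transform. A minimal sketch in the Beam Java SDK, assuming a hypothetical Pub/Sub subscription and one-minute fixed windows; the subscription name is a placeholder:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.windowing.AfterProcessingTime;
import org.apache.beam.sdk.transforms.windowing.AfterWatermark;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class WindowedCounts {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    PCollection<String> events =
        p.apply("ReadEvents", PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-sub")); // hypothetical

    events
        .apply("WindowAndTrigger", Window.<String>into(
                FixedWindows.of(Duration.standardMinutes(1)))
            // Fire once at the watermark, then again for each pane of late data
            .triggering(AfterWatermark.pastEndOfWindow()
                .withLateFirings(AfterProcessingTime.pastFirstElementInPane()))
            // Accept elements arriving up to 10 minutes behind the watermark
            .withAllowedLateness(Duration.standardMinutes(10))
            // Late firings re-emit the cumulative count, not just the delta
            .accumulatingFiredPanes())
        .apply("CountPerWindow", Count.perElement());

    p.run();
  }
}
```

Swapping `.accumulatingFiredPanes()` for `.discardingFiredPanes()` changes late firings from cumulative counts to deltas, which matters when the sink appends rather than overwrites.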