From the systems-design plugin.
Designs data pipeline architectures for batch ETL, streaming, or hybrid scenarios, producing tech-stack recommendations, ASCII diagrams, data-quality strategies, and cost analysis. Useful for real-time processing, BI reporting, or migrations.
Install: npx claudepluginhub melodic-software/claude-code-plugins --plugin systems-design

This skill is limited to using the following tools:
Design a data pipeline architecture based on requirements description.
Design data movement and transformation pipelines. Show how data flows between systems, how it is transformed, and where it is stored. Use when architecting data integrations or ETL processes.
Builds scalable data pipelines, modern data warehouses, and real-time streaming architectures using Apache Spark, dbt, Airflow, and cloud-native platforms. Use for pipeline design, analytics infrastructure, and modern data stacks.
/sd:data-flow <description>
description (required): Natural language description of data flow requirements
/sd:data-flow Real-time customer activity tracking from web and mobile to analytics dashboard
/sd:data-flow Batch ETL from 5 PostgreSQL databases to Snowflake for BI reporting
/sd:data-flow Event streaming from IoT sensors with 10ms latency requirement for anomaly detection
/sd:data-flow Migrate legacy Oracle data warehouse to cloud lakehouse architecture
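As a sketch of the kind of output to expect, the batch ETL example above might produce a flow diagram along these lines (illustrative only; the actual stack, stages, and layout come from the agent's design):

```
+----------------+     +-----------+     +------------+     +-----------+
| PostgreSQL x5  |     |  Airflow  |     | Snowflake  |     | BI tool   |
| (source DBs)   | --> | (batch    | --> | (warehouse)| --> | (reports) |
|                |     |  ETL)     |     |            |     |           |
+----------------+     +-----------+     +------------+     +-----------+
```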
1. Parse Requirements. Extract from the description:
2. Spawn Data Architect Agent. Use the data-architect agent to design the pipeline. The agent will:
3. Present Architecture. Display the design, including:
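The requirement-parsing step could be sketched as a small keyword-and-regex pass over the description. Everything below is hypothetical (the command's actual parsing logic is not shown in this listing); it only illustrates the kind of signals a parser might pull out:

```python
import re

def parse_requirements(description: str) -> dict:
    """Hypothetical sketch: pull pipeline hints from a natural-language description."""
    desc = description.lower()
    # Streaming keywords suggest an event-driven design; otherwise default to batch.
    mode = "streaming" if any(k in desc for k in ("real-time", "streaming", "event")) else "batch"
    # Look for an explicit latency requirement like "10ms" or "5 min".
    latency = re.search(r"(\d+)\s*(ms|sec|min|s)\b", desc)
    # Recognize a few known source-system keywords.
    sources = [s for s in ("postgresql", "oracle", "iot", "web", "mobile") if s in desc]
    return {
        "mode": mode,
        "latency": latency.group(0) if latency else None,
        "sources": sources,
    }
```

For example, `parse_requirements("Event streaming from IoT sensors with 10ms latency")` would classify the flow as streaming with a 10ms latency hint and `iot` as a source.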
The command may recommend:
The command produces a comprehensive design document with:
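To illustrate the cost-analysis component, a back-of-envelope storage estimate might look like the following. All names, rates, and retention assumptions here are hypothetical, not the command's actual cost model:

```python
def monthly_storage_cost(rows_per_day: int, bytes_per_row: int,
                         price_per_tb_month: float = 23.0) -> float:
    """Hypothetical estimate: cost of retaining 30 days of data at a flat $/TB-month rate."""
    terabytes = rows_per_day * bytes_per_row * 30 / 1e12
    return terabytes * price_per_tb_month
```

For instance, one billion 500-byte rows per day retained for 30 days is 15 TB, or $345/month at the assumed $23/TB rate.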