From jeremylongshore/claude-code-plugins-plus-skills
Builds Beam pipeline operations for data pipelines, covering ETL, data transformation, workflow orchestration, and stream processing. Activates on 'beam pipeline builder' phrases.
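As a sketch of the kind of per-record ETL transform such a pipeline applies, here is the logic one might wrap in a Beam `beam.Map` or `ParDo`, shown in plain Python; the record shape and field names (`id`, `amount`, `ts`) are illustrative assumptions, not part of the skill.

```python
# Illustrative per-record ETL transform: the kind of function a Beam
# pipeline would apply with beam.Map(parse_and_clean). The field names
# ("id", "amount", "ts") are hypothetical examples.
from datetime import datetime, timezone

def parse_and_clean(line: str) -> dict:
    """Parse one CSV line 'id,amount,unix_ts' into a cleaned record."""
    record_id, amount, ts = line.strip().split(",")
    return {
        "id": record_id,
        "amount": round(float(amount), 2),  # normalize the numeric value
        "ts": datetime.fromtimestamp(int(ts), tz=timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    print(parse_and_clean("42,19.999,1700000000"))
```

The same function can be unit-tested outside the pipeline, which is one reason Beam encourages keeping transform logic in small pure functions.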
npx claudepluginhub jeremylongshore/claude-code-plugins-plus-skills --plugin langchain-py-pack

This skill is limited to using the following tools:
This skill provides automated assistance for beam pipeline builder tasks within the Data Pipelines domain.
Guides writing, packaging, and executing Apache Beam pipelines on GCP Dataflow, including Flex Templates for Java, Python, and Go projects.
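As a hedged sketch of the launch step, a Flex Template run command could be assembled programmatically like this; the bucket, job name, and pipeline parameters are placeholders, and the flags follow the public `gcloud dataflow flex-template run` CLI.

```python
# Assemble (but do not execute) a `gcloud dataflow flex-template run`
# command. Bucket paths, job name, and parameters are placeholders.
def flex_template_run_cmd(job_name, spec_gcs_path, region, parameters):
    params = ",".join(f"{k}={v}" for k, v in parameters.items())
    return [
        "gcloud", "dataflow", "flex-template", "run", job_name,
        f"--template-file-gcs-location={spec_gcs_path}",
        f"--region={region}",
        f"--parameters={params}",
    ]

cmd = flex_template_run_cmd(
    "wordcount-job",
    "gs://my-bucket/templates/wordcount.json",  # placeholder bucket
    "us-central1",
    {"input": "gs://my-bucket/in.txt", "output": "gs://my-bucket/out"},
)
print(" ".join(cmd))
```

Building the argument list in code (rather than a shell string) makes it easy to pass to `subprocess.run` without quoting bugs.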
Generates code, configurations, and guidance for Spark job creation in data pipelines, covering ETL, transformations, orchestration, and stream processing.
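To show the shape of the transformation such a Spark job performs, here is the group-and-sum aggregation that PySpark would express with `reduceByKey`, sketched in plain Python; the event data is invented for illustration.

```python
# Plain-Python sketch of a reduce-by-key aggregation -- the pattern a
# Spark job distributes across executors. The event data is illustrative.
from collections import defaultdict

def sum_by_key(pairs):
    """Roughly what rdd.reduceByKey(lambda a, b: a + b).collect() yields."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

events = [("clicks", 3), ("views", 10), ("clicks", 2)]
print(sum_by_key(events))  # per-key totals
```

In an actual Spark job the same logic runs in parallel per partition, with a shuffle bringing equal keys together before the final reduction.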
Builds production Apache Airflow DAGs with best practices for operators, sensors, testing, and deployment. Use for data pipelines, workflow orchestration, or batch jobs.
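As a minimal sketch of what such a DAG file looks like (assuming Airflow 2.4+ is installed), a daily batch job might be declared as follows; the DAG id, schedule, and task command are placeholders, not output of the skill.

```python
# Hypothetical daily DAG; dag_id, schedule, and command are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_etl",                  # placeholder id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,                       # avoid backfilling old runs
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo extracting",  # placeholder command
    )
```

This fragment only defines the workflow; the scheduler picks it up from the DAGs folder, so no test harness is shown here.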
This skill activates automatically when your request includes 'beam pipeline builder' phrases.
Example: Basic Usage
Request: "Help me with beam pipeline builder"
Result: Provides step-by-step guidance and generates appropriate configurations
| Error | Cause | Solution |
|---|---|---|
| Configuration invalid | Missing required fields | Check documentation for required parameters |
| Tool not found | Dependency not installed | Install required tools per prerequisites |
| Permission denied | Insufficient access | Verify credentials and permissions |
Part of the Data Pipelines skill category. Tags: etl, airflow, spark, streaming, data-engineering