Design, deploy, and operate Spring Cloud Data Flow streams and tasks with app registration, stream DSL, task launch, schedules, platform accounts, and pipeline operations.
npx claudepluginhub ririnto/sinon --plugin spring
Use spring-cloud-data-flow for SCDF server and shell workflows, stream and task topology design, app registration, deployment properties, and runtime pipeline operations.
Use spring-integration for in-process integration flows inside one application.

The ordinary Spring Cloud Data Flow workflow is:
SCDF is primarily an external orchestration platform, not a business-app dependency. Custom apps normally depend on Spring Cloud Stream or Spring Cloud Task rather than an SCDF library.
The current stable SCDF server line is 2.11.5. Keep examples on the stable GA line unless the task explicitly targets a newer milestone or snapshot.
SCDF server and shell are operated outside the business application.
Custom stream or task apps use their own Spring Boot + Spring Cloud dependencies.
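As a sketch of that dependency split, a custom task app's pom.xml would declare the Spring Cloud Task starter rather than any SCDF server library (the fragment below is illustrative; align versions through your Spring Boot and Spring Cloud BOMs rather than pinning them here):

```xml
<!-- Illustrative dependency for a custom Spring Cloud Task app;
     the SCDF server itself is NOT a dependency of the business app. -->
<dependencies>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-task</artifactId>
    </dependency>
</dependencies>
```

A custom stream app would use the Spring Cloud Stream starter for its binder (RabbitMQ, Kafka) in the same way.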
dataflow:>app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-rabbit:3.2.1
dataflow:>app register --name log --type sink --uri maven://org.springframework.cloud.stream.app:log-sink-rabbit:3.2.1
dataflow:>stream create --name http-log --definition "http | log"
dataflow:>stream deploy --name http-log
dataflow:>task create --name import-customers --definition "import-task"
dataflow:>task launch --name import-customers
dataflow:>stream list
dataflow:>runtime apps
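Beyond the bare deploy, deployment properties control how the deployer runs each app in the stream. A hedged sketch of the property shape, following SCDF's `deployer.<app>.*` and `app.<app>.*` conventions (the specific keys and values below are illustrative, not required by this skill):

```properties
# Illustrative deployment properties for the http-log stream.
# deployer.* keys go to the platform deployer; app.* keys go to the app itself.
deployer.http.memory=1024m
deployer.log.count=1
app.http.server.port=9000
```

These can be supplied inline at deploy time, e.g. `stream deploy --name http-log --properties "deployer.http.memory=1024m"`, keeping the stream definition itself free of platform concerns.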
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class ImportTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(ImportTaskApplication.class, args);
    }

    @Bean
    CommandLineRunner importCustomers(CustomerImporter importer) {
        return args -> importer.run();
    }
}
This task-app shape assumes a Spring Cloud Task application and therefore keeps @EnableTask explicit instead of presenting the app as a plain Boot command-line process.
dataflow:>task create --name import-customers --definition "import-task"
dataflow:>task launch --name import-customers --arguments "--input=file:/data/customers.csv"
dataflow:>task execution list
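Arguments passed via `task launch --arguments` reach the task app as ordinary command-line arguments. A minimal plain-Java sketch of reading a `--key=value` argument such as `--input=file:/data/customers.csv` (no Spring involved; `TaskArgs` and `argValue` are illustrative helpers, not SCDF or Spring Cloud Task API):

```java
// Illustrative helper: extracts the value of a --name=value argument,
// mirroring how a task app might read "--input=file:/data/customers.csv".
public class TaskArgs {

    static String argValue(String[] args, String name) {
        String prefix = "--" + name + "=";
        for (String arg : args) {
            if (arg.startsWith(prefix)) {
                return arg.substring(prefix.length());
            }
        }
        return null; // argument not present
    }

    public static void main(String[] args) {
        String input = argValue(
                new String[] {"--input=file:/data/customers.csv"}, "input");
        System.out.println(input); // prints "file:/data/customers.csv"
    }
}
```

In a real task app this kind of value would typically be bound through `@ConfigurationProperties` or `@Value` instead of manual parsing; the sketch only shows that launch arguments arrive unchanged.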
Return these artifacts for the ordinary path:
- http | log
- --input=file:/data/customers.csv
- runtime apps
- task execution list