Build and operate Spring Batch jobs with job and step configuration, chunk or tasklet processing, job parameters, restartability, reader and writer choices, scaling patterns, and batch-focused tests.
```shell
npx claudepluginhub ririnto/sinon --plugin spring
```

This skill uses the workspace's default tool permissions.
Use this skill when building or operating Spring Batch jobs with job and step configuration, chunk or tasklet processing, job parameters, restartability, reader or writer choices, scaling patterns, and batch-focused tests.
- references/fault-tolerance-and-transaction-tuning.md
- references/integration-driven-launch.md
- references/job-infrastructure-launch-and-recovery.md
- references/observability-and-monitoring.md
- references/readers-writers-and-item-streams.md
- references/scaling-partitioning-and-remote-execution.md
- references/spring-batch-6-migration.md
- references/step-flow-and-listeners.md
- references/testing-batch-jobs-and-step-scope.md
Use spring-batch for scheduled or launched batch jobs, chunk and tasklet steps, restart semantics, metadata-backed execution, and large-scale record processing.
Prefer spring-integration for general message-driven integration flows rather than batch job orchestration, and spring-data when the main task is repository design rather than batch orchestration.

The ordinary Spring Batch job uses plain singleton beans; apply @StepScope or @JobScope only when parameters or execution context must resolve at runtime. Use the Boot starter for application code and the Batch test module for job and step tests.
For the latest released line, Spring Batch itself is 6.0.3. The stable Spring Boot 3.4.x line still manages Spring Batch 5.2.x, so Batch 6-specific APIs require either a direct Spring Batch 6.x path or a Spring Boot line that has moved to Batch 6.
```xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-batch</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.batch</groupId>
        <artifactId>spring-batch-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>
```
Treat the Boot-managed path and the standalone current-line path as different compatibility branches. Verify the actual Batch line before copying infrastructure or migration examples that depend on Batch 6 behavior.
Keep the batch runtime vocabulary explicit:
| Type | Role |
|---|---|
| JobInstance | one logical run identity defined by the identifying job parameters |
| JobExecution | one concrete execution attempt of a job instance |
| JobParameters | launch parameters that define identity or operational knobs |
| ExecutionContext | restart state persisted across step or job executions |
| JobRepository | metadata store for instances, executions, and step state |
| JobOperator | operational control surface for launch, stop, restart, and recovery |
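As a sketch of how parameter identity works in practice (the parameter names here are illustrative, and imports use the Batch 5 package layout; Spring Batch 6 reorganizes core packages), the boolean flag on JobParametersBuilder marks a parameter as identifying or not:

```java
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;

public class ParameterIdentityExample {
    public static void main(String[] args) {
        JobParameters params = new JobParametersBuilder()
                // identifying = true: part of the JobInstance identity;
                // relaunching with the same identifying set targets the same instance
                .addString("input", "/data/customers.csv", true)
                // identifying = false: operational knob, ignored for identity
                .addString("requestedBy", "ops", false)
                .toJobParameters();
        System.out.println(params.getString("input"));
    }
}
```

Only the identifying subset decides whether a launch is a restart of an existing JobInstance or a brand-new one.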
The minimum Spring Batch model is Job -> Step -> chunk or tasklet.
On Spring Boot's Batch 5.2.x path (current Boot 3.x), @EnableBatchProcessing alone provides the framework-managed JobRepository and transaction manager backed by the Boot DataSource:
```java
@Configuration
@EnableBatchProcessing
class BatchInfrastructureConfiguration {
}
```
On Spring Batch 6, @EnableBatchProcessing no longer assumes a JDBC-backed store. It configures the common batch infrastructure and defaults to a ResourcelessJobRepository plus ResourcelessTransactionManager (in-memory, non-persistent). Opt into a persistent backend explicitly with one of the store-specific annotations:
```java
// JDBC-backed metadata store
@Configuration
@EnableBatchProcessing
@EnableJdbcJobRepository
class BatchInfrastructureConfiguration {
}
```

```java
// MongoDB-backed metadata store
@Configuration
@EnableBatchProcessing
@EnableMongoJobRepository
class BatchInfrastructureConfiguration {
}
```
@EnableJdbcJobRepository and @EnableMongoJobRepository attributes (dataSourceRef, transactionManagerRef, tablePrefix, etc.) are all optional; add them only when overriding the defaults. Change repository strategy only when operations, scale, or platform constraints require it. Open the infrastructure reference before adopting Batch 6-specific migration behavior beyond these annotations.
```java
@Configuration
class ImportJobConfiguration {

    @Bean
    Job importJob(JobRepository repository, Step importStep) {
        return new JobBuilder("importJob", repository)
                .start(importStep)
                .build();
    }

    @Bean
    Step importStep(JobRepository repository, PlatformTransactionManager tx,
            ItemReader<CustomerInput> reader, ItemProcessor<CustomerInput, Customer> processor,
            ItemWriter<Customer> writer) {
        return new StepBuilder("importStep", repository)
                .<CustomerInput, Customer>chunk(100, tx)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }
}
```
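For work that is not item-oriented (cleanup, archiving, a single bulk statement), a tasklet step is the alternative. A minimal sketch, with the step and bean names assumed:

```java
@Bean
Step archiveStep(JobRepository repository, PlatformTransactionManager tx) {
    return new StepBuilder("archiveStep", repository)
            .tasklet((contribution, chunkContext) -> {
                // one unit of work per invocation; return FINISHED when done,
                // or CONTINUABLE to be invoked again in a new transaction
                return RepeatStatus.FINISHED;
            }, tx)
            .build();
}
```

Each tasklet invocation runs in its own transaction, so long-running work should be split into repeatable units rather than done in one call.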
The ordinary item-oriented path is reader + optional processor + writer.
Keep transformation and validation logic behind the ItemProcessor seam. Use late binding only when runtime parameters or execution context must resolve at step creation time.
```java
@Bean
@StepScope
FlatFileItemReader<CustomerInput> customerReader(@Value("#{jobParameters['input']}") Resource input) {
    return new FlatFileItemReaderBuilder<CustomerInput>()
            .name("customerReader")
            .resource(input)
            .delimited()
            .names("email", "name")
            .targetType(CustomerInput.class)
            .build();
}
```
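A processor behind that seam might validate and map the input type to the domain type. This is a sketch, assuming CustomerInput and Customer are records with email and name components:

```java
@Bean
ItemProcessor<CustomerInput, Customer> customerProcessor() {
    return input -> {
        // Returning null filters the item out of the chunk entirely.
        if (input.email() == null || input.email().isBlank()) {
            return null;
        }
        return new Customer(input.email().toLowerCase(), input.name());
    };
}
```

Filtering via null is counted separately from skips in step metadata, which keeps validation rejects distinct from fault-tolerance events.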
Make restart behavior explicit before tuning performance.
- Decide which job parameters identify the JobInstance and which are only operational knobs.
- Keep restart state in the ExecutionContext or in restart-safe ItemStream implementations.
- Apply skip and retry policies through the fault-tolerant step builder:

```java
.faultTolerant()
.skipLimit(10)
.skip(FlatFileParseException.class)
.retryLimit(3)
.retry(DeadlockLoserDataAccessException.class)
```
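In context, that chain sits on the step builder after the writer. A sketch reusing the import step's collaborators:

```java
@Bean
Step importStep(JobRepository repository, PlatformTransactionManager tx,
        ItemReader<CustomerInput> reader, ItemProcessor<CustomerInput, Customer> processor,
        ItemWriter<Customer> writer) {
    return new StepBuilder("importStep", repository)
            .<CustomerInput, Customer>chunk(100, tx)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .faultTolerant()                          // switches to the fault-tolerant builder
            .skipLimit(10)                            // abort the step past 10 skips
            .skip(FlatFileParseException.class)       // bad lines are skippable
            .retryLimit(3)
            .retry(DeadlockLoserDataAccessException.class) // transient DB contention is retryable
            .build();
}
```

Keep the skip and retry classes narrow; a broad Exception.class policy hides real failures inside "successful" executions.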
Choose the smallest scaling model that solves throughput before reaching for remote patterns.
| Need | Start here |
|---|---|
| one step is sufficient | single-threaded chunk or tasklet step |
| one job has independent branches | parallel flows |
| one step needs more local throughput | multithreaded step |
| input can be split into isolated slices | partitioning |
| work must cross process boundaries | remote chunking or remote step execution |
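For the multithreaded-step row, the smallest change is a TaskExecutor on the step; the reader must then be thread-safe (for example, wrapped in a SynchronizedItemStreamReader), and restartability of reader state is generally lost. A sketch:

```java
@Bean
Step concurrentImportStep(JobRepository repository, PlatformTransactionManager tx,
        ItemReader<CustomerInput> reader, ItemWriter<Customer> writer) {
    return new StepBuilder("concurrentImportStep", repository)
            .<CustomerInput, Customer>chunk(100, tx)
            .reader(reader)   // must be thread-safe under concurrent chunk processing
            .writer(writer)
            .taskExecutor(new SimpleAsyncTaskExecutor("import-"))
            .build();
}
```

Measure throughput on the single-threaded step first; only the comparison tells you whether the bottleneck is CPU, I/O, or the database.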
Open the scaling reference only after the single-step path is correct and measured.
Start with one end-to-end job test and one restart or failure-path test.
```java
@SpringBatchTest
@SpringJUnitConfig(ImportJobConfiguration.class)
class ImportJobTests {

    @Autowired
    private JobOperatorTestUtils jobOperatorTestUtils;

    @Test
    void jobCompletes(@Autowired Job importJob) throws Exception {
        jobOperatorTestUtils.setJob(importJob);
        JobExecution execution = jobOperatorTestUtils.startJob(new JobParametersBuilder()
                .addString("input", "classpath:/customers.csv")
                .toJobParameters());
        assertEquals("COMPLETED", execution.getExitStatus().getExitCode());
    }
}
```
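Step-scoped components such as the customerReader above can also be exercised without launching a job, using the step-scope utilities from spring-batch-test. A sketch (the test class context and parameter value are assumed):

```java
@Test
void readerParsesRecords(@Autowired FlatFileItemReader<CustomerInput> customerReader) throws Exception {
    // Fabricate a StepExecution carrying the job parameters the @StepScope bean expects.
    StepExecution stepExecution = MetaDataInstanceFactory.createStepExecution(
            new JobParametersBuilder()
                    .addString("input", "classpath:/customers.csv")
                    .toJobParameters());

    // Run the callable inside a simulated step scope so late binding resolves.
    int count = StepScopeTestUtils.doInStepScope(stepExecution, () -> {
        customerReader.open(stepExecution.getExecutionContext());
        int read = 0;
        while (customerReader.read() != null) {
            read++;
        }
        customerReader.close();
        return read;
    });
    assertTrue(count > 0);
}
```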
- Open the job infrastructure reference for JobRepository, JobOperator, parameter identity, restart versus rerun, metadata, recovery, graceful shutdown, or operational control.
- Open the readers and writers reference for ItemStream or restart-safe file, database, JSON, XML, or messaging pipelines.
- Open the testing reference for spring-batch-test, scoped component tests, failure-path assertions, restart tests, or metadata-driven test setup.