Spring Batch: Introduction, Architecture, Core Interfaces, and Practical Implementation Guide
This article provides a comprehensive overview of Spring Batch: its purpose, typical business scenarios, core components (JobRepository, JobLauncher, Job, Step), and core interfaces (ItemReader, ItemProcessor, ItemWriter). It then walks through detailed Spring Boot code examples for configuration, multi-step jobs, parallel execution, conditional decisions, nested jobs, reading, processing, and writing data, and scheduling.
Spring Batch is a lightweight, comprehensive batch‑processing framework built on Spring, designed for robust enterprise batch jobs but not a scheduler itself; it integrates with external schedulers like Quartz.
Business scenarios supported by Spring Batch include:
Periodic batch submission
Concurrent batch processing
Stage‑wise message‑driven processing
Massively parallel batch jobs
Manual or scheduled restarts after failures
Ordered step execution (workflow‑driven)
Skipping records (e.g., on rollback)
Full‑batch transactions for small batches or existing stored procedures
Core architecture components (from the official documentation):
JobRepository: provides persistence for Job, JobInstance, and Step metadata.
JobLauncher: interface used to launch a Job with a given set of JobParameters.
Job: encapsulates the entire batch process.
Step: represents an independent phase of a job.
Core interfaces:
ItemReader: reads a chunk of items for a Step.
ItemProcessor: processes each item.
ItemWriter: writes a chunk of processed items.
The typical flow is ItemReader → ItemProcessor → ItemWriter, and a Job contains one or more Steps.
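To make the chunk-oriented flow concrete before diving into Spring Batch itself, here is a framework-free Java sketch of the loop a Step runs: read items one at a time, process each, and write them in chunks. The names (`ChunkLoopSketch`, `runChunks`) are illustrative, not Spring Batch API.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Function;

// Framework-free sketch of the chunk-oriented loop a Step runs:
// read until the chunk is full, process each item, then write the whole chunk.
public class ChunkLoopSketch {

    static <I, O> List<O> runChunks(Iterator<I> reader,
                                    Function<I, O> processor,
                                    int chunkSize,
                                    List<List<O>> writtenChunks) {
        List<O> all = new ArrayList<>();
        List<O> chunk = new ArrayList<>();
        while (reader.hasNext()) {
            chunk.add(processor.apply(reader.next())); // process each item as it is read
            if (chunk.size() == chunkSize) {           // chunk full: "write" it in one go
                writtenChunks.add(new ArrayList<>(chunk));
                all.addAll(chunk);
                chunk.clear();
            }
        }
        if (!chunk.isEmpty()) {                        // flush the final partial chunk
            writtenChunks.add(new ArrayList<>(chunk));
            all.addAll(chunk);
        }
        return all;
    }

    public static void main(String[] args) {
        List<List<String>> chunks = new ArrayList<>();
        List<String> out = runChunks(List.of("a", "b", "c").iterator(),
                String::toUpperCase, 2, chunks);
        System.out.println(chunks); // [[A, B], [C]]
        System.out.println(out);    // [A, B, C]
    }
}
```

In real Spring Batch the writes are also transaction boundaries, which is why the chunk size below (`chunk(2)`) directly controls commit frequency.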
1. Adding Spring Batch to a Spring Boot project (pom.xml):
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.5.RELEASE</version>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-batch</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-validation</artifactId>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
</dependencies>
Enable batch processing in the main class:
@SpringBootApplication
@EnableBatchProcessing
public class SpringBatchStartApplication {
public static void main(String[] args) {
SpringApplication.run(SpringBatchStartApplication.class, args);
}
}
2. Defining a simple job (FirstJobDemo):
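Spring Batch records Job, JobInstance, and Step metadata in a database, so the examples in this and later sections assume datasource settings along these lines (a sketch: the URL, credentials, and schema name are placeholders to adapt):

```properties
spring.datasource.url=jdbc:mysql://localhost:3306/spring_batch?useSSL=false&serverTimezone=UTC
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
# create the Spring Batch metadata tables on startup (Spring Boot 2.x property)
spring.batch.initialize-schema=always
```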
@Component
public class FirstJobDemo {
@Autowired private JobBuilderFactory jobBuilderFactory;
@Autowired private StepBuilderFactory stepBuilderFactory;
@Bean
public Job firstJob() {
return jobBuilderFactory.get("firstJob")
.start(step())
.build();
}
private Step step() {
return stepBuilderFactory.get("step")
.tasklet((contribution, chunkContext) -> {
System.out.println("Executing step....");
return RepeatStatus.FINISHED;
}).build();
}
}
3. Multi‑step job with flow control:
@Bean
public Job multiStepJob() {
return jobBuilderFactory.get("multiStepJob2")
.start(step1())
.on(ExitStatus.COMPLETED.getExitCode()).to(step2())
.from(step2()).on(ExitStatus.COMPLETED.getExitCode()).to(step3())
.from(step3()).end()
.build();
}
private Step step1() { /* tasklet prints "Executing step one..." */ }
private Step step2() { /* tasklet prints "Executing step two..." */ }
private Step step3() { /* tasklet prints "Executing step three..." */ }
4. Parallel execution using split:
@Bean
public Job splitJob() {
return jobBuilderFactory.get("splitJob")
.start(flow1())
.split(new SimpleAsyncTaskExecutor()).add(flow2())
.end()
.build();
}
private Flow flow1() { return new FlowBuilder<Flow>("flow1").start(step1()).next(step2()).build(); }
private Flow flow2() { return new FlowBuilder<Flow>("flow2").start(step3()).build(); }
5. Decision making with a JobExecutionDecider:
public class MyDecider implements JobExecutionDecider {
@Override
public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
DayOfWeek day = LocalDate.now().getDayOfWeek();
return (day == DayOfWeek.SATURDAY || day == DayOfWeek.SUNDAY)
? new FlowExecutionStatus("weekend")
: new FlowExecutionStatus("workingDay");
}
}
Job configuration using the decider routes to different steps based on the returned status.
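A hedged sketch of that wiring (the step names `startStep`, `weekendStep`, and `workingDayStep` are assumptions, not from the original article):

```java
// Sketch: routing on the FlowExecutionStatus values returned by MyDecider.
@Bean
public Job deciderJob() {
    JobExecutionDecider decider = new MyDecider();
    return jobBuilderFactory.get("deciderJob")
            .start(startStep())
            .next(decider)                                        // evaluated after startStep
            .from(decider).on("weekend").to(weekendStep())        // matches FlowExecutionStatus("weekend")
            .from(decider).on("workingDay").to(workingDayStep())  // matches FlowExecutionStatus("workingDay")
            .end()
            .build();
}
```

Note that the `on(...)` strings must match the statuses the decider returns exactly.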
6. Nested jobs (parent‑child relationship):
@Bean
public Job parentJob() {
return jobBuilderFactory.get("parentJob")
.start(childJobOneStep())
.next(childJobTwoStep())
.build();
}
private Step childJobOneStep() {
return new JobStepBuilder(new StepBuilder("childJobOneStep"))
.job(childJobOne())
.launcher(jobLauncher)
.repository(jobRepository)
.transactionManager(platformTransactionManager)
.build();
}
private Job childJobOne() { /* simple step that prints a message */ }
7. Reading data (FlatFileItemReader example):
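The reader below maps each delimited line to a TestData POJO. The class and the input file are not shown in the article, so here is a hedged sketch of both (field names taken from the tokenizer configuration; the sample rows are invented):

```java
// Sketch of the TestData item class assumed by the reader/writer/processor demos.
// A matching classpath resource "reader/file" might look like this
// (the first line is skipped via setLinesToSkip(1)):
//   id,field1,field2,field3
//   1,aa,bb,cc
//   2,dd,ee,ff
public class TestData {
    private int id;
    private String field1;
    private String field2;
    private String field3;

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }
    public String getField1() { return field1; }
    public void setField1(String field1) { this.field1 = field1; }
    public String getField2() { return field2; }
    public void setField2(String field2) { this.field2 = field2; }
    public String getField3() { return field3; }
    public void setField3(String field3) { this.field3 = field3; }

    @Override
    public String toString() {
        return "TestData{id=" + id + ", field1='" + field1
                + "', field2='" + field2 + "', field3='" + field3 + "'}";
    }
}
```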
@Component
public class FileItemReaderDemo {
@Autowired private JobBuilderFactory jobBuilderFactory;
@Autowired private StepBuilderFactory stepBuilderFactory;
@Bean
public Job fileItemReaderJob() {
return jobBuilderFactory.get("fileItemReaderJob2")
.start(step())
.build();
}
private Step step() {
return stepBuilderFactory.get("step")
.<TestData, TestData>chunk(2)
.reader(fileItemReader())
.writer(list -> list.forEach(System.out::println))
.build();
}
private ItemReader<TestData> fileItemReader() {
FlatFileItemReader<TestData> reader = new FlatFileItemReader<>();
reader.setResource(new ClassPathResource("reader/file"));
reader.setLinesToSkip(1);
DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
tokenizer.setNames("id", "field1", "field2", "field3");
DefaultLineMapper<TestData> mapper = new DefaultLineMapper<>();
mapper.setLineTokenizer(tokenizer);
mapper.setFieldSetMapper(fieldSet -> {
TestData data = new TestData();
data.setId(fieldSet.readInt("id"));
data.setField1(fieldSet.readString("field1"));
data.setField2(fieldSet.readString("field2"));
data.setField3(fieldSet.readString("field3"));
return data;
});
reader.setLineMapper(mapper);
return reader;
}
}
8. Writing data to a JSON file (FlatFileItemWriter example):
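The writer demo below injects a reader bean named writerSimpleReader via @Resource; its definition is not shown in the article, so here is a hedged sketch using ListItemReader (the contents are invented for illustration):

```java
// Assumed definition of the "writerSimpleReader" bean injected below.
@Bean("writerSimpleReader")
public ListItemReader<TestData> writerSimpleReader() {
    List<TestData> items = new ArrayList<>();
    for (int i = 1; i <= 4; i++) {
        TestData data = new TestData();
        data.setId(i);
        data.setField1("field1-" + i);
        items.add(data);
    }
    // ListItemReader hands out the items one at a time, then returns null (end of data)
    return new ListItemReader<>(items);
}
```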
@Component
public class FileItemWriterDemo {
@Autowired private JobBuilderFactory jobBuilderFactory;
@Autowired private StepBuilderFactory stepBuilderFactory;
@Resource(name = "writerSimpleReader") private ListItemReader<TestData> writerSimpleReader;
@Bean
public Job fileItemWriterJob() throws Exception {
return jobBuilderFactory.get("fileItemWriterJob")
.start(step())
.build();
}
private Step step() throws Exception {
return stepBuilderFactory.get("step")
.<TestData, TestData>chunk(2)
.reader(writerSimpleReader)
.writer(fileItemWriter())
.build();
}
private FlatFileItemWriter<TestData> fileItemWriter() throws Exception {
FlatFileItemWriter<TestData> writer = new FlatFileItemWriter<>();
FileSystemResource file = new FileSystemResource("D:/code/spring-batch-demo/src/main/resources/writer/writer-file");
Path path = Paths.get(file.getPath());
if (!Files.exists(path)) { Files.createFile(path); }
writer.setResource(file);
writer.setLineAggregator(item -> {
try { return new ObjectMapper().writeValueAsString(item); }
catch (JsonProcessingException e) { e.printStackTrace(); }
return "";
});
writer.afterPropertiesSet();
return writer;
}
}
9. Item processing with validation (BeanValidatingItemProcessor):
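BeanValidatingItemProcessor validates each item against the JSR-303 annotations declared on the item class (provided here by spring-boot-starter-validation). The original article does not show the annotations, so the following is a hedged sketch of how TestData's fields might be constrained:

```java
// Sketch: a JSR-303 constraint BeanValidatingItemProcessor would enforce.
// Invalid items throw ValidationException and fail the step,
// or are silently dropped if setFilter(true) is enabled on the processor.
public class TestData {
    private int id;
    @NotBlank(message = "field1 must not be blank")
    private String field1;
    private String field2;
    private String field3;
    // getters and setters omitted for brevity
}
```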
@Component
public class ValidatingItemProcessorDemo {
@Autowired private JobBuilderFactory jobBuilderFactory;
@Autowired private StepBuilderFactory stepBuilderFactory;
@Resource(name = "processorSimpleReader") private ListItemReader<TestData> processorSimpleReader;
@Bean
public Job validatingItemProcessorJob() throws Exception {
return jobBuilderFactory.get("validatingItemProcessorJob3")
.start(step())
.build();
}
private Step step() throws Exception {
return stepBuilderFactory.get("step")
.<TestData, TestData>chunk(2)
.reader(processorSimpleReader)
.processor(beanValidatingItemProcessor())
.writer(list -> list.forEach(System.out::println))
.build();
}
private BeanValidatingItemProcessor<TestData> beanValidatingItemProcessor() throws Exception {
BeanValidatingItemProcessor<TestData> processor = new BeanValidatingItemProcessor<>();
// processor.setFilter(true); // enable filtering of invalid items
processor.afterPropertiesSet();
return processor;
}
}
10. Scheduling the job via a REST controller (can be combined with Quartz or XXL‑Job):
@RestController
@RequestMapping("job")
public class JobController {
@Autowired private Job job;
@Autowired private JobLauncher jobLauncher;
@GetMapping("launcher/{message}")
public String launcher(@PathVariable String message) throws Exception {
JobParameters params = new JobParametersBuilder()
.addString("message", message)
.toJobParameters();
jobLauncher.run(job, params);
return "success";
}
}
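One caveat worth noting (an addition here, not from the article): once a JobInstance completes, launching again with identical JobParameters fails with JobInstanceAlreadyCompleteException. A common workaround is to make every launch unique, for example by adding a timestamp parameter in the controller above:

```java
// Adding a unique parameter makes every launch a fresh JobInstance
JobParameters params = new JobParametersBuilder()
        .addString("message", message)
        .addLong("timestamp", System.currentTimeMillis())
        .toJobParameters();
jobLauncher.run(job, params);
```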