Spring Boot Integration with Spring Batch: A Complete Tutorial and Example
This article is a step‑by‑step guide to integrating Spring Batch with Spring Boot. It covers the business scenarios, database setup, and Maven dependencies; the configuration of the JobRepository, JobLauncher, Job, Step, ItemReader, ItemProcessor, ItemWriter, listeners, and validators; launching jobs via REST endpoints; troubleshooting a Druid connection-pool issue by switching to HikariCP; and processing both CSV and database data in large batches.
The article introduces Spring Batch as a robust batch‑processing framework built on Spring, highlighting its ease of integration, transaction management, logging, retry, and skip capabilities for handling large data volumes.
Two typical business scenarios are presented: reading data from a CSV file and from a database table, processing the data, and persisting the results.
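Both scenarios follow the same read → process → write cycle, which Spring Batch executes in chunks: items are read and processed one at a time, then handed to the writer in groups. As a framework-free illustration of the chunk idea (a hypothetical helper, not Spring Batch code):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Function;

public class ChunkSketch {

    // Simulates chunk-oriented processing: read items one by one, process each,
    // and hand the results to the "writer" in chunks of `chunkSize`.
    static <I, O> List<List<O>> runChunks(Iterator<I> reader,
                                          Function<I, O> processor,
                                          int chunkSize) {
        List<List<O>> writes = new ArrayList<>();
        List<O> chunk = new ArrayList<>();
        while (reader.hasNext()) {
            O item = processor.apply(reader.next());
            if (item != null) {                       // a null result means "filter the item out",
                chunk.add(item);                      // as in Spring Batch's ItemProcessor contract
            }
            if (chunk.size() == chunkSize) {
                writes.add(new ArrayList<>(chunk));   // one writer call per full chunk
                chunk.clear();
            }
        }
        if (!chunk.isEmpty()) {
            writes.add(chunk);                        // flush the final partial chunk
        }
        return writes;
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 3, 4, 5);
        List<List<String>> writes = runChunks(items.iterator(), i -> "row-" + i, 2);
        System.out.println(writes); // prints [[row-1, row-2], [row-3, row-4], [row-5]]
    }
}
```

In Spring Batch, each full chunk is additionally written inside one transaction, which is what makes the chunk size the commit interval.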
It shows how to create a simple MySQL table bloginfo and provides the SQL DDL, followed by the essential Maven dependencies required for Spring Boot, Spring Batch, MyBatis, Druid, and validation libraries:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
... (other dependencies omitted for brevity) ...
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>druid-spring-boot-starter</artifactId>
    <version>1.1.18</version>
</dependency>

The application.yml configuration for Spring Batch and the Druid datasource is also provided.
A POJO BlogInfo and a MyBatis mapper BlogMapper are defined, with example CRUD methods using annotations.
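For illustration, such an annotation-based mapper might look like the following sketch (the table and column names are assumptions, since this summary does not reproduce the article's schema):

```java
import java.util.List;

import org.apache.ibatis.annotations.Insert;
import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Select;

// Hypothetical MyBatis mapper for the BlogInfo POJO; SQL uses #{...} placeholders
// that MyBatis binds from the BlogInfo getters.
@Mapper
public interface BlogMapper {

    @Insert("INSERT INTO bloginfo (blog_author, blog_url, blog_title, blog_item) "
          + "VALUES (#{blogAuthor}, #{blogUrl}, #{blogTitle}, #{blogItem})")
    int insert(BlogInfo blogInfo);

    @Select("SELECT * FROM bloginfo")
    List<BlogInfo> selectAll();
}
```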
The core configuration class MyBatchConfig defines all batch components:
JobRepository bean with MySQL database type.
JobLauncher bean linked to the JobRepository.
Job bean that assembles a Step and attaches a listener.
Step bean that configures a chunk size, ItemReader, ItemProcessor, ItemWriter, retry and skip policies, and registers read/write listeners.
ItemReader using FlatFileItemReader to read CSV rows into BlogInfo objects.
ItemProcessor extending ValidatingItemProcessor to apply simple business logic and JSR‑303 validation.
Custom validator MyBeanValidator implementing Spring Batch's Validator interface.
ItemWriter using JdbcBatchItemWriter with a parameterized INSERT statement.
Job, read, and write listeners that log start, end, and error events.
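Putting those pieces together, a minimal sketch of such a configuration class using the Spring Batch 4 builder API might look as follows. The bean names, chunk size, CSV file name, and column names are assumptions, not the article's exact code, and the listeners, processor, and validator are omitted for brevity; it also assumes the BlogInfo POJO described above.

```java
import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
@EnableBatchProcessing // auto-configures a JobRepository and JobLauncher; the article builds them explicitly for MySQL
public class MyBatchConfig {

    @Bean
    public Job myJob(JobBuilderFactory jobs, Step myStep) {
        return jobs.get("myJob")
                .incrementer(new RunIdIncrementer()) // fresh run id each launch, so the job can be re-run
                .start(myStep)
                .build();
    }

    @Bean
    public Step myStep(StepBuilderFactory steps,
                       FlatFileItemReader<BlogInfo> reader,
                       JdbcBatchItemWriter<BlogInfo> writer) {
        return steps.get("myStep")
                .<BlogInfo, BlogInfo>chunk(10)        // read/process 10 items, then write them in one transaction
                .reader(reader)
                .writer(writer)
                .faultTolerant()
                .retryLimit(3).retry(Exception.class) // retry a failing item up to 3 times...
                .skipLimit(5).skip(Exception.class)   // ...then skip it, up to 5 skips per step
                .build();
    }

    @Bean
    public FlatFileItemReader<BlogInfo> reader() {
        return new FlatFileItemReaderBuilder<BlogInfo>()
                .name("blogInfoReader")
                .resource(new ClassPathResource("bloginfo.csv"))         // assumed file name
                .delimited()
                .names("blogAuthor", "blogUrl", "blogTitle", "blogItem") // assumed CSV columns
                .targetType(BlogInfo.class)
                .build();
    }

    @Bean
    public JdbcBatchItemWriter<BlogInfo> writer(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<BlogInfo>()
                .dataSource(dataSource)
                .sql("INSERT INTO bloginfo (blog_author, blog_url, blog_title, blog_item) "
                   + "VALUES (:blogAuthor, :blogUrl, :blogTitle, :blogItem)")
                .beanMapped() // bind the named parameters from BlogInfo getters
                .build();
    }
}
```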
REST controllers TestController and a second controller for the new job expose endpoints /testJob and /testJobNew to launch the batch jobs with optional parameters.
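A hedged sketch of what such an endpoint typically looks like (the parameter name and return value are assumptions): the controller injects the JobLauncher and Job, and passes a unique job parameter so that repeated requests create new JobInstances.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class TestController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job myJob;

    @GetMapping("/testJob")
    public String testJob() throws Exception {
        // JobParameters must differ between runs, or Spring Batch will refuse
        // to relaunch an already-completed JobInstance.
        JobParameters params = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(myJob, params);
        return "job started";
    }
}
```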
Additional examples demonstrate a second job that reads from the database using MyBatisCursorItemReader, processes records with a new processor MyItemProcessorNew, and writes to a new table bloginfonew. The article also discusses a runtime issue with the Druid connection pool and recommends switching to the default HikariCP for better compatibility.
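That switch amounts to removing the Druid override and letting Spring Boot auto-configure its default Hikari pool; a sketch of the resulting datasource section (values are placeholders):

```yaml
# Remove the druid-spring-boot-starter dependency (or at least the
# spring.datasource.type override) and Boot falls back to HikariCP:
spring:
  datasource:
    url: jdbc:mysql://localhost:3306/batch_demo?useSSL=false&serverTimezone=UTC
    username: root
    password: root
    driver-class-name: com.mysql.cj.jdbc.Driver
    hikari:
      maximum-pool-size: 10
```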
Finally, the article concludes with screenshots of successful job executions, database tables, and a reminder to keep discussions professional and focused on technical exchange.