# spring-batch-rs

**Stop writing batch boilerplate. Start processing data.**
Processing a large CSV into a database? You end up writing readers, chunk logic, error loops, retry handling — just to move data. Spring Batch RS handles the plumbing: you define what to read, what to transform, where to write. Skip policies, execution metrics, and fault tolerance come built-in.
## Quick Start

### 1. Add to `Cargo.toml`
```toml
[dependencies]
spring-batch-rs = { version = "0.3", features = ["csv", "json"] }
serde = { version = "1.0", features = ["derive"] }
```
### 2. Your first batch job (CSV → JSON)
> **Note:** The `rdbc-*` and `orm` features require `tokio = { version = "1", features = ["full"] }`. See the Getting Started guide for the async setup.
The snippet below sketches the CSV → JSON job. Builder names follow the crate's documented API, but exact module paths and signatures may vary by version; verify against docs.rs/spring-batch-rs.

```rust
use serde::{Deserialize, Serialize};
// Module paths below follow the crate's documented layout; check
// docs.rs/spring-batch-rs for the version you are using.
use spring_batch_rs::{
    core::{job::JobBuilder, step::StepBuilder},
    item::csv::csv_reader::CsvItemReaderBuilder,
    item::json::json_writer::JsonItemWriterBuilder,
};
use temp_dir::TempDir;

#[derive(Serialize, Deserialize)]
struct Car {
    year: u16,
    make: String,
    model: String,
}

fn main() -> Result<(), spring_batch_rs::BatchError> {
    let csv = "year,make,model\n1967,Ford,Mustang\n1995,Mazda,MX-5";

    // Read CSV records into `Car` items.
    let reader = CsvItemReaderBuilder::new()
        .has_headers(true)
        .from_reader(csv.as_bytes());

    // Write each completed chunk out as JSON.
    let tmp = TempDir::new().unwrap();
    let writer = JsonItemWriterBuilder::new().from_path(tmp.child("cars.json"));

    // Chunk size 2: items are buffered and written two at a time.
    let step = StepBuilder::new()
        .chunk::<Car, Car>(2)
        .reader(&reader)
        .writer(&writer)
        .build();

    JobBuilder::new().start(&step).build().run()?;
    Ok(())
}
```
## How It Works
A Job contains one or more Steps. Each Step reads items one by one from a source, buffers them into a configurable chunk, then writes the whole chunk at once — balancing throughput with memory usage.
```text
Read item → Read item → ... → [chunk full] → Write chunk → repeat
```
## Why spring-batch-rs
- **Chunk-oriented processing** — reads one item at a time, writes in batches. Memory usage stays constant regardless of dataset size.
- **Fault tolerance built-in** — set a `skip_limit` to keep processing when bad rows appear. No manual try/catch loops.
- **Type-safe pipelines** — reader, processor, and writer types are verified at compile time. Mismatched types don't compile.
- **Modular by design** — enable only what you need via feature flags. No unused dependencies.
## Features

### Formats
| Feature | Description |
|---|---|
| `csv` | CSV `ItemReader` and `ItemWriter` |
| `json` | JSON `ItemReader` and `ItemWriter` |
| `xml` | XML `ItemReader` and `ItemWriter` |
### Databases (require `tokio` — see Getting Started)
| Feature | Description |
|---|---|
| `rdbc-postgres` | PostgreSQL `ItemReader` and `ItemWriter` |
| `rdbc-mysql` | MySQL / MariaDB `ItemReader` and `ItemWriter` |
| `rdbc-sqlite` | SQLite `ItemReader` and `ItemWriter` |
| `mongodb` | MongoDB `ItemReader` and `ItemWriter` (sync) |
| `orm` | SeaORM `ItemReader` and `ItemWriter` |
### Utilities
| Feature | Description |
|---|---|
| `zip` | ZIP compression `Tasklet` |
| `ftp` | FTP / FTPS `Tasklet` |
| `fake` | Fake data `ItemReader` for generating test datasets |
| `logger` | Logger `ItemWriter` for debugging pipelines |
| `full` | All of the above |
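Feature flags combine freely. A `Cargo.toml` dependency enabling a few of the features above might look like this (the exact set shown is just an example):

```toml
[dependencies]
# Enable only the formats and backends you actually use;
# the "full" feature turns everything on.
spring-batch-rs = { version = "0.3", features = ["csv", "rdbc-sqlite", "logger"] }
```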
## Examples
| Use case | Run |
|---|---|
| CSV → JSON | `cargo run --example csv_processing --features csv,json` |
| JSON processing | `cargo run --example json_processing --features json,csv,logger` |
| XML processing | `cargo run --example xml_processing --features xml,json,csv` |
| CSV → SQLite | `cargo run --example database_processing --features rdbc-sqlite,csv,json,logger` |
| MongoDB | `cargo run --example mongodb_processing --features mongodb,csv,json` |
| SeaORM | `cargo run --example orm_processing --features orm,csv,json` |
| Advanced ETL pipeline | `cargo run --example advanced_patterns --features csv,json,logger` |
| ZIP tasklet | `cargo run --example tasklet_zip --features zip` |
| FTP tasklet | `cargo run --example tasklet_ftp --features ftp` |
Database examples require Docker. Browse the full examples gallery for tutorials and advanced patterns.
## Documentation
| Resource | Link |
|---|---|
| Getting Started | sboussekeyt.github.io/…/getting-started |
| Item Readers & Writers | sboussekeyt.github.io/…/item-readers-writers |
| API Reference | docs.rs/spring-batch-rs |
| Architecture | sboussekeyt.github.io/…/architecture |
## Community
- Discord — Chat with the community
- GitHub Issues — Bug reports and feature requests
- GitHub Discussions — Questions and ideas
## License
Licensed under MIT or Apache-2.0 at your option.