//#![warn(missing_docs)]
/*!
<div align="center">
<h1>spring-batch-rs</h1>
<h3>Stop writing batch boilerplate. Start processing data.</h3>
[Crates.io](https://crates.io/crates/spring-batch-rs)
[docs.rs](https://docs.rs/spring-batch-rs)
[CI](https://github.com/sboussekeyt/spring-batch-rs/actions/workflows/test.yml)
[Discord](https://discord.gg/9FNhawNsG6)
[Codecov](https://codecov.io/gh/sboussekeyt/spring-batch-rs)

</div>
Processing a large CSV into a database? You end up writing readers, chunk logic, error
loops, retry handling — just to move data. **Spring Batch RS** handles the plumbing: you
define what to read, what to transform, where to write. Skip policies, execution metrics,
and fault tolerance come built-in.
## Quick Start
### 1. Add to `Cargo.toml`
```toml
[dependencies]
spring-batch-rs = { version = "0.3", features = ["csv", "json"] }
serde = { version = "1.0", features = ["derive"] }
```
### 2. Your first batch job (CSV → JSON)
> **Note:** `rdbc-*` and `orm` features require `tokio = { version = "1", features = ["full"] }`.
> See the [Getting Started guide](https://spring-batch-rs.boussekeyt.dev/getting-started/) for the async setup.
```rust,no_run
use spring_batch_rs::{
    core::{job::{Job, JobBuilder}, step::StepBuilder, item::PassThroughProcessor},
    item::{
        csv::csv_reader::CsvItemReaderBuilder,
        json::json_writer::JsonItemWriterBuilder,
    },
    BatchError,
};
use serde::{Deserialize, Serialize};
use std::env::temp_dir;

#[derive(Deserialize, Serialize, Clone)]
struct Order {
    id: u32,
    amount: f64,
    status: String,
}

fn main() -> Result<(), BatchError> {
    let csv = "id,amount,status\n1,99.5,pending\n2,14.0,complete\n3,bad,pending";

    // Read from CSV
    let reader = CsvItemReaderBuilder::<Order>::new()
        .has_headers(true)
        .from_reader(csv.as_bytes());

    // Write to JSON
    let output = temp_dir().join("orders.json");
    let writer = JsonItemWriterBuilder::<Order>::new()
        .from_path(&output);

    // Wire together: read 100 items at a time, tolerate up to 5 bad rows
    let processor = PassThroughProcessor::<Order>::new();
    let step = StepBuilder::new("csv-to-json")
        .chunk::<Order, Order>(100)
        .reader(&reader)
        .processor(&processor)
        .writer(&writer)
        .skip_limit(5)
        .build();

    JobBuilder::new().start(&step).build().run().map(|_| ())?;
    println!("Output: {}", output.display());
    Ok(())
}
```
## How It Works
A **Job** contains one or more **Steps**. Each Step reads items one by one from a source,
buffers them into a configurable chunk, then writes the whole chunk at once — balancing
throughput with memory usage.
```text
Read item → Read item → ... → [chunk full] → Write chunk → repeat
```
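The loop above can be sketched in plain Rust. This is a simplified illustration of the chunking idea, not the crate's actual internals: `run_chunked` is a hypothetical helper, and the real step also runs each item through the processor and applies the skip policy.

```rust
/// Simplified sketch of chunk-oriented processing: pull items one at a
/// time from `read`, buffer them, and flush the buffer to `write` each
/// time it reaches `chunk_size`.
fn run_chunked<I>(
    mut read: impl FnMut() -> Option<I>,
    chunk_size: usize,
    mut write: impl FnMut(&[I]),
) {
    let mut chunk = Vec::with_capacity(chunk_size);
    while let Some(item) = read() {
        chunk.push(item);
        if chunk.len() == chunk_size {
            write(&chunk); // chunk full: write it as one batch
            chunk.clear();
        }
    }
    if !chunk.is_empty() {
        write(&chunk); // flush the final partial chunk
    }
}
```

Because only one chunk is buffered at a time, memory usage is bounded by the chunk size rather than by the size of the dataset.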
## Why spring-batch-rs
- **Chunk-oriented processing** — reads one item at a time, writes in batches. Memory usage stays constant regardless of dataset size.
- **Fault tolerance built-in** — set a `skip_limit` to keep processing when bad rows appear. No manual try/catch loops.
- **Type-safe pipelines** — reader, processor, and writer types are verified at compile time. Mismatched types don't compile.
- **Modular by design** — enable only what you need via feature flags. No unused dependencies.
## Features
**Formats**
| Feature | Description |
| ------- | ----------- |
| `csv` | CSV `ItemReader` and `ItemWriter` |
| `json` | JSON `ItemReader` and `ItemWriter` |
| `xml` | XML `ItemReader` and `ItemWriter` |
**Databases** *(require `tokio` — see [Getting Started](https://spring-batch-rs.boussekeyt.dev/getting-started/))*
| Feature | Description |
| --------------- | ----------- |
| `rdbc-postgres` | PostgreSQL `ItemReader` and `ItemWriter` |
| `rdbc-mysql` | MySQL / MariaDB `ItemReader` and `ItemWriter` |
| `rdbc-sqlite` | SQLite `ItemReader` and `ItemWriter` |
| `mongodb` | MongoDB `ItemReader` and `ItemWriter` (sync) |
| `orm` | SeaORM `ItemReader` and `ItemWriter` |
**Utilities**
| Feature | Description |
| -------- | ----------- |
| `zip` | ZIP compression `Tasklet` |
| `ftp` | FTP / FTPS `Tasklet` |
| `fake` | Fake data `ItemReader` for generating test datasets |
| `logger` | Logger `ItemWriter` for debugging pipelines |
| `full` | All of the above |
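For example, a CSV-to-PostgreSQL pipeline would enable only the two relevant flags (plus `tokio`, which the `rdbc-*` features require, as noted above):

```toml
[dependencies]
spring-batch-rs = { version = "0.3", features = ["csv", "rdbc-postgres"] }
tokio = { version = "1", features = ["full"] } # required by rdbc-* and orm
serde = { version = "1.0", features = ["derive"] }
```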
## Examples
| Use case | Run |
| -------- | --- |
| CSV → JSON | `cargo run --example csv_processing --features csv,json` |
| JSON processing | `cargo run --example json_processing --features json,csv,logger` |
| XML processing | `cargo run --example xml_processing --features xml,json,csv` |
| CSV → SQLite | `cargo run --example database_processing --features rdbc-sqlite,csv,json,logger` |
| MongoDB | `cargo run --example mongodb_processing --features mongodb,csv,json` |
| SeaORM | `cargo run --example orm_processing --features orm,csv,json` |
| Advanced ETL pipeline | `cargo run --example advanced_patterns --features csv,json,logger` |
| ZIP tasklet | `cargo run --example tasklet_zip --features zip` |
| FTP tasklet | `cargo run --example tasklet_ftp --features ftp` |
> Database examples require Docker. Browse the **[full examples gallery](https://spring-batch-rs.boussekeyt.dev/quick-examples/)** for tutorials and advanced patterns.
## Documentation
| Resource | Link |
| -------- | ---- |
| Getting Started | [spring-batch-rs.boussekeyt.dev/getting-started](https://spring-batch-rs.boussekeyt.dev/getting-started/) |
| Item Readers & Writers | [spring-batch-rs.boussekeyt.dev/item-readers-writers](https://spring-batch-rs.boussekeyt.dev/item-readers-writers/overview/) |
| API Reference | [docs.rs/spring-batch-rs](https://docs.rs/spring-batch-rs) |
| Architecture | [spring-batch-rs.boussekeyt.dev/architecture](https://spring-batch-rs.boussekeyt.dev/architecture/) |
## Community
- [Discord](https://discord.gg/9FNhawNsG6) — Chat with the community
- [GitHub Issues](https://github.com/sboussekeyt/spring-batch-rs/issues) — Bug reports and feature requests
- [GitHub Discussions](https://github.com/sboussekeyt/spring-batch-rs/discussions) — Questions and ideas
## License
Licensed under [MIT](https://github.com/sboussekeyt/spring-batch-rs/blob/main/LICENSE-MIT) or [Apache-2.0](https://github.com/sboussekeyt/spring-batch-rs/blob/main/LICENSE-APACHE) at your option.
*/
/// Core module for batch operations
pub mod core;

/// Error types for batch operations
mod error;
pub use error::*;

/// Set of item readers / writers (for example: CSV reader and writer)
pub mod item;

/// Set of tasklets for common batch operations
pub mod tasklet;