pub async fn insert_jsonb_batch(
client: &Client,
table_name: &str,
rows: Vec<(String, Value)>,
source_type: &str,
) -> Result<()>
Insert multiple JSONB rows with adaptive batching.
Inserts multiple rows efficiently using multi-value INSERT statements. Automatically adjusts batch size based on row payload sizes and retries with smaller batches on connection failures.
§Arguments
- `client` - PostgreSQL client connection
- `table_name` - Name of the table (must be validated)
- `rows` - Vector of (id, data) tuples
- `source_type` - Source database type ("sqlite", "mongodb", or "mysql")
§Security
Uses parameterized queries for all data values. `table_name` cannot be parameterized and must be validated by the caller before it is interpolated into the statement.
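The split described above, validated identifier plus `$n` placeholders for data, can be sketched as follows. `build_insert_sql` is a hypothetical helper, not part of this crate's public API; the `::jsonb` cast is illustrative.

```rust
/// Hypothetical sketch: build a multi-value INSERT where only the data
/// goes through numbered placeholders. `table_name` is assumed to have
/// been validated already (e.g. by `validate_table_name`).
fn build_insert_sql(table_name: &str, row_count: usize) -> String {
    // Two parameters per row: ($1, $2::jsonb), ($3, $4::jsonb), ...
    let placeholders: Vec<String> = (0..row_count)
        .map(|i| format!("(${}, ${}::jsonb)", i * 2 + 1, i * 2 + 2))
        .collect();
    format!(
        "INSERT INTO {} (id, data) VALUES {}",
        table_name,
        placeholders.join(", ")
    )
}
```

The id/data values themselves are then passed to the driver as query parameters, so they never appear in the SQL text.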
§Performance
- Dynamically calculates batch size based on estimated payload size
- Targets ~10MB per batch for optimal throughput
- Automatically retries with smaller batches on failure
- Shows progress for large datasets
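The sizing heuristic above can be sketched as a pure function over estimated per-row payload sizes. The name `adaptive_batch_size`, the 1000-row cap, and the exact clamping are assumptions for illustration, not the crate's actual internals; only the ~10 MB target comes from the documentation.

```rust
/// Hypothetical sketch of the adaptive batch sizing described above:
/// choose a batch size so that batch_size * average_row_bytes stays
/// near the ~10 MB target, clamped to a sane row-count range.
const TARGET_BATCH_BYTES: usize = 10 * 1024 * 1024; // ~10 MB per batch
const MAX_BATCH_ROWS: usize = 1000; // illustrative upper bound

fn adaptive_batch_size(payload_sizes: &[usize]) -> usize {
    if payload_sizes.is_empty() {
        return 1;
    }
    let total: usize = payload_sizes.iter().sum();
    let avg = (total / payload_sizes.len()).max(1);
    // At least one row per batch, even when a single row exceeds the target.
    (TARGET_BATCH_BYTES / avg).clamp(1, MAX_BATCH_ROWS)
}
```

On a connection failure the caller would retry with a smaller batch, e.g. by halving the value returned here until the insert succeeds or the batch size reaches 1.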
§Examples
use serde_json::json;

let table_name = "users";
validate_table_name(table_name)?;

let rows = vec![
    ("1".to_string(), json!({"name": "Alice", "age": 30})),
    ("2".to_string(), json!({"name": "Bob", "age": 25})),
];

insert_jsonb_batch(client, table_name, rows, "sqlite").await?;