pub async fn convert_table_batched(
sqlite_conn: &Connection,
pg_client: &Client,
table: &str,
source_type: &str,
batch_size: Option<usize>,
) -> Result<usize>
Convert a SQLite table and insert it into PostgreSQL using batched processing.
This function uses memory-efficient batched processing to handle large tables:
- Reads rows in batches (default 10,000 rows)
- Converts each batch to JSONB format
- Inserts each batch to PostgreSQL before reading the next
Memory usage stays constant regardless of table size, as the sketch below illustrates.
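A minimal sketch of this read/convert/insert cycle, assuming rusqlite for the SQLite side and tokio-postgres (with its with-serde_json-1 feature for JSONB parameters) for the PostgreSQL side. The helper copy_in_batches, the LIMIT/OFFSET paging, and the destination target_table with a single data JSONB column are illustrative assumptions, not this crate's internals.

use rusqlite::types::Value as SqlValue;
use serde_json::{json, Map, Value as Json};

// Hypothetical helper illustrating the batching strategy; not this
// crate's actual implementation.
async fn copy_in_batches(
    sqlite_conn: &rusqlite::Connection,
    pg_client: &tokio_postgres::Client,
    table: &str,
    batch_size: usize,
) -> Result<usize, Box<dyn std::error::Error>> {
    let mut total = 0usize;
    loop {
        // Read at most `batch_size` rows; only this batch is held in memory.
        // (LIMIT/OFFSET paging is a simplification; it assumes the source
        // table does not change during the copy.)
        let sql = format!("SELECT * FROM {table} LIMIT {batch_size} OFFSET {total}");
        let mut stmt = sqlite_conn.prepare(&sql)?;
        let columns: Vec<String> =
            stmt.column_names().into_iter().map(String::from).collect();
        let batch: Vec<Json> = stmt
            .query_map([], |row| {
                // Convert one SQLite row into a JSON object keyed by column name.
                let mut obj = Map::new();
                for (i, name) in columns.iter().enumerate() {
                    let v = match row.get::<_, SqlValue>(i)? {
                        SqlValue::Null => Json::Null,
                        SqlValue::Integer(n) => json!(n),
                        SqlValue::Real(f) => json!(f),
                        SqlValue::Text(s) => json!(s),
                        SqlValue::Blob(b) => json!(b), // blobs become arrays of bytes
                    };
                    obj.insert(name.clone(), v);
                }
                Ok(Json::Object(obj))
            })?
            .collect::<Result<_, _>>()?;
        if batch.is_empty() {
            break; // no rows left: the table is fully processed
        }
        // Insert this batch before reading the next; a real implementation
        // would likely batch the INSERTs or use COPY for throughput.
        for row_json in &batch {
            pg_client
                .execute("INSERT INTO target_table (data) VALUES ($1)", &[row_json])
                .await?;
        }
        total += batch.len();
    }
    Ok(total)
}

Because each batch is dropped before the next is read, peak memory is bounded by the batch size rather than the table size.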
§Arguments
sqlite_conn - SQLite database connection
pg_client - PostgreSQL client connection
table - Table name to convert
source_type - Source type label for metadata (e.g., "sqlite")
batch_size - Optional batch size (default: 10,000 rows)
§Returns
Total number of rows processed.
§Examples
// Assumes `sqlite_conn` and `pg_client` are already-established SQLite
// and PostgreSQL connections.
let rows_processed = convert_table_batched(
    sqlite_conn,
    pg_client,
    "large_table",
    "sqlite",
    None, // use the default batch size of 10,000 rows
).await?;
println!("Processed {} rows", rows_processed);