Function convert_table_batched

pub async fn convert_table_batched(
    sqlite_conn: &Connection,
    pg_client: &Client,
    table: &str,
    source_type: &str,
    batch_size: Option<usize>,
) -> Result<usize>

Converts a SQLite table to JSONB and inserts it into PostgreSQL using batched processing.

This function uses memory-efficient batched processing to handle large tables:

  1. Reads rows in batches (default 10,000 rows)
  2. Converts each batch to JSONB format
  3. Inserts each batch to PostgreSQL before reading the next

Peak memory usage is therefore bounded by the batch size rather than by the total table size.
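For illustration, the following is a minimal sketch of that read/convert/insert loop, not this crate's actual implementation. It assumes rusqlite and tokio-postgres for the two connections, a hypothetical destination table converted (data jsonb), and text-compatible source columns; a real implementation would handle arbitrary SQLite types and batch the inserts.

use rusqlite::Connection;
use serde_json::{Map, Value};
use tokio_postgres::Client;

async fn batched_copy(
    sqlite_conn: &Connection,
    pg_client: &Client,
    table: &str,
    batch_size: usize,
) -> Result<usize, Box<dyn std::error::Error>> {
    let mut total = 0;
    let mut offset: i64 = 0;
    loop {
        // 1. Read one batch, so at most `batch_size` rows are in memory.
        //    (Table-name interpolation and LIMIT/OFFSET paging are
        //    simplified; a real implementation would validate the name
        //    and paginate by rowid for stable ordering.)
        let mut stmt = sqlite_conn
            .prepare(&format!("SELECT * FROM {table} LIMIT ?1 OFFSET ?2"))?;
        let columns: Vec<String> =
            stmt.column_names().iter().map(|c| c.to_string()).collect();
        let batch: Vec<Value> = stmt
            .query_map(rusqlite::params![batch_size as i64, offset], |row| {
                // 2. Convert the row to a JSON object destined for JSONB.
                let mut obj = Map::new();
                for (i, name) in columns.iter().enumerate() {
                    let cell: Option<String> = row.get(i)?;
                    obj.insert(name.clone(), cell.map_or(Value::Null, Value::String));
                }
                Ok(Value::Object(obj))
            })?
            .collect::<Result<_, _>>()?;
        if batch.is_empty() {
            break; // no rows left
        }
        // 3. Insert this batch before reading the next one. (With
        //    tokio-postgres's `with-serde_json-1` feature, the Value
        //    could be passed directly instead of via text.)
        for doc in &batch {
            pg_client
                .execute(
                    "INSERT INTO converted (data) VALUES ($1::text::jsonb)",
                    &[&doc.to_string()],
                )
                .await?;
        }
        total += batch.len();
        offset += batch.len() as i64;
    }
    Ok(total)
}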

§Arguments

  • sqlite_conn - SQLite database connection
  • pg_client - PostgreSQL client connection
  • table - Table name to convert
  • source_type - Source type label for metadata (e.g., "sqlite")
  • batch_size - Optional batch size (default: 10,000 rows)

§Returns

Total number of rows processed.

§Examples

// Assumes `sqlite_conn` and `pg_client` are already-open connections
// (e.g., rusqlite and tokio-postgres handles).
let rows_processed = convert_table_batched(
    sqlite_conn,
    pg_client,
    "large_table",
    "sqlite",
    None, // use the default batch size of 10,000 rows
).await?;
println!("Processed {} rows", rows_processed);
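Passing an explicit batch size trades memory for fewer read/insert cycles; the value below is arbitrary:

// Larger batches hold more rows in memory at once but make
// fewer round trips per table.
let rows_processed = convert_table_batched(
    sqlite_conn,
    pg_client,
    "large_table",
    "sqlite",
    Some(50_000),
).await?;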