pub async fn estimate_database_sizes(
source_url: &str,
source_client: &Client,
databases: &[DatabaseInfo],
filter: &ReplicationFilter,
) -> Result<Vec<DatabaseSizeInfo>>
Estimate database sizes and replication times with filtering support
Queries PostgreSQL for database sizes and calculates estimated replication times based on typical dump/restore speeds, using a conservative estimate of 20 GB/hour for the total replication time (dump + restore).
When table filters are specified, connects to each database to compute the exact size of filtered tables rather than using the entire database size.
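The 20 GB/hour figure above translates directly into a duration from a byte count. A minimal sketch of that arithmetic (the helper name and the decimal-GB convention are assumptions for illustration; the crate's internal implementation may differ):

```rust
use std::time::Duration;

/// Conservative combined dump + restore throughput, per the docs above.
const GB_PER_HOUR: f64 = 20.0;

/// Hypothetical helper: convert a raw database size into an estimated
/// replication duration at 20 GB/hour.
fn estimate_duration(size_bytes: u64) -> Duration {
    let gigabytes = size_bytes as f64 / 1_000_000_000.0;
    Duration::from_secs_f64(gigabytes / GB_PER_HOUR * 3600.0)
}
```

At this rate a 50 GB database works out to 2.5 hours (9000 seconds).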
§Arguments
- `source_url` - Connection URL for the source database cluster
- `source_client` - Connected PostgreSQL client to the source database
- `databases` - List of databases to estimate
- `filter` - Replication filter for table inclusion/exclusion
§Returns
Returns a vector of `DatabaseSizeInfo` with size and time estimates for each database.
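A plausible shape for `DatabaseSizeInfo`, inferred from the fields exercised in the example (`name`, `size_human`, `estimated_duration`); this is a sketch, and the real definition may hold additional or differently typed fields:

```rust
use std::time::Duration;

/// Sketch of the per-database estimate record (fields inferred, not
/// copied from the crate source).
pub struct DatabaseSizeInfo {
    pub name: String,                 // database name
    pub size_human: String,           // human-readable size, e.g. "1.2 GB"
    pub estimated_duration: Duration, // estimated dump + restore time
}
```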
§Errors
This function will return an error if:
- Cannot query database sizes
- Database connection fails
- Cannot connect to individual databases for table filtering
§Examples
```rust
let url = "postgresql://user:pass@localhost:5432/postgres";
let client = connect(url).await?;
let databases = list_databases(&client).await?;
let filter = ReplicationFilter::empty();

let estimates = estimate_database_sizes(url, &client, &databases, &filter).await?;
for estimate in estimates {
    println!("{}: {} (~{:?})", estimate.name, estimate.size_human, estimate.estimated_duration);
}
```