Crate s3_algo


§S3 high-performance algorithms

High-performance algorithms for batch operations in Amazon S3.

See the AWS performance guidelines: https://docs.aws.amazon.com/AmazonS3/latest/dev/optimizing-performance-guidelines.html

  • Upload multiple files with S3Algo::upload_files.
  • List files with S3Algo::s3_list_objects or S3Algo::s3_list_prefix, and then delete or copy all of the listed files (see the sketch after this list).
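As a rough sketch of the second workflow, the snippet below lists every object under a prefix and deletes it. S3Algo::s3_list_prefix and ListObjects are items on this page; the constructor S3Algo::new, the delete_all method name, the String arguments, and the aws_sdk_s3::Client type (inferred from testing_sdk_client below) are assumptions, so consult the item docs for the exact signatures.

```rust
use s3_algo::{Error, S3Algo};

// Minimal sketch, not a confirmed API: `S3Algo::new`, `delete_all`, and the
// `String` arguments are assumed; see the `S3Algo` and `ListObjects` docs.
async fn delete_prefix(client: aws_sdk_s3::Client) -> Result<(), Error> {
    let algo = S3Algo::new(client);
    algo.s3_list_prefix("my-bucket".to_string(), "logs/".to_string())
        .delete_all() // consume the ListObjects stream, deleting each listed key
        .await?;
    Ok(())
}
```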

Re-exports§

pub use err::Error;

Modules§

err
timeout
The Timeout trait defines how the timeout value of a multi-file upload evolves based on past file upload results. A default implementation, TimeoutState, is provided.

Structs§

AlgorithmConfig
Config
ListObjects
A stream that lists objects and whose methods can delete or copy the listed files.
RequestReport
Result of a single S3 request.
S3Algo
SpecificTimings
Settings specific to the kind of S3 operation being performed, for example delete or put.

Enums§

ObjectSource

Functions§

files_recursive
Convenience function (using walkdir) that traverses all files in the directory src_dir. Returns an iterator suitable as input to S3Algo::upload_files; each file is uploaded under a key equal to its path with src_dir stripped away and key_prefix prepended (a usage sketch follows this list).
retriable_s3_client
s3_single_request
Issue a single S3 request with retries and appropriate timeouts using sane defaults. An easier, less general version of s3_request (see the sketch below the function list).
testing_sdk_client
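To illustrate the upload workflow from the first bullet at the top of this page, the sketch below feeds files_recursive into S3Algo::upload_files: everything under ./testdata is uploaded with its path relative to that directory, prefixed with data/. The PathBuf argument types and the trailing upload_files parameters (per-file progress callback, default request settings) are assumptions drawn from the descriptions above, not confirmed signatures.

```rust
use s3_algo::{files_recursive, Error, S3Algo};
use std::path::PathBuf;

// Minimal sketch, not a confirmed API: the argument types of `files_recursive`
// and the trailing `upload_files` parameters are assumed; see the item docs.
async fn upload_dir(algo: &S3Algo) -> Result<(), Error> {
    // Keys become each file's path with "./testdata" stripped and "data/" prepended.
    let files = files_recursive(PathBuf::from("./testdata"), PathBuf::from("data/"));
    algo.upload_files(
        "my-bucket".to_string(),
        files,
        |_result| async {},  // per-file progress callback (assumed shape)
        Default::default(),  // default request settings (assumed type)
    )
    .await
}
```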
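A hypothetical use of s3_single_request, wrapping one delete-object call: the closure-plus-expected-size shape and the (RequestReport, output) return value are assumptions inferred from the description above, so treat this purely as an illustration of where the function fits.

```rust
use s3_algo::{s3_single_request, Error};

// Sketch only: the closure-plus-size signature and the (RequestReport, output)
// return value are assumptions, not the confirmed API.
async fn delete_one(client: &aws_sdk_s3::Client) -> Result<(), Error> {
    let (_report, _output) = s3_single_request(
        || async {
            client
                .delete_object()
                .bucket("my-bucket")
                .key("some/key")
                .send()
                .await
        },
        0, // expected transfer size in bytes, used to derive the timeout
    )
    .await?;
    Ok(())
}
```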