S3 high-performance algorithms
High-performance algorithms for batch operations in Amazon S3. See the AWS performance guidelines: https://docs.aws.amazon.com/AmazonS3/latest/dev/optimizing-performance-guidelines.html

- Upload multiple files with `S3Algo::upload_files`.
- List files with `S3Algo::s3_list_objects` or `S3Algo::s3_list_prefix`, and then execute deletion or copy on all the listed files.
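To illustrate the batch-upload pattern, here is a minimal stand-in sketch: in the real crate, `S3Algo::upload_files` consumes a stream of path/key pairs and uploads them concurrently with adaptive timeouts. The `upload_batch` function below is hypothetical (not the crate's API); it only shows the shape of the input and the per-file results.

```rust
use std::path::PathBuf;

// Hypothetical stand-in for a batch upload. A real implementation would
// issue concurrent PUT requests; here we walk the batch sequentially and
// collect a per-file outcome, to show the shape of the input and output.
fn upload_batch(files: &[(PathBuf, String)]) -> Vec<Result<String, String>> {
    files
        .iter()
        .map(|(path, key)| {
            if path.as_os_str().is_empty() {
                Err(format!("empty path for key {key}"))
            } else {
                // A real implementation would PUT the file contents here.
                Ok(key.clone())
            }
        })
        .collect()
}

fn main() {
    let batch = vec![
        (PathBuf::from("data/a.txt"), "backup/a.txt".to_string()),
        (PathBuf::from("data/b.txt"), "backup/b.txt".to_string()),
    ];
    let results = upload_batch(&batch);
    assert!(results.iter().all(|r| r.is_ok()));
    println!("uploaded {} objects", results.len());
}
```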
Re-exports
pub use err::Error;
Modules
- err
- timeout: The `Timeout` trait defines how the timeout value of a multi-file upload evolves based on past file upload results. A default implementation, `TimeoutState`, is provided.
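The idea behind an evolving timeout can be sketched in plain Rust: estimate throughput from past request results and derive the next request's timeout from the payload size. The struct name, fields, and smoothing rule below are illustrative assumptions, not the crate's actual `Timeout`/`TimeoutState` implementation.

```rust
use std::time::Duration;

// Illustrative sketch of an adaptive timeout (not the crate's `TimeoutState`):
// the timeout for the next request is derived from the observed throughput
// of past requests, with a safety multiplier and a fixed floor.
struct AdaptiveTimeout {
    /// Estimated transfer speed in bytes per second (exponentially smoothed).
    est_bytes_per_sec: f64,
    /// Safety multiplier applied to the estimated transfer time.
    multiplier: f64,
    /// Fixed minimum so small requests still get a sane timeout.
    floor: Duration,
}

impl AdaptiveTimeout {
    /// Timeout to apply to a request of `bytes` bytes.
    fn timeout_for(&self, bytes: u64) -> Duration {
        let secs = bytes as f64 / self.est_bytes_per_sec * self.multiplier;
        self.floor.max(Duration::from_secs_f64(secs))
    }

    /// Feed back the result of a successful request.
    fn update(&mut self, bytes: u64, took: Duration) {
        let observed = bytes as f64 / took.as_secs_f64();
        // Exponential moving average: adapt gradually to the observed speed.
        self.est_bytes_per_sec = 0.7 * self.est_bytes_per_sec + 0.3 * observed;
    }
}

fn main() {
    let mut t = AdaptiveTimeout {
        est_bytes_per_sec: 1_000_000.0,
        multiplier: 2.0,
        floor: Duration::from_secs(1),
    };
    let before = t.timeout_for(10_000_000);
    // Observe a faster transfer; the estimate rises, so the timeout shrinks.
    t.update(10_000_000, Duration::from_secs(5));
    let after = t.timeout_for(10_000_000);
    assert!(after < before);
}
```

The point of the feedback loop is that a multi-file upload does not need one fixed timeout: slow networks get generous timeouts, fast ones fail over quickly.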
Structs
- AlgorithmConfig
- Config
- ListObjects: A stream that can list objects, and (using member functions) delete or copy the listed files.
- RequestReport: Result of a single S3 request.
- S3Algo
- SpecificTimings: Settings specific to the kind of operation performed, for example delete or put in S3.
Functions
- files_recursive: Convenience function (using `walkdir`) to traverse all files in directory `src_dir`. Returns an iterator that can be used as input to `S3Algo::upload_files`, which uploads files with a key equal to the file's path with `src_dir` stripped away and with `key_prefix` prepended.
- retriable_s3_client
- s3_single_request: Issue a single S3 request, with retries and appropriate timeouts using sane defaults. Basically an easier, less general version of `s3_request`.
- testing_sdk_client
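The key derivation described for `files_recursive` (strip `src_dir`, prepend `key_prefix`) can be sketched with `std::path` alone. The helper name `key_for` is hypothetical; the real function additionally walks the directory via `walkdir` and yields upload input.

```rust
use std::path::Path;

// Sketch of the key rule documented for `files_recursive`: the S3 key is the
// file's path with `src_dir` stripped away and `key_prefix` prepended.
// Returns None for files outside `src_dir`.
fn key_for(src_dir: &Path, key_prefix: &str, file: &Path) -> Option<String> {
    let rel = file.strip_prefix(src_dir).ok()?;
    Some(format!("{key_prefix}{}", rel.display()))
}

fn main() {
    let key = key_for(
        Path::new("/home/user/data"),
        "backup/",
        Path::new("/home/user/data/logs/app.log"),
    );
    assert_eq!(key.as_deref(), Some("backup/logs/app.log"));
}
```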