S3 high-performance algorithms
High-performance algorithms for batch operations in Amazon S3.
https://docs.aws.amazon.com/AmazonS3/latest/dev/optimizing-performance-guidelines.html
- Upload multiple files with S3Algo::upload_files.
- List files with S3Algo::s3_list_objects or S3Algo::s3_list_prefix, and then execute deletion or copy on all the files (see the sketch below).
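A minimal sketch of the list-then-delete flow, assuming a tokio runtime and the testing_s3_client described under Functions below. The exact parameters of S3Algo::new, s3_list_prefix, and delete_all are assumptions and may differ between crate versions; the bucket and prefix are placeholders.

```rust
use s3_algo::{testing_s3_client, S3Algo};

#[tokio::main]
async fn main() -> Result<(), s3_algo::Error> {
    // Wrap an S3 client in S3Algo to get the batch operations.
    // testing_s3_client() expects a local MinIO on port 9000 (see Functions below).
    let algo = S3Algo::new(testing_s3_client());

    // List every object under a prefix, then delete all listed objects.
    // s3_list_prefix returns a stream of listed objects; delete_all is one of its
    // member functions for acting on the whole listing (parameters assumed).
    algo.s3_list_prefix("my-bucket".to_string(), "some/prefix".to_string())
        .delete_all()
        .await?;

    Ok(())
}
```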
Re-exports
pub use err::Error;
Modules
The Timeout trait defines how the timeout value of a multi-file upload evolves based on past file upload results. A default implementation, TimeoutState, is provided.
Structs
A stream that can list objects, and (using member functions) delete or copy listed files.
Result of a single S3 request.
These settings are specific to the kind of operation being performed, for example delete or put in S3.
Enums
Functions
Convenience function (using walkdir) to traverse all files in directory src_dir. Returns an iterator that can be used as input to S3Algo::upload_files, which uploads files with a key equal to the file’s path with src_dir stripped away, and with key_prefix prepended.
Issue a single S3 request, with retries and appropriate timeouts using sane defaults.
Basically an easier, less general version of s3_request.
S3 client for testing: assumes a local MinIO on port 9000 and an existing credentials profile called testing.
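As a sketch of how these functions combine, the following assumes a rusoto-based crate version (hence rusoto_s3::PutObjectRequest as the default-request factory), a tokio runtime, and the argument order (bucket, files, progress callback, default request) for S3Algo::upload_files; all of these are assumptions to check against your version's item docs.

```rust
use rusoto_s3::PutObjectRequest;
use s3_algo::{files_recursive, testing_s3_client, S3Algo};
use std::path::PathBuf;

#[tokio::main]
async fn main() -> Result<(), s3_algo::Error> {
    // Requires a local MinIO on port 9000 and a credentials profile named "testing".
    let algo = S3Algo::new(testing_s3_client());

    // Walk ./data recursively; each file gets a key equal to its path with
    // "./data" stripped away and "backup/" prepended.
    let files = files_recursive(PathBuf::from("./data"), PathBuf::from("backup/"));

    // Upload all files. The progress callback and the default-request factory are
    // assumed parameters; consult S3Algo::upload_files for the exact signature.
    algo.upload_files(
        "my-bucket".to_string(),
        files,
        |_report| (),              // called once per completed upload (assumed)
        PutObjectRequest::default, // factory producing the base request (assumed)
    )
    .await?;

    Ok(())
}
```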