Crate s3_algo

S3 high-performance algorithms

High-performance algorithms for batch operations in Amazon S3.

https://docs.aws.amazon.com/AmazonS3/latest/dev/optimizing-performance-guidelines.html

  • Upload multiple files with S3Algo::upload_files.
  • List files with S3Algo::s3_list_objects or S3Algo::s3_list_prefix, and then execute deletion or copy on all the files. A hedged usage sketch of both entry points follows below.
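
A sketch of how these entry points might be combined. Only the item names on this page (S3Algo, upload_files, s3_list_prefix, files_recursive, testing_s3_client) come from the crate; the argument lists, the progress closure, and the delete_all call are assumptions that should be checked against the individual item docs.

```rust
use s3_algo::{files_recursive, S3Algo};
use std::path::PathBuf;

#[tokio::main]
async fn main() -> Result<(), s3_algo::Error> {
    // testing_s3_client (documented below) is assumed to return a client
    // pointed at a local MinIO instance.
    let algo = S3Algo::new(s3_algo::testing_s3_client());

    // Upload every file under ./data with keys prefixed by "backup".
    // The progress closure and per-request defaults are assumed parameters.
    algo.upload_files(
        "my-bucket".to_string(),
        files_recursive(PathBuf::from("data"), PathBuf::from("backup")),
        |_report| async {},   // per-file progress (assumed shape)
        Default::default,     // per-request defaults (assumed shape)
    )
    .await?;

    // List everything under the prefix, then delete the listed objects.
    // delete_all is an assumed name for one of ListObjects' member functions.
    algo.s3_list_prefix("my-bucket".to_string(), "backup/".to_string())
        .delete_all()
        .await?;

    Ok(())
}
```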

Re-exports

pub use err::Error;

Modules

err
timeout

The Timeout trait defines how the timeout value of a multi-file upload evolves based on past file upload results. A default implementation, TimeoutState, is provided.
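
The idea behind such an adaptive timeout can be illustrated without the crate's actual trait: keep a running throughput estimate from completed requests and derive the next request's timeout from it. The type and methods below are illustrative only and do not reproduce the Timeout trait's real interface.

```rust
use std::time::Duration;

/// Illustration only: a running throughput estimate that yields timeouts which
/// grow with object size, widen per retry, and tighten as observed uploads
/// complete faster. This mirrors the role of the Timeout trait, not its API.
struct AdaptiveTimeout {
    est_bytes_per_sec: f64, // exponentially weighted throughput estimate
    min_timeout: Duration,
}

impl AdaptiveTimeout {
    /// Fold a finished request (bytes transferred, time taken) into the estimate.
    fn update(&mut self, bytes: u64, took: Duration) {
        let observed = bytes as f64 / took.as_secs_f64().max(1e-6);
        self.est_bytes_per_sec = 0.7 * self.est_bytes_per_sec + 0.3 * observed;
    }

    /// Timeout for the next request of `bytes`, padded on each retry attempt.
    fn timeout(&self, bytes: u64, attempt: u32) -> Duration {
        let expected_secs = bytes as f64 / self.est_bytes_per_sec;
        let padded = expected_secs * 2.0 * f64::from(attempt + 1);
        self.min_timeout.max(Duration::from_secs_f64(padded))
    }
}
```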

Structs

Config
ListObjects

A stream that can list objects, and (using member functions) delete or copy listed files.

RequestConfig

General parameters that control timeouts and retries.

RequestReport

Result of a single S3 request.

S3Algo

Enums

ObjectSource

Functions

files_recursive

Convenience function (using walkdir) that traverses all files in the directory src_dir. Returns an iterator that can be used as input to S3Algo::upload_files; each file is uploaded under a key equal to its path with src_dir stripped away and key_prefix prepended.
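
The key derivation described above amounts to the following path manipulation (plain std code, not the crate's internals); the helper name is hypothetical.

```rust
use std::path::Path;

/// Illustration of the described key mapping: the key is `key_prefix` joined
/// with the file's path relative to `src_dir`. Hypothetical helper, not crate code.
fn derive_key(src_dir: &Path, key_prefix: &Path, file: &Path) -> Option<String> {
    let relative = file.strip_prefix(src_dir).ok()?;
    Some(key_prefix.join(relative).to_string_lossy().into_owned())
}

// derive_key(Path::new("photos"), Path::new("backup/2020"), Path::new("photos/cats/1.jpg"))
//   == Some("backup/2020/cats/1.jpg".to_string())
```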

s3_single_request

Issue a single S3 request, with retries and appropriate timeouts using sane defaults. In essence, an easier but less general version of s3_request. size should be given if the operation concerns a file of a certain size, since it is needed to set a suitable timeout for the request: copy and put operations, for example, take time linear in the size of the files operated on.
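
To make the size argument concrete: if a put or copy takes time roughly proportional to the payload, a reasonable per-attempt timeout is a fixed base plus a per-byte allowance, widened on each retry. The constants below are arbitrary illustrations, not the actual defaults used by s3_single_request.

```rust
use std::time::Duration;

/// Illustrative timeout for a size-linear operation (put/copy): fixed latency
/// allowance plus time at a pessimistic throughput, doubled per retry attempt.
/// The constants are examples only, not s3_single_request's defaults.
fn request_timeout(size_bytes: u64, attempt: u32) -> Duration {
    let base_secs = 0.5;                          // fixed request overhead
    let worst_case_bytes_per_sec = 5_000_000.0;   // assume at least ~5 MB/s
    let expected = base_secs + size_bytes as f64 / worst_case_bytes_per_sec;
    Duration::from_secs_f64(expected * 2.0_f64.powi(attempt as i32))
}
```

For example, a 50 MB upload on the first attempt (attempt = 0) gets 0.5 s + 10 s = 10.5 s.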

testing_s3_client

S3 client for testing. Assumes a local MinIO instance on port 9000 and an existing credentials profile called testing.

Type Definitions

ListObjectsV2Result