Struct aws_sdk_sagemaker::model::ProcessingS3Input
#[non_exhaustive]
pub struct ProcessingS3Input { /* private fields */ }
Configuration for downloading input data from Amazon S3 into the processing container.
Implementations
impl ProcessingS3Input

pub fn s3_uri(&self) -> Option<&str>
The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
pub fn local_path(&self) -> Option<&str>
The local path in your container where you want Amazon SageMaker to write the input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (the default).
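As a minimal sketch, the local_path accessor above can be used to verify the /opt/ml/processing/ constraint before submitting a job. The input value is assumed to come from an earlier API response or a locally built object.

fn local_path_is_valid(input: &aws_sdk_sagemaker::model::ProcessingS3Input) -> bool {
    // LocalPath must be an absolute path under /opt/ml/processing/ (see above).
    input
        .local_path()
        .map(|path| path.starts_with("/opt/ml/processing/"))
        .unwrap_or(false)
}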
pub fn s3_data_type(&self) -> Option<&ProcessingS3DataType>
Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
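A short illustrative match on this accessor, assuming the ProcessingS3DataType enum exposes S3Prefix and ManifestFile variants as listed by the service API; the wildcard arm covers an unset or unrecognized value.

use aws_sdk_sagemaker::model::{ProcessingS3DataType, ProcessingS3Input};

fn describe_s3_uri_role(input: &ProcessingS3Input) -> &'static str {
    match input.s3_data_type() {
        // S3Uri is treated as a key name prefix.
        Some(ProcessingS3DataType::S3Prefix) => "key name prefix",
        // S3Uri points to a manifest file listing the object keys to use.
        Some(ProcessingS3DataType::ManifestFile) => "manifest file",
        _ => "not set or unrecognized",
    }
}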
pub fn s3_input_mode(&self) -> Option<&ProcessingS3InputMode>
Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
pub fn s3_data_distribution_type(&self) -> Option<&ProcessingS3DataDistributionType>
Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
pub fn s3_compression_type(&self) -> Option<&ProcessingS3CompressionType>
Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
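The Gzip-requires-Pipe constraint described above can be checked client-side. This sketch assumes the Gzip and Pipe variant names match the service API's values.

use aws_sdk_sagemaker::model::{
    ProcessingS3CompressionType, ProcessingS3Input, ProcessingS3InputMode,
};

fn compression_is_consistent(input: &ProcessingS3Input) -> bool {
    match input.s3_compression_type() {
        // Gzip is only valid together with Pipe input mode.
        Some(ProcessingS3CompressionType::Gzip) => {
            matches!(input.s3_input_mode(), Some(ProcessingS3InputMode::Pipe))
        }
        // None or unset compression is valid with either input mode.
        _ => true,
    }
}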
impl ProcessingS3Input

pub fn builder() -> Builder
Creates a new builder-style object to manufacture ProcessingS3Input.
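A hedged construction sketch: the builder's setter names are assumed to mirror the accessors documented above (the usual aws-sdk-rust builder convention), and the bucket, prefix, and local path are placeholder values.

use aws_sdk_sagemaker::model::{
    ProcessingS3DataDistributionType, ProcessingS3DataType, ProcessingS3Input,
    ProcessingS3InputMode,
};

fn example_input() -> ProcessingS3Input {
    ProcessingS3Input::builder()
        // Placeholder bucket and key name prefix.
        .s3_uri("s3://amzn-s3-demo-bucket/processing/input/")
        // Must begin with /opt/ml/processing/ when AppManaged is False.
        .local_path("/opt/ml/processing/input")
        .s3_data_type(ProcessingS3DataType::S3Prefix)
        .s3_input_mode(ProcessingS3InputMode::File)
        .s3_data_distribution_type(ProcessingS3DataDistributionType::FullyReplicated)
        // build() is assumed to return the struct directly in this SDK version.
        .build()
}

The resulting value is typically attached to a ProcessingInput when configuring a CreateProcessingJob request.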
Trait Implementations
impl Clone for ProcessingS3Input

fn clone(&self) -> ProcessingS3Input
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.

impl Debug for ProcessingS3Input
impl PartialEq<ProcessingS3Input> for ProcessingS3Input

fn eq(&self, other: &ProcessingS3Input) -> bool
This method tests for self and other values to be equal, and is used by ==.