pub struct CreateTransformJobFluentBuilder { /* private fields */ }
Fluent builder constructing a request to CreateTransformJob.
Starts a transform job. A transform job uses a trained model to get inferences on a dataset and saves these results to an Amazon S3 location that you specify.
To perform batch transformations, you create a transform job and use the data that you have readily available.
In the request body, you provide the following:
- TransformJobName - Identifies the transform job. The name must be unique within an Amazon Web Services Region in an Amazon Web Services account.
- ModelName - Identifies the model to use. ModelName must be the name of an existing Amazon SageMaker model in the same Amazon Web Services Region and Amazon Web Services account. For information on creating a model, see CreateModel.
- TransformInput - Describes the dataset to be transformed and the Amazon S3 location where it is stored.
- TransformOutput - Identifies the Amazon S3 location where you want Amazon SageMaker to save the results from the transform job.
- TransformResources - Identifies the ML compute instances and AMI image versions for the transform job.
For more information about how batch transformation works, see Batch Transform.
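As a rough end-to-end sketch, the fluent builder can be driven as shown below. Everything here is illustrative: the job, model, and S3 names are placeholders, the shape types are assumed to live under aws_sdk_sagemaker::types (as in recent SDK versions), and shape builders with required members are assumed to return a Result, hence the ?.

use aws_sdk_sagemaker::types::{
    S3DataType, TransformDataSource, TransformInput, TransformInstanceType, TransformOutput,
    TransformResources, TransformS3DataSource,
};
use aws_sdk_sagemaker::Client;

async fn start_batch_transform(client: &Client) -> Result<(), Box<dyn std::error::Error>> {
    // Input: a prefix of CSV objects in S3 (placeholder URI).
    let input = TransformInput::builder()
        .data_source(
            TransformDataSource::builder()
                .s3_data_source(
                    TransformS3DataSource::builder()
                        .s3_data_type(S3DataType::S3Prefix)
                        .s3_uri("s3://my-bucket/batch-input/")
                        .build()?,
                )
                .build()?,
        )
        .content_type("text/csv")
        .build()?;

    // Where results land, and what the job runs on (placeholders).
    let output = TransformOutput::builder()
        .s3_output_path("s3://my-bucket/batch-output/")
        .build()?;
    let resources = TransformResources::builder()
        .instance_type(TransformInstanceType::MlM5Xlarge)
        .instance_count(1)
        .build()?;

    let resp = client
        .create_transform_job()
        .transform_job_name("my-transform-job") // unique per Region and account
        .model_name("my-model") // an existing SageMaker model
        .transform_input(input)
        .transform_output(output)
        .transform_resources(resources)
        .send()
        .await?;

    println!("transform job ARN: {:?}", resp.transform_job_arn());
    Ok(())
}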
Implementations§
impl CreateTransformJobFluentBuilder
pub fn as_input(&self) -> &CreateTransformJobInputBuilder
Access the CreateTransformJob request input as a reference.
pub async fn send(self) -> Result<CreateTransformJobOutput, SdkError<CreateTransformJobError, HttpResponse>>
Sends the request and returns the response.
If an error occurs, an SdkError will be returned with additional details that can be matched against.
By default, any retryable failures will be retried twice. Retry behavior is configurable with the RetryConfig, which can be set when configuring the client.
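A sketch of matching against the returned error, assuming the ProvideErrorMetadata re-export in the crate's error module and that the required request fields were filled in elsewhere:

use aws_sdk_sagemaker::error::ProvideErrorMetadata;

// `request` is a fully configured CreateTransformJobFluentBuilder.
match request.send().await {
    Ok(out) => println!("created: {:?}", out.transform_job_arn()),
    Err(sdk_err) => {
        // Convert the SdkError into the modeled service error to read its details.
        let service_err = sdk_err.into_service_error();
        eprintln!(
            "CreateTransformJob failed: code={:?}, message={:?}",
            service_err.code(),
            service_err.message()
        );
    }
}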
pub fn customize(self) -> CustomizableOperation<CreateTransformJobOutput, CreateTransformJobError, Self>
Consumes this builder, creating a customizable operation that can be modified before being sent.
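A minimal sketch of customizing before sending, assuming the CustomizableOperation in current SDK versions exposes mutate_request for editing the outgoing HTTP request; the header name and value are purely illustrative.

let resp = client
    .create_transform_job()
    .transform_job_name("my-transform-job") // placeholder; other required fields omitted
    .model_name("my-model")
    .customize()
    .mutate_request(|req| {
        // Attach an illustrative header to the outgoing HTTP request.
        req.headers_mut().insert("x-example-trace-id", "debug-run-1");
    })
    .send()
    .await?;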
pub fn transform_job_name(self, input: impl Into<String>) -> Self
The name of the transform job. The name must be unique within an Amazon Web Services Region in an Amazon Web Services account.
pub fn set_transform_job_name(self, input: Option<String>) -> Self
The name of the transform job. The name must be unique within an Amazon Web Services Region in an Amazon Web Services account.
pub fn get_transform_job_name(&self) -> &Option<String>
The name of the transform job. The name must be unique within an Amazon Web Services Region in an Amazon Web Services account.
pub fn model_name(self, input: impl Into<String>) -> Self
The name of the model that you want to use for the transform job. ModelName must be the name of an existing Amazon SageMaker model within an Amazon Web Services Region in an Amazon Web Services account.
pub fn set_model_name(self, input: Option<String>) -> Self
The name of the model that you want to use for the transform job. ModelName must be the name of an existing Amazon SageMaker model within an Amazon Web Services Region in an Amazon Web Services account.
pub fn get_model_name(&self) -> &Option<String>
The name of the model that you want to use for the transform job. ModelName must be the name of an existing Amazon SageMaker model within an Amazon Web Services Region in an Amazon Web Services account.
pub fn max_concurrent_transforms(self, input: i32) -> Self
The maximum number of parallel requests that can be sent to each instance in a transform job. If MaxConcurrentTransforms is set to 0 or left unset, Amazon SageMaker checks the optional execution-parameters to determine the settings for your chosen algorithm. If the execution-parameters endpoint is not enabled, the default value is 1. For more information on execution-parameters, see How Containers Serve Requests. For built-in algorithms, you don't need to set a value for MaxConcurrentTransforms.
pub fn set_max_concurrent_transforms(self, input: Option<i32>) -> Self
The maximum number of parallel requests that can be sent to each instance in a transform job. If MaxConcurrentTransforms is set to 0 or left unset, Amazon SageMaker checks the optional execution-parameters to determine the settings for your chosen algorithm. If the execution-parameters endpoint is not enabled, the default value is 1. For more information on execution-parameters, see How Containers Serve Requests. For built-in algorithms, you don't need to set a value for MaxConcurrentTransforms.
pub fn get_max_concurrent_transforms(&self) -> &Option<i32>
The maximum number of parallel requests that can be sent to each instance in a transform job. If MaxConcurrentTransforms is set to 0 or left unset, Amazon SageMaker checks the optional execution-parameters to determine the settings for your chosen algorithm. If the execution-parameters endpoint is not enabled, the default value is 1. For more information on execution-parameters, see How Containers Serve Requests. For built-in algorithms, you don't need to set a value for MaxConcurrentTransforms.
pub fn model_client_config(self, input: ModelClientConfig) -> Self
Configures the timeout and maximum number of retries for processing a transform job invocation.
pub fn set_model_client_config(self, input: Option<ModelClientConfig>) -> Self
Configures the timeout and maximum number of retries for processing a transform job invocation.
pub fn get_model_client_config(&self) -> &Option<ModelClientConfig>
Configures the timeout and maximum number of retries for processing a transform job invocation.
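A small sketch of the corresponding shape, assuming ModelClientConfig lives in aws_sdk_sagemaker::types; the timeout and retry values are illustrative.

use aws_sdk_sagemaker::types::ModelClientConfig;

// Allow up to 10 minutes per invocation and retry a failed invocation once.
let model_client_config = ModelClientConfig::builder()
    .invocations_timeout_in_seconds(600)
    .invocations_max_retries(1)
    .build();

let request = client
    .create_transform_job()
    .model_client_config(model_client_config);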
pub fn max_payload_in_mb(self, input: i32) -> Self
The maximum allowed size of the payload, in MB. A payload is the data portion of a record (without metadata). The value in MaxPayloadInMB must be greater than, or equal to, the size of a single record. To estimate the size of a record in MB, divide the size of your dataset by the number of records. To ensure that the records fit within the maximum payload size, we recommend using a slightly larger value. The default value is 6 MB.
The value of MaxPayloadInMB cannot be greater than 100 MB. If you specify the MaxConcurrentTransforms parameter, the value of (MaxConcurrentTransforms * MaxPayloadInMB) also cannot exceed 100 MB.
For cases where the payload might be arbitrarily large and is transmitted using HTTP chunked encoding, set the value to 0. This feature works only in supported algorithms. Currently, Amazon SageMaker built-in algorithms do not support HTTP chunked encoding.
pub fn set_max_payload_in_mb(self, input: Option<i32>) -> Self
The maximum allowed size of the payload, in MB. A payload is the data portion of a record (without metadata). The value in MaxPayloadInMB must be greater than, or equal to, the size of a single record. To estimate the size of a record in MB, divide the size of your dataset by the number of records. To ensure that the records fit within the maximum payload size, we recommend using a slightly larger value. The default value is 6 MB.
The value of MaxPayloadInMB cannot be greater than 100 MB. If you specify the MaxConcurrentTransforms parameter, the value of (MaxConcurrentTransforms * MaxPayloadInMB) also cannot exceed 100 MB.
For cases where the payload might be arbitrarily large and is transmitted using HTTP chunked encoding, set the value to 0. This feature works only in supported algorithms. Currently, Amazon SageMaker built-in algorithms do not support HTTP chunked encoding.
pub fn get_max_payload_in_mb(&self) -> &Option<i32>
The maximum allowed size of the payload, in MB. A payload is the data portion of a record (without metadata). The value in MaxPayloadInMB must be greater than, or equal to, the size of a single record. To estimate the size of a record in MB, divide the size of your dataset by the number of records. To ensure that the records fit within the maximum payload size, we recommend using a slightly larger value. The default value is 6 MB.
The value of MaxPayloadInMB cannot be greater than 100 MB. If you specify the MaxConcurrentTransforms parameter, the value of (MaxConcurrentTransforms * MaxPayloadInMB) also cannot exceed 100 MB.
For cases where the payload might be arbitrarily large and is transmitted using HTTP chunked encoding, set the value to 0. This feature works only in supported algorithms. Currently, Amazon SageMaker built-in algorithms do not support HTTP chunked encoding.
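A quick sketch of respecting the 100 MB product constraint, with illustrative numbers:

// 8 concurrent requests * 12 MB payloads = 96 MB, which stays under the 100 MB cap.
let max_concurrent_transforms = 8;
let max_payload_in_mb = 12;
assert!(max_concurrent_transforms * max_payload_in_mb <= 100);

let request = client
    .create_transform_job()
    .max_concurrent_transforms(max_concurrent_transforms)
    .max_payload_in_mb(max_payload_in_mb);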
pub fn batch_strategy(self, input: BatchStrategy) -> Self
Specifies the number of records to include in a mini-batch for an HTTP inference request. A record is a single unit of input data that inference can be made on. For example, a single line in a CSV file is a record.
To enable the batch strategy, you must set the SplitType property to Line, RecordIO, or TFRecord.
To use only one record when making an HTTP invocation request to a container, set BatchStrategy to SingleRecord and SplitType to Line.
To fit as many records in a mini-batch as can fit within the MaxPayloadInMB limit, set BatchStrategy to MultiRecord and SplitType to Line.
pub fn set_batch_strategy(self, input: Option<BatchStrategy>) -> Self
Specifies the number of records to include in a mini-batch for an HTTP inference request. A record is a single unit of input data that inference can be made on. For example, a single line in a CSV file is a record.
To enable the batch strategy, you must set the SplitType property to Line, RecordIO, or TFRecord.
To use only one record when making an HTTP invocation request to a container, set BatchStrategy to SingleRecord and SplitType to Line.
To fit as many records in a mini-batch as can fit within the MaxPayloadInMB limit, set BatchStrategy to MultiRecord and SplitType to Line.
pub fn get_batch_strategy(&self) -> &Option<BatchStrategy>
Specifies the number of records to include in a mini-batch for an HTTP inference request. A record is a single unit of input data that inference can be made on. For example, a single line in a CSV file is a record.
To enable the batch strategy, you must set the SplitType property to Line, RecordIO, or TFRecord.
To use only one record when making an HTTP invocation request to a container, set BatchStrategy to SingleRecord and SplitType to Line.
To fit as many records in a mini-batch as can fit within the MaxPayloadInMB limit, set BatchStrategy to MultiRecord and SplitType to Line.
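A sketch of the MultiRecord plus Line combination described above, assuming BatchStrategy, SplitType, and TransformInput live in aws_sdk_sagemaker::types and that a data_source was built as in the first sketch:

use aws_sdk_sagemaker::types::{BatchStrategy, SplitType, TransformInput};

// Split the input on newlines so multiple records can be packed into each request,
// up to the MaxPayloadInMB limit.
let input: TransformInput = TransformInput::builder()
    .data_source(data_source) // built as in the first sketch
    .content_type("text/csv")
    .split_type(SplitType::Line)
    .build()?;

let request = client
    .create_transform_job()
    .batch_strategy(BatchStrategy::MultiRecord)
    .transform_input(input);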
pub fn environment(self, k: impl Into<String>, v: impl Into<String>) -> Self
Adds a key-value pair to Environment.
To override the contents of this collection use set_environment.
The environment variables to set in the Docker container. Don't include any sensitive data in your environment variables. We support up to 16 key-value entries in the map.
pub fn set_environment(self, input: Option<HashMap<String, String>>) -> Self
The environment variables to set in the Docker container. Don't include any sensitive data in your environment variables. We support up to 16 key-value entries in the map.
pub fn get_environment(&self) -> &Option<HashMap<String, String>>
The environment variables to set in the Docker container. Don't include any sensitive data in your environment variables. We support up to 16 key-value entries in the map.
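For example (variable names and values are illustrative):

// Each environment() call adds one key-value pair to the container environment.
let request = client
    .create_transform_job()
    .environment("LOG_LEVEL", "info")
    .environment("BATCH_CHUNK_SIZE", "512");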
pub fn transform_input(self, input: TransformInput) -> Self
Describes the input source and the way the transform job consumes it.
pub fn set_transform_input(self, input: Option<TransformInput>) -> Self
Describes the input source and the way the transform job consumes it.
pub fn get_transform_input(&self) -> &Option<TransformInput>
Describes the input source and the way the transform job consumes it.
pub fn transform_output(self, input: TransformOutput) -> Self
Describes the results of the transform job.
pub fn set_transform_output(self, input: Option<TransformOutput>) -> Self
Describes the results of the transform job.
pub fn get_transform_output(&self) -> &Option<TransformOutput>
Describes the results of the transform job.
pub fn data_capture_config(self, input: BatchDataCaptureConfig) -> Self
Configuration to control how SageMaker captures inference data.
pub fn set_data_capture_config(self, input: Option<BatchDataCaptureConfig>) -> Self
Configuration to control how SageMaker captures inference data.
pub fn get_data_capture_config(&self) -> &Option<BatchDataCaptureConfig>
Configuration to control how SageMaker captures inference data.
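A sketch, assuming BatchDataCaptureConfig lives in aws_sdk_sagemaker::types and that its builder returns a Result because the destination URI is required; the URI is a placeholder.

use aws_sdk_sagemaker::types::BatchDataCaptureConfig;

// Capture the batch job's inference data to S3 and tag records with inference IDs.
let capture = BatchDataCaptureConfig::builder()
    .destination_s3_uri("s3://my-bucket/data-capture/")
    .generate_inference_id(true)
    .build()?;

let request = client
    .create_transform_job()
    .data_capture_config(capture);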
pub fn transform_resources(self, input: TransformResources) -> Self
Describes the resources, including ML instance types and ML instance count, to use for the transform job.
pub fn set_transform_resources(self, input: Option<TransformResources>) -> Self
Describes the resources, including ML instance types and ML instance count, to use for the transform job.
pub fn get_transform_resources(&self) -> &Option<TransformResources>
Describes the resources, including ML instance types and ML instance count, to use for the transform job.
pub fn data_processing(self, input: DataProcessing) -> Self
The data structure used to specify the data to be used for inference in a batch transform job and to associate the data that is relevant to the prediction results in the output. The input filter provided allows you to exclude input data that is not needed for inference in a batch transform job. The output filter provided allows you to include input data relevant to interpreting the predictions in the output from the job. For more information, see Associate Prediction Results with their Corresponding Input Records.
pub fn set_data_processing(self, input: Option<DataProcessing>) -> Self
The data structure used to specify the data to be used for inference in a batch transform job and to associate the data that is relevant to the prediction results in the output. The input filter provided allows you to exclude input data that is not needed for inference in a batch transform job. The output filter provided allows you to include input data relevant to interpreting the predictions in the output from the job. For more information, see Associate Prediction Results with their Corresponding Input Records.
pub fn get_data_processing(&self) -> &Option<DataProcessing>
The data structure used to specify the data to be used for inference in a batch transform job and to associate the data that is relevant to the prediction results in the output. The input filter provided allows you to exclude input data that is not needed for inference in a batch transform job. The output filter provided allows you to include input data relevant to interpreting the predictions in the output from the job. For more information, see Associate Prediction Results with their Corresponding Input Records.
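A sketch of the filter/join combination, assuming DataProcessing and JoinSource live in aws_sdk_sagemaker::types; the JSONPath expressions are illustrative.

use aws_sdk_sagemaker::types::{DataProcessing, JoinSource};

// Send only the feature columns to the model, join predictions back onto the
// input records, and keep just the record ID plus the model output.
let data_processing = DataProcessing::builder()
    .input_filter("$.features")
    .output_filter("$['id','SageMakerOutput']")
    .join_source(JoinSource::Input)
    .build();

let request = client
    .create_transform_job()
    .data_processing(data_processing);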
pub fn tags(self, input: Tag) -> Self
Appends an item to Tags.
To override the contents of this collection use set_tags.
(Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags in the Amazon Web Services Billing and Cost Management User Guide.
pub fn set_tags(self, input: Option<Vec<Tag>>) -> Self
(Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags in the Amazon Web Services Billing and Cost Management User Guide.
pub fn get_tags(&self) -> &Option<Vec<Tag>>
(Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags in the Amazon Web Services Billing and Cost Management User Guide.
pub fn experiment_config(self, input: ExperimentConfig) -> Self
Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the following APIs: CreateProcessingJob, CreateTrainingJob, CreateTransformJob.
pub fn set_experiment_config(self, input: Option<ExperimentConfig>) -> Self
Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the following APIs: CreateProcessingJob, CreateTrainingJob, CreateTransformJob.
pub fn get_experiment_config(&self) -> &Option<ExperimentConfig>
Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the following APIs: CreateProcessingJob, CreateTrainingJob, CreateTransformJob.
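A sketch, assuming ExperimentConfig lives in aws_sdk_sagemaker::types; the experiment and trial names are placeholders (if your SDK version treats any of these members as required, build() returns a Result and needs a ?).

use aws_sdk_sagemaker::types::ExperimentConfig;

// Record this transform job as a trial component under an existing experiment/trial.
let experiment_config = ExperimentConfig::builder()
    .experiment_name("my-experiment")
    .trial_name("my-trial")
    .trial_component_display_name("batch-transform-run-1")
    .build();

let request = client
    .create_transform_job()
    .experiment_config(experiment_config);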
Trait Implementations§
impl Clone for CreateTransformJobFluentBuilder
fn clone(&self) -> CreateTransformJobFluentBuilder
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
Auto Trait Implementations§
impl Freeze for CreateTransformJobFluentBuilder
impl !RefUnwindSafe for CreateTransformJobFluentBuilder
impl Send for CreateTransformJobFluentBuilder
impl Sync for CreateTransformJobFluentBuilder
impl Unpin for CreateTransformJobFluentBuilder
impl !UnwindSafe for CreateTransformJobFluentBuilder
Blanket Implementations§
impl<T> BorrowMut<T> for T where T: ?Sized
fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T where T: Clone
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
impl<T> Paint for T where T: ?Sized