Struct aws_sdk_sagemaker::model::container_definition::Builder
#[non_exhaustive]
pub struct Builder { /* private fields */ }
A builder for ContainerDefinition.
Implementations
impl Builder
pub fn container_hostname(self, input: impl Into<String>) -> Self

This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.
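For illustration, a minimal sketch of the rule above: once any container in an inference pipeline sets a hostname, every container in that pipeline must set one. The account ID, Region, repository names, and hostnames below are placeholders.

```rust
use aws_sdk_sagemaker::model::ContainerDefinition;

// Two containers intended for the same inference pipeline. Because one sets
// ContainerHostName, the other must set it as well.
let preprocess = ContainerDefinition::builder()
    .container_hostname("preprocess")
    .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/preprocess:latest")
    .build();

let predict = ContainerDefinition::builder()
    .container_hostname("predict")
    .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/predict:latest")
    .build();
```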
pub fn set_container_hostname(self, input: Option<String>) -> Self

This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.
pub fn image(self, input: impl Into<String>) -> Self

The path where inference code is stored. This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by Amazon SageMaker, the inference code must meet Amazon SageMaker requirements. Amazon SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.
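As a sketch of the two supported path formats (the account ID, Region, repository, tag, and digest values below are placeholders):

```rust
use aws_sdk_sagemaker::model::ContainerDefinition;

// registry/repository[:tag] form
let by_tag = ContainerDefinition::builder()
    .image("123456789012.dkr.ecr.us-west-2.amazonaws.com/my-inference:1.0")
    .build();

// registry/repository[@digest] form
let by_digest = ContainerDefinition::builder()
    .image("123456789012.dkr.ecr.us-west-2.amazonaws.com/my-inference@sha256:<digest>")
    .build();
```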
pub fn set_image(self, input: Option<String>) -> Self

The path where inference code is stored. This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by Amazon SageMaker, the inference code must meet Amazon SageMaker requirements. Amazon SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.
pub fn image_config(self, input: ImageConfig) -> Self

Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.
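A minimal sketch of pointing a container at a private registry reachable only from your VPC, assuming the ImageConfig and RepositoryAccessMode types from this crate's model module (the registry host and repository are placeholders):

```rust
use aws_sdk_sagemaker::model::{ContainerDefinition, ImageConfig, RepositoryAccessMode};

let container = ContainerDefinition::builder()
    .image("my-registry.example.com/inference/my-model:latest")
    .image_config(
        // The registry is only reachable from the VPC configured for the endpoint.
        ImageConfig::builder()
            .repository_access_mode(RepositoryAccessMode::Vpc)
            .build(),
    )
    .build();
```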
pub fn set_image_config(self, input: Option<ImageConfig>) -> Self

Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.
pub fn mode(self, input: ContainerMode) -> Self

Whether the container hosts a single model or multiple models.
pub fn set_mode(self, input: Option<ContainerMode>) -> Self

Whether the container hosts a single model or multiple models.
pub fn model_data_url(self, input: impl Into<String>) -> Self

The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix). The S3 path is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating.

If you provide a value for this parameter, Amazon SageMaker uses Amazon Web Services Security Token Service to download model artifacts from the S3 path you provide. Amazon Web Services STS is activated in your IAM user account by default. If you previously deactivated Amazon Web Services STS for a region, you need to reactivate Amazon Web Services STS for that region. For more information, see Activating and Deactivating Amazon Web Services STS in an Amazon Web Services Region in the Amazon Web Services Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, Amazon SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.
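A minimal sketch (the image, bucket, and key are placeholders; the bucket must be in the same Region as the model or endpoint you are creating):

```rust
use aws_sdk_sagemaker::model::ContainerDefinition;

let container = ContainerDefinition::builder()
    .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference:latest")
    // A single gzip-compressed tar archive of the trained model artifacts.
    .model_data_url("s3://my-model-bucket/output/model.tar.gz")
    .build();
```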
pub fn set_model_data_url(self, input: Option<String>) -> Self

The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix). The S3 path is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating.

If you provide a value for this parameter, Amazon SageMaker uses Amazon Web Services Security Token Service to download model artifacts from the S3 path you provide. Amazon Web Services STS is activated in your IAM user account by default. If you previously deactivated Amazon Web Services STS for a region, you need to reactivate Amazon Web Services STS for that region. For more information, see Activating and Deactivating Amazon Web Services STS in an Amazon Web Services Region in the Amazon Web Services Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, Amazon SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.
pub fn environment(self, k: impl Into<String>, v: impl Into<String>) -> Self

Adds a key-value pair to environment.

To override the contents of this collection use set_environment.

The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024. We support up to 16 entries in the map.
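A short sketch of appending entries one pair at a time (the variable names and values are placeholders):

```rust
use aws_sdk_sagemaker::model::ContainerDefinition;

// Each call adds one entry to the Environment map (up to 16 entries,
// each key and value with a length of up to 1024).
let container = ContainerDefinition::builder()
    .environment("MODEL_SERVER_WORKERS", "2")
    .environment("LOG_LEVEL", "info")
    .build();
```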
pub fn set_environment(self, input: Option<HashMap<String, String>>) -> Self

The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024. We support up to 16 entries in the map.
pub fn model_package_name(self, input: impl Into<String>) -> Self

The name or Amazon Resource Name (ARN) of the model package to use to create the model.
pub fn set_model_package_name(self, input: Option<String>) -> Self

The name or Amazon Resource Name (ARN) of the model package to use to create the model.
pub fn inference_specification_name(self, input: impl Into<String>) -> Self

The inference specification name in the model package version.
pub fn set_inference_specification_name(self, input: Option<String>) -> Self

The inference specification name in the model package version.
pub fn multi_model_config(self, input: MultiModelConfig) -> Self

Specifies additional configuration for multi-model endpoints.
pub fn set_multi_model_config(self, input: Option<MultiModelConfig>) -> Self

Specifies additional configuration for multi-model endpoints.
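A sketch of a multi-model container that combines mode with multi_model_config, assuming the ContainerMode, MultiModelConfig, and ModelCacheSetting types from this crate's model module (the image and S3 prefix are placeholders):

```rust
use aws_sdk_sagemaker::model::{
    ContainerDefinition, ContainerMode, ModelCacheSetting, MultiModelConfig,
};

let container = ContainerDefinition::builder()
    .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference:latest")
    .mode(ContainerMode::MultiModel)
    // S3 prefix under which the individual model archives live.
    .model_data_url("s3://my-model-bucket/multi-model/")
    .multi_model_config(
        MultiModelConfig::builder()
            .model_cache_setting(ModelCacheSetting::Enabled)
            .build(),
    )
    .build();
```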
pub fn build(self) -> ContainerDefinition

Consumes the builder and constructs a ContainerDefinition.
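Putting it together, a hedged end-to-end sketch that builds a ContainerDefinition and passes it as the primary container of a CreateModel call; the client setup, model name, role ARN, image, and S3 path are all placeholders:

```rust
use aws_sdk_sagemaker::{model::ContainerDefinition, Client, Error};

async fn create_model(client: &Client) -> Result<(), Error> {
    let container = ContainerDefinition::builder()
        .image("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference:latest")
        .model_data_url("s3://my-model-bucket/output/model.tar.gz")
        .build();

    // Register the model, using the container above as the primary container.
    client
        .create_model()
        .model_name("my-model")
        .execution_role_arn("arn:aws:iam::123456789012:role/SageMakerExecutionRole")
        .primary_container(container)
        .send()
        .await?;

    Ok(())
}
```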
Trait Implementations
impl StructuralPartialEq for Builder
Auto Trait Implementations
impl RefUnwindSafe for Builder
impl Send for Builder
impl Sync for Builder
impl Unpin for Builder
impl UnwindSafe for Builder
Blanket Implementations
impl<T> BorrowMut<T> for T where
    T: ?Sized,
pub fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value. Read more
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> ToOwned for T where
    T: Clone,
type Owned = T
The resulting type after obtaining ownership.
pub fn to_owned(&self) -> T
Creates owned data from borrowed data, usually by cloning. Read more
pub fn clone_into(&self, target: &mut T)

This is a nightly-only experimental API. (toowned_clone_into)

Uses borrowed data to replace owned data, usually by cloning. Read more
sourceimpl<T> WithSubscriber for T
impl<T> WithSubscriber for T
fn with_subscriber<S>(self, subscriber: S) -> WithDispatch<Self> where
    S: Into<Dispatch>,
Attaches the provided Subscriber to this type, returning a WithDispatch wrapper. Read more
fn with_current_subscriber(self) -> WithDispatch<Self>
Attaches the current default Subscriber to this type, returning a WithDispatch wrapper. Read more