pub struct Builder { /* private fields */ }
A builder for StartProjectVersionInput.
Implementations
impl Builder
pub fn project_version_arn(self, input: impl Into<String>) -> Self
The Amazon Resource Name (ARN) of the model version that you want to start.
pub fn set_project_version_arn(self, input: Option<String>) -> Self
The Amazon Resource Name (ARN) of the model version that you want to start.
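As a minimal sketch of the two setter styles (the module path, the builder() entry point, and the ARN below are assumptions for illustration, not taken from this page): the fluent setter accepts anything convertible into a String, while the set_ variant takes an Option<String> and can also clear the field.

use aws_sdk_rekognition::input::StartProjectVersionInput; // path assumed; it varies by SDK version

// Fluent setter: accepts &str, String, or anything else implementing Into<String>.
let builder = StartProjectVersionInput::builder()
    .project_version_arn("arn:aws:rekognition:us-east-1:111122223333:project/example-project/version/example-model/1234567890123"); // hypothetical ARN

// `set_` variant: takes an Option, so it can overwrite or clear a previously set value.
let builder = builder.set_project_version_arn(None);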
pub fn min_inference_units(self, input: i32) -> Self
The minimum number of inference units to use. A single inference unit represents 1 hour of processing.
For information about the number of transactions per second (TPS) that an inference unit can support, see Running a trained Amazon Rekognition Custom Labels model in the Amazon Rekognition Custom Labels Guide.
Use a higher number to increase the TPS throughput of your model. You are charged for the number of inference units that you use.
pub fn set_min_inference_units(self, input: Option<i32>) -> Self
The minimum number of inference units to use. A single inference unit represents 1 hour of processing.
For information about the number of transactions per second (TPS) that an inference unit can support, see Running a trained Amazon Rekognition Custom Labels model in the Amazon Rekognition Custom Labels Guide.
Use a higher number to increase the TPS throughput of your model. You are charged for the number of inference units that you use.
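For example, a short sketch of provisioning more capacity up front (the value is illustrative): each additional inference unit raises the TPS ceiling and the hourly cost.

// Provision two inference units instead of one for higher TPS headroom.
let builder = StartProjectVersionInput::builder().min_inference_units(2);

// Equivalent Option-based form via the `set_` variant.
let builder = builder.set_min_inference_units(Some(2));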
pub fn max_inference_units(self, input: i32) -> Self
The maximum number of inference units to use for auto-scaling the model. If you don't specify a value, Amazon Rekognition Custom Labels doesn't auto-scale the model.
pub fn set_max_inference_units(self, input: Option<i32>) -> Self
The maximum number of inference units to use for auto-scaling the model. If you don't specify a value, Amazon Rekognition Custom Labels doesn't auto-scale the model.
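As a sketch of an auto-scaling configuration (values are illustrative): providing a maximum above the minimum lets Amazon Rekognition Custom Labels scale the model between the two bounds, while omitting it keeps capacity fixed at the minimum.

// Scale between 1 and 4 inference units depending on load.
let builder = StartProjectVersionInput::builder()
    .min_inference_units(1)
    .max_inference_units(4);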
pub fn build(self) -> Result<StartProjectVersionInput, BuildError>
Consumes the builder and constructs a StartProjectVersionInput.
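Putting the pieces together, a hedged end-to-end sketch (module path, ARN, and capacity values are assumptions): build() returns Err(BuildError) when the input cannot be constructed, for example if a required field is missing, so the result should be handled rather than discarded.

// Consume the builder and construct the request input.
let input = StartProjectVersionInput::builder()
    .project_version_arn("arn:aws:rekognition:us-east-1:111122223333:project/example-project/version/example-model/1234567890123") // hypothetical ARN
    .min_inference_units(1)
    .max_inference_units(4)
    .build()
    .expect("StartProjectVersionInput should build from these fields");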