Struct aws_sdk_personalize::operation::create_batch_inference_job::builders::CreateBatchInferenceJobFluentBuilder
pub struct CreateBatchInferenceJobFluentBuilder { /* private fields */ }
Fluent builder constructing a request to CreateBatchInferenceJob.
Generates batch recommendations based on a list of items or users stored in Amazon S3 and exports the recommendations to an Amazon S3 bucket.
To generate batch recommendations, specify the ARN of a solution version and an Amazon S3 URI for the input and output data. For user personalization, popular items, and personalized ranking solutions, the batch inference job generates a list of recommended items for each user ID in the input file. For related items solutions, the job generates a list of recommended items for each item ID in the input file.
For more information, see Creating a batch inference job.
If you use the Similar-Items recipe, Amazon Personalize can add descriptive themes to batch recommendations. To generate themes, set the job's mode to THEME_GENERATION and specify the name of the field that contains item names in the input data. For more information about generating themes, see Batch recommendations with themes from Content Generator.
You can't get batch recommendations with the Trending-Now or Next-Best-Action recipes.
Implementations
impl CreateBatchInferenceJobFluentBuilder

pub fn as_input(&self) -> &CreateBatchInferenceJobInputBuilder
Access the CreateBatchInferenceJob as a reference.
pub async fn send(self) -> Result<CreateBatchInferenceJobOutput, SdkError<CreateBatchInferenceJobError, HttpResponse>>
Sends the request and returns the response.
If an error occurs, an SdkError will be returned with additional details that can be matched against.
By default, any retryable failures will be retried twice. Retry behavior is configurable with the RetryConfig, which can be set when configuring the client.
pub fn customize(self) -> CustomizableOperation<CreateBatchInferenceJobOutput, CreateBatchInferenceJobError, Self>
Consumes this builder, creating a customizable operation that can be modified before being sent.
pub fn job_name(self, input: impl Into<String>) -> Self
The name of the batch inference job to create.
pub fn set_job_name(self, input: Option<String>) -> Self
The name of the batch inference job to create.
pub fn get_job_name(&self) -> &Option<String>
The name of the batch inference job to create.
pub fn solution_version_arn(self, input: impl Into<String>) -> Self
The Amazon Resource Name (ARN) of the solution version that will be used to generate the batch inference recommendations.
pub fn set_solution_version_arn(self, input: Option<String>) -> Self
The Amazon Resource Name (ARN) of the solution version that will be used to generate the batch inference recommendations.
pub fn get_solution_version_arn(&self) -> &Option<String>
The Amazon Resource Name (ARN) of the solution version that will be used to generate the batch inference recommendations.
pub fn filter_arn(self, input: impl Into<String>) -> Self
The ARN of the filter to apply to the batch inference job. For more information on using filters, see Filtering batch recommendations.
pub fn set_filter_arn(self, input: Option<String>) -> Self
The ARN of the filter to apply to the batch inference job. For more information on using filters, see Filtering batch recommendations.
pub fn get_filter_arn(&self) -> &Option<String>
The ARN of the filter to apply to the batch inference job. For more information on using filters, see Filtering batch recommendations.
pub fn num_results(self, input: i32) -> Self
The number of recommendations to retrieve.
pub fn set_num_results(self, input: Option<i32>) -> Self
The number of recommendations to retrieve.
pub fn get_num_results(&self) -> &Option<i32>
The number of recommendations to retrieve.
pub fn job_input(self, input: BatchInferenceJobInput) -> Self
The Amazon S3 path that leads to the input file to base your recommendations on. The input material must be in JSON format.
pub fn set_job_input(self, input: Option<BatchInferenceJobInput>) -> Self
The Amazon S3 path that leads to the input file to base your recommendations on. The input material must be in JSON format.
pub fn get_job_input(&self) -> &Option<BatchInferenceJobInput>
The Amazon S3 path that leads to the input file to base your recommendations on. The input material must be in JSON format.
pub fn job_output(self, input: BatchInferenceJobOutput) -> Self
The path to the Amazon S3 bucket where the job's output will be stored.
pub fn set_job_output(self, input: Option<BatchInferenceJobOutput>) -> Self
The path to the Amazon S3 bucket where the job's output will be stored.
pub fn get_job_output(&self) -> &Option<BatchInferenceJobOutput>
The path to the Amazon S3 bucket where the job's output will be stored.
pub fn role_arn(self, input: impl Into<String>) -> Self
The ARN of the Amazon Identity and Access Management role that has permissions to read and write to your input and output Amazon S3 buckets respectively.
pub fn set_role_arn(self, input: Option<String>) -> Self
The ARN of the Amazon Identity and Access Management role that has permissions to read and write to your input and output Amazon S3 buckets respectively.
pub fn get_role_arn(&self) -> &Option<String>
The ARN of the Amazon Identity and Access Management role that has permissions to read and write to your input and output Amazon S3 buckets respectively.
pub fn batch_inference_job_config(self, input: BatchInferenceJobConfig) -> Self
The configuration details of a batch inference job.
pub fn set_batch_inference_job_config(self, input: Option<BatchInferenceJobConfig>) -> Self
The configuration details of a batch inference job.
pub fn get_batch_inference_job_config(&self) -> &Option<BatchInferenceJobConfig>
The configuration details of a batch inference job.
pub fn tags(self, input: Tag) -> Self
A list of tags to apply to the batch inference job.
pub fn set_tags(self, input: Option<Vec<Tag>>) -> Self
A list of tags to apply to the batch inference job.
pub fn batch_inference_job_mode(self, input: BatchInferenceJobMode) -> Self
The mode of the batch inference job. To generate descriptive themes for groups of similar items, set the job mode to THEME_GENERATION. If you don't want to generate themes, use the default BATCH_INFERENCE.
When you get batch recommendations with themes, you will incur additional costs. For more information, see Amazon Personalize pricing.
pub fn set_batch_inference_job_mode(self, input: Option<BatchInferenceJobMode>) -> Self
The mode of the batch inference job. To generate descriptive themes for groups of similar items, set the job mode to THEME_GENERATION. If you don't want to generate themes, use the default BATCH_INFERENCE.
When you get batch recommendations with themes, you will incur additional costs. For more information, see Amazon Personalize pricing.
pub fn get_batch_inference_job_mode(&self) -> &Option<BatchInferenceJobMode>
The mode of the batch inference job. To generate descriptive themes for groups of similar items, set the job mode to THEME_GENERATION. If you don't want to generate themes, use the default BATCH_INFERENCE.
When you get batch recommendations with themes, you will incur additional costs. For more information, see Amazon Personalize pricing.
pub fn theme_generation_config(self, input: ThemeGenerationConfig) -> Self
For theme generation jobs, specify the name of the column in your Items dataset that contains each item's name.
pub fn set_theme_generation_config(self, input: Option<ThemeGenerationConfig>) -> Self
For theme generation jobs, specify the name of the column in your Items dataset that contains each item's name.
pub fn get_theme_generation_config(&self) -> &Option<ThemeGenerationConfig>
For theme generation jobs, specify the name of the column in your Items dataset that contains each item's name.
Trait Implementations
impl Clone for CreateBatchInferenceJobFluentBuilder

fn clone(&self) -> CreateBatchInferenceJobFluentBuilder
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.