Struct aws_sdk_personalize::operation::create_batch_inference_job::CreateBatchInferenceJobInput

#[non_exhaustive]
pub struct CreateBatchInferenceJobInput {
pub job_name: Option<String>,
pub solution_version_arn: Option<String>,
pub filter_arn: Option<String>,
pub num_results: Option<i32>,
pub job_input: Option<BatchInferenceJobInput>,
pub job_output: Option<BatchInferenceJobOutput>,
pub role_arn: Option<String>,
pub batch_inference_job_config: Option<BatchInferenceJobConfig>,
pub tags: Option<Vec<Tag>>,
pub batch_inference_job_mode: Option<BatchInferenceJobMode>,
pub theme_generation_config: Option<ThemeGenerationConfig>,
}

Fields (Non-exhaustive)§

This struct is marked as non-exhaustive. Non-exhaustive structs may gain additional fields in the future; therefore, they cannot be constructed in external crates using the traditional Struct { .. } syntax, cannot be matched against without a wildcard .., and struct update syntax will not work.

job_name: Option<String>
The name of the batch inference job to create.
solution_version_arn: Option<String>
The Amazon Resource Name (ARN) of the solution version that will be used to generate the batch inference recommendations.

filter_arn: Option<String>
The ARN of the filter to apply to the batch inference job. For more information on using filters, see Filtering batch recommendations.

num_results: Option<i32>
The number of recommendations to retrieve.

job_input: Option<BatchInferenceJobInput>
The Amazon S3 path that leads to the input file to base your recommendations on. The input material must be in JSON format.

job_output: Option<BatchInferenceJobOutput>
The path to the Amazon S3 bucket where the job's output will be stored.

role_arn: Option<String>
The ARN of the AWS Identity and Access Management (IAM) role that has permissions to read from your input Amazon S3 bucket and write to your output Amazon S3 bucket.

batch_inference_job_config: Option<BatchInferenceJobConfig>
The configuration details of a batch inference job.

tags: Option<Vec<Tag>>
A list of tags to apply to the batch inference job.

batch_inference_job_mode: Option<BatchInferenceJobMode>
The mode of the batch inference job. To generate descriptive themes for groups of similar items, set the job mode to THEME_GENERATION. If you don't want to generate themes, use the default BATCH_INFERENCE.

When you get batch recommendations with themes, you will incur additional costs. For more information, see Amazon Personalize pricing.

theme_generation_config: Option<ThemeGenerationConfig>
For theme generation jobs, specify the name of the column in your Items dataset that contains each item's name.
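The non-exhaustive marker above shapes how downstream code can use this struct. The following is a minimal, self-contained sketch of that pattern with a hypothetical stand-in type (the real struct lives in aws_sdk_personalize and has many more fields):

```rust
// Stand-in mirroring the non-exhaustive layout of CreateBatchInferenceJobInput.
// (Hypothetical type for illustration; not part of the SDK.)
#[non_exhaustive]
#[derive(Debug, Default, Clone)]
pub struct JobInputSketch {
    pub job_name: Option<String>,
    pub num_results: Option<i32>,
}

// Code matching a non-exhaustive struct must include a `..` wildcard,
// because new fields may be added without a breaking change.
pub fn job_name_of(input: &JobInputSketch) -> Option<&str> {
    let JobInputSketch { job_name, .. } = input;
    job_name.as_deref()
}

fn main() {
    // Inside the defining crate, literal construction and struct update
    // syntax still work; external crates must go through the builder instead.
    let input = JobInputSketch {
        job_name: Some("my-batch-job".into()),
        ..Default::default()
    };
    assert_eq!(job_name_of(&input), Some("my-batch-job"));
}
```

This is why external users construct the input via `CreateBatchInferenceJobInput::builder()` rather than a struct literal.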
Implementations§

impl CreateBatchInferenceJobInput

pub fn job_name(&self) -> Option<&str>
The name of the batch inference job to create.

pub fn solution_version_arn(&self) -> Option<&str>
The Amazon Resource Name (ARN) of the solution version that will be used to generate the batch inference recommendations.
pub fn filter_arn(&self) -> Option<&str>
The ARN of the filter to apply to the batch inference job. For more information on using filters, see Filtering batch recommendations.
pub fn num_results(&self) -> Option<i32>
The number of recommendations to retrieve.
pub fn job_input(&self) -> Option<&BatchInferenceJobInput>
The Amazon S3 path that leads to the input file to base your recommendations on. The input material must be in JSON format.
pub fn job_output(&self) -> Option<&BatchInferenceJobOutput>
The path to the Amazon S3 bucket where the job's output will be stored.
pub fn role_arn(&self) -> Option<&str>
The ARN of the AWS Identity and Access Management (IAM) role that has permissions to read from your input Amazon S3 bucket and write to your output Amazon S3 bucket.
pub fn batch_inference_job_config(&self) -> Option<&BatchInferenceJobConfig>
The configuration details of a batch inference job.
pub fn tags(&self) -> &[Tag]
A list of tags to apply to the batch inference job.
If no value was sent for this field, a default will be set. If you want to determine if no value was sent, use .tags.is_none().
pub fn batch_inference_job_mode(&self) -> Option<&BatchInferenceJobMode>
The mode of the batch inference job. To generate descriptive themes for groups of similar items, set the job mode to THEME_GENERATION. If you don't want to generate themes, use the default BATCH_INFERENCE.
When you get batch recommendations with themes, you will incur additional costs. For more information, see Amazon Personalize pricing.
pub fn theme_generation_config(&self) -> Option<&ThemeGenerationConfig>
For theme generation jobs, specify the name of the column in your Items dataset that contains each item's name.
impl CreateBatchInferenceJobInput

pub fn builder() -> CreateBatchInferenceJobInputBuilder
Creates a new builder-style object to manufacture CreateBatchInferenceJobInput.
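The generated builder follows the consuming, chainable setter style common to aws-sdk-rust inputs. A minimal self-contained sketch of that shape, using hypothetical stand-in types and example ARNs (the real builder has one setter per field above and a fallible build()):

```rust
// Stand-in for the generated input/builder pair; not the real SDK types.
#[derive(Debug, Default, PartialEq)]
pub struct InputSketch {
    pub job_name: Option<String>,
    pub solution_version_arn: Option<String>,
    pub role_arn: Option<String>,
}

#[derive(Default)]
pub struct InputSketchBuilder {
    job_name: Option<String>,
    solution_version_arn: Option<String>,
    role_arn: Option<String>,
}

impl InputSketchBuilder {
    // Each setter consumes the builder and returns it, enabling chaining.
    pub fn job_name(mut self, v: impl Into<String>) -> Self {
        self.job_name = Some(v.into());
        self
    }
    pub fn solution_version_arn(mut self, v: impl Into<String>) -> Self {
        self.solution_version_arn = Some(v.into());
        self
    }
    pub fn role_arn(mut self, v: impl Into<String>) -> Self {
        self.role_arn = Some(v.into());
        self
    }
    pub fn build(self) -> InputSketch {
        InputSketch {
            job_name: self.job_name,
            solution_version_arn: self.solution_version_arn,
            role_arn: self.role_arn,
        }
    }
}

impl InputSketch {
    pub fn builder() -> InputSketchBuilder {
        InputSketchBuilder::default()
    }
}

fn main() {
    // Hypothetical ARNs for illustration only.
    let input = InputSketch::builder()
        .job_name("nightly-batch")
        .solution_version_arn("arn:aws:personalize:us-west-2:123456789012:solution/my-solution/1")
        .role_arn("arn:aws:iam::123456789012:role/PersonalizeS3Role")
        .build();
    assert_eq!(input.job_name.as_deref(), Some("nightly-batch"));
}
```

Because every field is Option-typed and the struct is non-exhaustive, the builder is the intended construction path for external code.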
Trait Implementations§
impl Clone for CreateBatchInferenceJobInput

fn clone(&self) -> CreateBatchInferenceJobInput

fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.

impl Debug for CreateBatchInferenceJobInput
impl PartialEq for CreateBatchInferenceJobInput

fn eq(&self, other: &CreateBatchInferenceJobInput) -> bool
Tests for self and other values to be equal, and is used by ==.

impl StructuralPartialEq for CreateBatchInferenceJobInput
Auto Trait Implementations§
impl Freeze for CreateBatchInferenceJobInput
impl RefUnwindSafe for CreateBatchInferenceJobInput
impl Send for CreateBatchInferenceJobInput
impl Sync for CreateBatchInferenceJobInput
impl Unpin for CreateBatchInferenceJobInput
impl UnwindSafe for CreateBatchInferenceJobInput
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T
impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>

fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T

fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.