Struct aws_sdk_personalize::types::BatchInferenceJob
#[non_exhaustive]
pub struct BatchInferenceJob {
pub job_name: Option<String>,
pub batch_inference_job_arn: Option<String>,
pub filter_arn: Option<String>,
pub failure_reason: Option<String>,
pub solution_version_arn: Option<String>,
pub num_results: Option<i32>,
pub job_input: Option<BatchInferenceJobInput>,
pub job_output: Option<BatchInferenceJobOutput>,
pub batch_inference_job_config: Option<BatchInferenceJobConfig>,
pub role_arn: Option<String>,
pub batch_inference_job_mode: Option<BatchInferenceJobMode>,
pub theme_generation_config: Option<ThemeGenerationConfig>,
pub status: Option<String>,
pub creation_date_time: Option<DateTime>,
pub last_updated_date_time: Option<DateTime>,
}
Contains information on a batch inference job.
Fields (Non-exhaustive)
This struct is marked as non-exhaustive. Non-exhaustive structs cannot be constructed in external crates using the traditional Struct { .. } syntax; cannot be matched against without a wildcard ..; and struct update syntax will not work.
job_name: Option<String>
The name of the batch inference job.
batch_inference_job_arn: Option<String>
The Amazon Resource Name (ARN) of the batch inference job.
filter_arn: Option<String>
The ARN of the filter used on the batch inference job.
failure_reason: Option<String>
If the batch inference job failed, the reason for the failure.
solution_version_arn: Option<String>
The Amazon Resource Name (ARN) of the solution version from which the batch inference job was created.
num_results: Option<i32>
The number of recommendations generated by the batch inference job. This number includes the error messages generated for failed input records.
job_input: Option<BatchInferenceJobInput>
The Amazon S3 path that leads to the input data used to generate the batch inference job.
job_output: Option<BatchInferenceJobOutput>
The Amazon S3 bucket that contains the output data generated by the batch inference job.
batch_inference_job_config: Option<BatchInferenceJobConfig>
A string-to-string map of the configuration details of a batch inference job.
role_arn: Option<String>
The ARN of the Amazon Identity and Access Management (IAM) role that requested the batch inference job.
batch_inference_job_mode: Option<BatchInferenceJobMode>
The job's mode.
theme_generation_config: Option<ThemeGenerationConfig>
The job's theme generation settings.
status: Option<String>
The status of the batch inference job. The status is one of the following values:
- PENDING
- IN PROGRESS
- ACTIVE
- CREATE FAILED
creation_date_time: Option<DateTime>
The time at which the batch inference job was created.
last_updated_date_time: Option<DateTime>
The time at which the batch inference job was last updated.
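Because the struct is non-exhaustive, any destructuring pattern written against it in an external crate must end with a wildcard `..`. A minimal sketch of that pattern, using a simplified stand-in type with the same field shapes (not the real SDK struct):

```rust
// Simplified stand-in for BatchInferenceJob; the real type lives in
// aws_sdk_personalize::types and carries 15 fields, all optional.
#[non_exhaustive]
#[derive(Default, Debug)]
pub struct Job {
    pub job_name: Option<String>,
    pub status: Option<String>,
}

fn describe(job: &Job) -> String {
    // The trailing `..` is required when matching a non-exhaustive struct
    // defined in another crate, so new fields can be added without breakage.
    let Job { job_name, status, .. } = job;
    format!(
        "{} [{}]",
        job_name.as_deref().unwrap_or("<unnamed>"),
        status.as_deref().unwrap_or("UNKNOWN"),
    )
}

fn main() {
    let mut job = Job::default();
    job.job_name = Some("nightly-recs".to_string());
    job.status = Some("ACTIVE".to_string());
    println!("{}", describe(&job)); // prints: nightly-recs [ACTIVE]
}
```

The same constraint is why the type exposes a builder (see below) rather than supporting struct literal construction.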
Implementations
impl BatchInferenceJob
pub fn batch_inference_job_arn(&self) -> Option<&str>
The Amazon Resource Name (ARN) of the batch inference job.
pub fn filter_arn(&self) -> Option<&str>
The ARN of the filter used on the batch inference job.
pub fn failure_reason(&self) -> Option<&str>
If the batch inference job failed, the reason for the failure.
pub fn solution_version_arn(&self) -> Option<&str>
The Amazon Resource Name (ARN) of the solution version from which the batch inference job was created.
pub fn num_results(&self) -> Option<i32>
The number of recommendations generated by the batch inference job. This number includes the error messages generated for failed input records.
pub fn job_input(&self) -> Option<&BatchInferenceJobInput>
The Amazon S3 path that leads to the input data used to generate the batch inference job.
pub fn job_output(&self) -> Option<&BatchInferenceJobOutput>
The Amazon S3 bucket that contains the output data generated by the batch inference job.
pub fn batch_inference_job_config(&self) -> Option<&BatchInferenceJobConfig>
A string-to-string map of the configuration details of a batch inference job.
pub fn role_arn(&self) -> Option<&str>
The ARN of the Amazon Identity and Access Management (IAM) role that requested the batch inference job.
pub fn batch_inference_job_mode(&self) -> Option<&BatchInferenceJobMode>
The job's mode.
pub fn theme_generation_config(&self) -> Option<&ThemeGenerationConfig>
The job's theme generation settings.
pub fn status(&self) -> Option<&str>
The status of the batch inference job. The status is one of the following values:
- PENDING
- IN PROGRESS
- ACTIVE
- CREATE FAILED
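The accessor returns Option<&str> rather than an enum, so callers typically match on the raw status string. A hedged sketch of that check, using the status values listed above (the helper function is illustrative, not part of the SDK):

```rust
/// Illustrative helper, not part of the SDK: true once the job has
/// reached a terminal state (finished successfully or failed).
fn is_terminal(status: Option<&str>) -> bool {
    // Status strings as documented for BatchInferenceJob.
    matches!(status, Some("ACTIVE") | Some("CREATE FAILED"))
}

fn main() {
    assert!(!is_terminal(Some("PENDING")));
    assert!(!is_terminal(Some("IN PROGRESS")));
    assert!(is_terminal(Some("ACTIVE")));
    assert!(is_terminal(Some("CREATE FAILED")));
    assert!(!is_terminal(None)); // field not set in the response
    println!("all status checks passed");
}
```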
pub fn creation_date_time(&self) -> Option<&DateTime>
The time at which the batch inference job was created.
pub fn last_updated_date_time(&self) -> Option<&DateTime>
The time at which the batch inference job was last updated.
impl BatchInferenceJob
pub fn builder() -> BatchInferenceJobBuilder
Creates a new builder-style object to manufacture BatchInferenceJob.
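Since the struct cannot be built with literal syntax, the builder is the way to construct one by hand (for tests, say). A self-contained sketch of the builder-style idiom, using a simplified stand-in rather than the real BatchInferenceJobBuilder; the names `Job` and `JobBuilder` are hypothetical:

```rust
// Simplified stand-in illustrating the builder-style idiom used by
// BatchInferenceJobBuilder; not the real SDK builder.
#[derive(Default, Debug, Clone, PartialEq)]
struct Job {
    job_name: Option<String>,
    num_results: Option<i32>,
}

#[derive(Default)]
struct JobBuilder {
    job_name: Option<String>,
    num_results: Option<i32>,
}

impl JobBuilder {
    // Taking `impl Into<String>` lets callers pass either &str or String.
    fn job_name(mut self, v: impl Into<String>) -> Self {
        self.job_name = Some(v.into());
        self
    }
    fn num_results(mut self, v: i32) -> Self {
        self.num_results = Some(v);
        self
    }
    // Every field is optional, so `build` is infallible in this sketch.
    fn build(self) -> Job {
        Job { job_name: self.job_name, num_results: self.num_results }
    }
}

impl Job {
    fn builder() -> JobBuilder {
        JobBuilder::default()
    }
}

fn main() {
    let job = Job::builder().job_name("nightly-recs").num_results(25).build();
    assert_eq!(job.job_name.as_deref(), Some("nightly-recs"));
    assert_eq!(job.num_results, Some(25));
    println!("{:?}", job);
}
```

Unset fields simply stay None, matching how the SDK leaves absent response fields unset.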
Trait Implementations
impl Clone for BatchInferenceJob
fn clone(&self) -> BatchInferenceJob
Returns a copy of the value.
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
impl Debug for BatchInferenceJob
impl PartialEq for BatchInferenceJob
fn eq(&self, other: &BatchInferenceJob) -> bool
Tests for self and other values to be equal, and is used by ==.
impl StructuralPartialEq for BatchInferenceJob
Auto Trait Implementations
impl Freeze for BatchInferenceJob
impl RefUnwindSafe for BatchInferenceJob
impl Send for BatchInferenceJob
impl Sync for BatchInferenceJob
impl Unpin for BatchInferenceJob
impl UnwindSafe for BatchInferenceJob
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
Instruments this type with the provided Span, returning an Instrumented wrapper.
fn in_current_span(self) -> Instrumented<Self>
Instruments this type with the current Span, returning an Instrumented wrapper.
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.