pub struct BatchInferenceJob {
    pub batch_inference_job_arn: Option<String>,
    pub batch_inference_job_config: Option<BatchInferenceJobConfig>,
    pub creation_date_time: Option<f64>,
    pub failure_reason: Option<String>,
    pub filter_arn: Option<String>,
    pub job_input: Option<BatchInferenceJobInput>,
    pub job_name: Option<String>,
    pub job_output: Option<BatchInferenceJobOutput>,
    pub last_updated_date_time: Option<f64>,
    pub num_results: Option<i64>,
    pub role_arn: Option<String>,
    pub solution_version_arn: Option<String>,
    pub status: Option<String>,
}

Contains information on a batch inference job.
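Because every field is an `Option`, a value is typically built from `Default` and only the fields of interest are set. The sketch below uses a trimmed, self-contained stand-in for the struct (three of the thirteen fields, matching names and types) so it compiles without the real crate; it is illustrative, not the generated type itself.

```rust
// Trimmed stand-in for BatchInferenceJob (3 of 13 fields) so this
// example is self-contained; field names and types match the docs.
#[derive(Clone, Debug, Default, PartialEq)]
pub struct BatchInferenceJob {
    pub job_name: Option<String>,
    pub num_results: Option<i64>,
    pub status: Option<String>,
}

fn main() {
    // Default gives a struct with every field set to None; fill in
    // only what you need with struct-update syntax.
    let job = BatchInferenceJob {
        job_name: Some("nightly-recs".to_string()),
        status: Some("PENDING".to_string()),
        ..Default::default()
    };

    // as_deref reads an Option<String> as Option<&str> without moving it.
    assert_eq!(job.job_name.as_deref(), Some("nightly-recs"));
    assert_eq!(job.num_results, None);
    println!("status = {:?}", job.status);
}
```

The same struct-update pattern applies to the real type, since all thirteen fields are optional.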

Fields

batch_inference_job_arn: Option<String>

The Amazon Resource Name (ARN) of the batch inference job.

batch_inference_job_config: Option<BatchInferenceJobConfig>

A string to string map of the configuration details of a batch inference job.

creation_date_time: Option<f64>

The time at which the batch inference job was created.

failure_reason: Option<String>

If the batch inference job failed, the reason for the failure.

filter_arn: Option<String>

The ARN of the filter used on the batch inference job.

job_input: Option<BatchInferenceJobInput>

The Amazon S3 path that leads to the input data used to generate the batch inference job.

job_name: Option<String>

The name of the batch inference job.

job_output: Option<BatchInferenceJobOutput>

The Amazon S3 bucket that contains the output data generated by the batch inference job.

last_updated_date_time: Option<f64>

The time at which the batch inference job was last updated.

num_results: Option<i64>

The number of recommendations generated by the batch inference job. This number includes the error messages generated for failed input records.

role_arn: Option<String>

The ARN of the AWS Identity and Access Management (IAM) role that requested the batch inference job.

solution_version_arn: Option<String>

The Amazon Resource Name (ARN) of the solution version from which the batch inference job was created.

status: Option<String>

The status of the batch inference job. The status is one of the following values:

  • PENDING

  • IN PROGRESS

  • ACTIVE

  • CREATE FAILED
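Since status is a plain Option<String> rather than an enum, callers generally match on the string values listed above. A minimal sketch; the is_terminal helper is a hypothetical convenience for polling loops, not part of the generated API:

```rust
// Illustrative helper: decide whether a batch inference job has
// reached a final state, based on the documented status strings.
// `is_terminal` is hypothetical, not part of the generated API.
fn is_terminal(status: Option<&str>) -> bool {
    matches!(status, Some("ACTIVE") | Some("CREATE FAILED"))
}

fn main() {
    assert!(!is_terminal(Some("PENDING")));
    assert!(!is_terminal(Some("IN PROGRESS")));
    assert!(is_terminal(Some("ACTIVE")));
    assert!(is_terminal(Some("CREATE FAILED")));
    assert!(!is_terminal(None)); // status not yet populated
}
```

In a polling loop you would call `describe_batch_inference_job` until `is_terminal(job.status.as_deref())` returns true, then inspect `failure_reason` if the status is CREATE FAILED.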

Trait Implementations

  • Clone::clone: Returns a copy of the value.

  • Clone::clone_from: Performs copy-assignment from source.

  • Debug::fmt: Formats the value using the given formatter.

  • Default::default: Returns the "default value" for a type.

  • Deserialize::deserialize: Deserialize this value from the given Serde deserializer.

  • PartialEq::eq: Tests for self and other values to be equal, and is used by ==.

  • PartialEq::ne: Tests for !=.

Auto Trait Implementations

Blanket Implementations

  • Any::type_id: Gets the TypeId of self.

  • Borrow::borrow: Immutably borrows from an owned value.

  • BorrowMut::borrow_mut: Mutably borrows from an owned value.

  • From::from: Returns the argument unchanged.

  • Instrument::instrument: Instruments this type with the provided Span, returning an Instrumented wrapper.

  • Instrument::in_current_span: Instruments this type with the current Span, returning an Instrumented wrapper.

  • Into::into: Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.

  • Same::Output: Should always be Self.

  • ToOwned::Owned: The resulting type after obtaining ownership.

  • ToOwned::to_owned: Creates owned data from borrowed data, usually by cloning.

  • ToOwned::clone_into: 🔬 This is a nightly-only experimental API (toowned_clone_into). Uses borrowed data to replace owned data, usually by cloning.

  • TryFrom::Error: The type returned in the event of a conversion error.

  • TryFrom::try_from: Performs the conversion.

  • TryInto::Error: The type returned in the event of a conversion error.

  • TryInto::try_into: Performs the conversion.

  • WithSubscriber::with_subscriber: Attaches the provided Subscriber to this type, returning a WithDispatch wrapper.

  • WithSubscriber::with_current_subscriber: Attaches the current default Subscriber to this type, returning a WithDispatch wrapper.