#[non_exhaustive]
pub struct CreateBatchInferenceJobInput {
    pub job_name: Option<String>,
    pub solution_version_arn: Option<String>,
    pub filter_arn: Option<String>,
    pub num_results: Option<i32>,
    pub job_input: Option<BatchInferenceJobInput>,
    pub job_output: Option<BatchInferenceJobOutput>,
    pub role_arn: Option<String>,
    pub batch_inference_job_config: Option<BatchInferenceJobConfig>,
}

Fields (Non-exhaustive)
This struct is marked as non-exhaustive. Non-exhaustive structs could have additional fields added in future versions. This means that this struct cannot be constructed in external crates using the traditional Struct { .. } syntax; cannot be matched against without a wildcard ..; and struct update syntax will not work.

job_name: Option<String>
The name of the batch inference job to create.

solution_version_arn: Option<String>
The Amazon Resource Name (ARN) of the solution version that will be used to generate the batch inference recommendations.

filter_arn: Option<String>
The ARN of the filter to apply to the batch inference job. For more information on using filters, see Filtering Batch Recommendations.

num_results: Option<i32>
The number of recommendations to retrieve.

job_input: Option<BatchInferenceJobInput>
The Amazon S3 path that leads to the input file to base your recommendations on. The input material must be in JSON format.

job_output: Option<BatchInferenceJobOutput>
The path to the Amazon S3 bucket where the job's output will be stored.

role_arn: Option<String>
The ARN of the Amazon Identity and Access Management role that has permissions to read and write to your input and output Amazon S3 buckets, respectively.

batch_inference_job_config: Option<BatchInferenceJobConfig>
The configuration details of a batch inference job.
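Because the struct is non-exhaustive, it is assembled through its builder rather than with a struct literal. The following is a minimal sketch, assuming the nested model types live under aws_sdk_personalize::model; every ARN, name, and S3 path is a placeholder, and builder signatures can differ between SDK versions.

use aws_sdk_personalize::input::CreateBatchInferenceJobInput;
use aws_sdk_personalize::model::{BatchInferenceJobInput, BatchInferenceJobOutput, S3DataConfig};

// Minimal sketch: placeholder ARNs, names, and bucket paths throughout.
fn build_input() -> Result<CreateBatchInferenceJobInput, Box<dyn std::error::Error>> {
    let input = CreateBatchInferenceJobInput::builder()
        .job_name("my-batch-job")
        .solution_version_arn("arn:aws:personalize:us-east-1:111122223333:solution/my-solution/<version>")
        .num_results(25)
        .job_input(
            BatchInferenceJobInput::builder()
                .s3_data_source(
                    S3DataConfig::builder()
                        .path("s3://my-bucket/batch-input/users.json")
                        .build(),
                )
                .build(),
        )
        .job_output(
            BatchInferenceJobOutput::builder()
                .s3_data_destination(
                    S3DataConfig::builder()
                        .path("s3://my-bucket/batch-output/")
                        .build(),
                )
                .build(),
        )
        .role_arn("arn:aws:iam::111122223333:role/PersonalizeS3Role")
        .build()?;
    Ok(input)
}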
Implementations
pub async fn make_operation(
    &self,
    _config: &Config
) -> Result<Operation<CreateBatchInferenceJob, AwsErrorRetryPolicy>, BuildError>
Consumes the builder and constructs an Operation<CreateBatchInferenceJob>
pub fn builder() -> Builder
Creates a new builder-style object to manufacture CreateBatchInferenceJobInput.
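In application code this input is usually produced indirectly through the fluent client, whose create_batch_inference_job method exposes the same setters and calls make_operation for you. A hedged sketch, assuming the aws-config crate for region and credential resolution, with all job details as placeholders:

use aws_sdk_personalize::Client;

// Sketch only: assumes aws-config for setup; job details go through the same
// setters shown in the builder sketch above.
async fn start_job() -> Result<(), Box<dyn std::error::Error>> {
    let shared_config = aws_config::load_from_env().await;
    let client = Client::new(&shared_config);
    let resp = client
        .create_batch_inference_job()
        .job_name("my-batch-job") // placeholder
        .solution_version_arn("arn:aws:personalize:us-east-1:111122223333:solution/my-solution/<version>") // placeholder
        .role_arn("arn:aws:iam::111122223333:role/PersonalizeS3Role") // placeholder
        // .job_input(...) and .job_output(...) take the nested builders shown above
        .send()
        .await?;
    println!("{:?}", resp);
    Ok(())
}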
Trait Implementations
This method tests for self and other values to be equal, and is used by ==.
This method tests for !=.
Auto Trait Implementations
impl Send for CreateBatchInferenceJobInput
impl Sync for CreateBatchInferenceJobInput
impl Unpin for CreateBatchInferenceJobInput
impl UnwindSafe for CreateBatchInferenceJobInput
Blanket Implementations
Mutably borrows from an owned value.
Attaches the provided Subscriber to this type, returning a WithDispatch wrapper.
Attaches the current default Subscriber to this type, returning a WithDispatch wrapper.