pub struct CreateEnvironmentFluentBuilder { /* private fields */ }

Fluent builder constructing a request to CreateEnvironment.

Creates an Amazon Managed Workflows for Apache Airflow (MWAA) environment.

Implementations

impl CreateEnvironmentFluentBuilder

pub fn as_input(&self) -> &CreateEnvironmentInputBuilder

Access the CreateEnvironment input as a reference.

pub async fn send(self) -> Result<CreateEnvironmentOutput, SdkError<CreateEnvironmentError, HttpResponse>>

Sends the request and returns the response.

If an error occurs, an SdkError will be returned with additional details that can be matched against.

By default, any retryable failures will be retried twice. Retry behavior is configurable with the RetryConfig, which can be set when configuring the client.
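
A minimal sketch of the request flow: the function name, environment name, ARNs, and paths below are placeholder values, and the required network_configuration field is left out for brevity (see the network_configuration example further down).

```rust
use aws_sdk_mwaa::Client;

async fn create_environment(client: &Client) -> Result<(), Box<dyn std::error::Error>> {
    // Build and send a CreateEnvironment request. All values are placeholders.
    // The required network_configuration field is omitted here; see the
    // network_configuration example below for how to construct it.
    let output = client
        .create_environment()
        .name("MyMWAAEnvironment")
        .execution_role_arn("arn:aws:iam::123456789012:role/my-execution-role")
        .source_bucket_arn("arn:aws:s3:::my-airflow-bucket-unique-name")
        .dag_s3_path("dags")
        .send()
        .await?;

    // The response carries the ARN of the environment being created.
    println!("created environment: {:?}", output.arn());
    Ok(())
}
```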

pub async fn customize(self) -> Result<CustomizableOperation<CreateEnvironmentOutput, CreateEnvironmentError>, SdkError<CreateEnvironmentError>>

Consumes this builder, creating a customizable operation that can be modified before being sent.
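
A hedged sketch of customizing the operation before it is dispatched. The mutate_request hook and the header name are assumptions used only for illustration; consult the CustomizableOperation documentation for the hooks available in your SDK version.

```rust
use aws_sdk_mwaa::Client;

async fn create_with_custom_header(client: &Client) -> Result<(), Box<dyn std::error::Error>> {
    let output = client
        .create_environment()
        .name("MyMWAAEnvironment")
        // ...remaining required fields elided...
        .customize()
        .await?
        // Assumed hook: mutate the underlying HTTP request before sending,
        // e.g. to attach an illustrative tracing header (uses the `http` crate).
        .mutate_request(|req| {
            req.headers_mut().insert(
                "x-example-trace-id",
                http::HeaderValue::from_static("demo"),
            );
        })
        .send()
        .await?;

    println!("environment ARN: {:?}", output.arn());
    Ok(())
}
```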

pub fn name(self, input: impl Into<String>) -> Self

The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

pub fn set_name(self, input: Option<String>) -> Self

The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

pub fn get_name(&self) -> &Option<String>

The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

pub fn execution_role_arn(self, input: impl Into<String>) -> Self

The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.

pub fn set_execution_role_arn(self, input: Option<String>) -> Self

The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.

pub fn get_execution_role_arn(&self) -> &Option<String>

The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.

pub fn source_bucket_arn(self, input: impl Into<String>) -> Self

The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

pub fn set_source_bucket_arn(self, input: Option<String>) -> Self

The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

pub fn get_source_bucket_arn(&self) -> &Option<String>

The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

pub fn dag_s3_path(self, input: impl Into<String>) -> Self

The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

pub fn set_dag_s3_path(self, input: Option<String>) -> Self

The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

pub fn get_dag_s3_path(&self) -> &Option<String>

The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

pub fn network_configuration(self, input: NetworkConfiguration) -> Self

The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.
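
A sketch of constructing the value, assuming NetworkConfiguration lives in aws_sdk_mwaa::types and that its build() returns the structure directly in this SDK release. The subnet and security group IDs are placeholders.

```rust
use aws_sdk_mwaa::types::NetworkConfiguration;

fn example_network_configuration() -> NetworkConfiguration {
    // Each call to subnet_ids / security_group_ids appends one item to the
    // corresponding list. The IDs are placeholders; Amazon MWAA expects two
    // private subnets in different Availability Zones and at least one
    // security group.
    NetworkConfiguration::builder()
        .subnet_ids("subnet-0123456789abcdef0")
        .subnet_ids("subnet-0fedcba9876543210")
        .security_group_ids("sg-0123456789abcdef0")
        .build()
}
```

The resulting value is then passed to network_configuration() on the request builder, or wrapped in Some(...) for set_network_configuration().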

pub fn set_network_configuration(self, input: Option<NetworkConfiguration>) -> Self

The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.

pub fn get_network_configuration(&self) -> &Option<NetworkConfiguration>

The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.

pub fn plugins_s3_path(self, input: impl Into<String>) -> Self

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

pub fn set_plugins_s3_path(self, input: Option<String>) -> Self

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

pub fn get_plugins_s3_path(&self) -> &Option<String>

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

pub fn plugins_s3_object_version(self, input: impl Into<String>) -> Self

The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

pub fn set_plugins_s3_object_version(self, input: Option<String>) -> Self

The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

pub fn get_plugins_s3_object_version(&self) -> &Option<String>

The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

pub fn requirements_s3_path(self, input: impl Into<String>) -> Self

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.

pub fn set_requirements_s3_path(self, input: Option<String>) -> Self

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.

pub fn get_requirements_s3_path(&self) -> &Option<String>

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.

pub fn requirements_s3_object_version(self, input: impl Into<String>) -> Self

The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

pub fn set_requirements_s3_object_version(self, input: Option<String>) -> Self

The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

pub fn get_requirements_s3_object_version(&self) -> &Option<String>

The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

pub fn startup_script_s3_path(self, input: impl Into<String>) -> Self

The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.

pub fn set_startup_script_s3_path(self, input: Option<String>) -> Self

The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.

pub fn get_startup_script_s3_path(&self) -> &Option<String>

The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.

pub fn startup_script_s3_object_version(self, input: impl Into<String>) -> Self

The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:

3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo

For more information, see Using a startup script.

pub fn set_startup_script_s3_object_version(self, input: Option<String>) -> Self

The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:

3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo

For more information, see Using a startup script.

pub fn get_startup_script_s3_object_version(&self) -> &Option<String>

The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:

3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo

For more information, see Using a startup script.

pub fn airflow_configuration_options(self, k: impl Into<String>, v: impl Into<String>) -> Self

Adds a key-value pair to AirflowConfigurationOptions.

To override the contents of this collection use set_airflow_configuration_options.

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
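
For example, a sketch that attaches two standard Apache Airflow options; the builder module path follows this SDK release and the option values are illustrative.

```rust
use aws_sdk_mwaa::Client;
use aws_sdk_mwaa::operation::create_environment::builders::CreateEnvironmentFluentBuilder;

fn with_airflow_options(client: &Client) -> CreateEnvironmentFluentBuilder {
    // Each call adds one key-value pair; use set_airflow_configuration_options
    // to replace the entire map instead.
    client
        .create_environment()
        .airflow_configuration_options("core.default_task_retries", "3")
        .airflow_configuration_options("webserver.default_ui_timezone", "UTC")
}
```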

pub fn set_airflow_configuration_options(self, input: Option<HashMap<String, String>>) -> Self

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.

pub fn get_airflow_configuration_options(&self) -> &Option<HashMap<String, String>>

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.

pub fn environment_class(self, input: impl Into<String>) -> Self

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. For more information, see Amazon MWAA environment class.

pub fn set_environment_class(self, input: Option<String>) -> Self

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. For more information, see Amazon MWAA environment class.

pub fn get_environment_class(&self) -> &Option<String>

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. For more information, see Amazon MWAA environment class.

pub fn max_workers(self, input: i32) -> Self

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

pub fn set_max_workers(self, input: Option<i32>) -> Self

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

pub fn get_max_workers(&self) -> &Option<i32>

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

pub fn kms_key(self, input: impl Into<String>) -> Self

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.

pub fn set_kms_key(self, input: Option<String>) -> Self

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.

pub fn get_kms_key(&self) -> &Option<String>

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.

pub fn airflow_version(self, input: impl Into<String>) -> Self

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, and 2.5.1. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

pub fn set_airflow_version(self, input: Option<String>) -> Self

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, and 2.5.1. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

pub fn get_airflow_version(&self) -> &Option<String>

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, and 2.5.1. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

pub fn logging_configuration(self, input: LoggingConfigurationInput) -> Self

Defines the Apache Airflow logs to send to CloudWatch Logs.
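
A hedged sketch of enabling task and worker logging. It assumes the ModuleLoggingConfigurationInput and LoggingLevel types in aws_sdk_mwaa::types and that the nested builders' build() returns the value directly; both details can vary across SDK releases.

```rust
use aws_sdk_mwaa::types::{LoggingConfigurationInput, LoggingLevel, ModuleLoggingConfigurationInput};

fn example_logging() -> LoggingConfigurationInput {
    // Enable INFO-level task logs and WARNING-level worker logs in CloudWatch Logs.
    let task_logs = ModuleLoggingConfigurationInput::builder()
        .enabled(true)
        .log_level(LoggingLevel::Info)
        .build();
    let worker_logs = ModuleLoggingConfigurationInput::builder()
        .enabled(true)
        .log_level(LoggingLevel::Warning)
        .build();

    LoggingConfigurationInput::builder()
        .task_logs(task_logs)
        .worker_logs(worker_logs)
        .build()
}
```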

pub fn set_logging_configuration(self, input: Option<LoggingConfigurationInput>) -> Self

Defines the Apache Airflow logs to send to CloudWatch Logs.

pub fn get_logging_configuration(&self) -> &Option<LoggingConfigurationInput>

Defines the Apache Airflow logs to send to CloudWatch Logs.

pub fn weekly_maintenance_window_start(self, input: impl Into<String>) -> Self

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.

pub fn set_weekly_maintenance_window_start(self, input: Option<String>) -> Self

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.

pub fn get_weekly_maintenance_window_start(&self) -> &Option<String>

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.

pub fn tags(self, k: impl Into<String>, v: impl Into<String>) -> Self

Adds a key-value pair to Tags.

To override the contents of this collection use set_tags.

The key-value tag pairs you want to associate to your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.
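
For example, a sketch that sets the whole tag map at once through set_tags; the tag keys and values, and the helper function, are illustrative.

```rust
use std::collections::HashMap;

use aws_sdk_mwaa::Client;
use aws_sdk_mwaa::operation::create_environment::builders::CreateEnvironmentFluentBuilder;

fn with_tags(client: &Client) -> CreateEnvironmentFluentBuilder {
    // Build the complete tag map up front and attach it with set_tags, which
    // replaces anything previously added through tags(k, v).
    let mut tags = HashMap::new();
    tags.insert("Environment".to_string(), "Staging".to_string());
    tags.insert("Team".to_string(), "data-platform".to_string());

    client.create_environment().set_tags(Some(tags))
}
```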

pub fn set_tags(self, input: Option<HashMap<String, String>>) -> Self

The key-value tag pairs you want to associate to your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.

pub fn get_tags(&self) -> &Option<HashMap<String, String>>

The key-value tag pairs you want to associate to your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.

pub fn webserver_access_mode(self, input: WebserverAccessMode) -> Self

The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.

pub fn set_webserver_access_mode(self, input: Option<WebserverAccessMode>) -> Self

The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.

pub fn get_webserver_access_mode(&self) -> &Option<WebserverAccessMode>

The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.

pub fn min_workers(self, input: i32) -> Self

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.

pub fn set_min_workers(self, input: Option<i32>) -> Self

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.

pub fn get_min_workers(&self) -> &Option<i32>

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.

pub fn schedulers(self, input: i32) -> Self

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2 - Accepts values from 2 to 5. Defaults to 2.

  • v1 - Accepts 1.

pub fn set_schedulers(self, input: Option<i32>) -> Self

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2 - Accepts values from 2 to 5. Defaults to 2.

  • v1 - Accepts 1.

pub fn get_schedulers(&self) -> &Option<i32>

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2 - Accepts values from 2 to 5. Defaults to 2.

  • v1 - Accepts 1.

Trait Implementations

impl Clone for CreateEnvironmentFluentBuilder

fn clone(&self) -> CreateEnvironmentFluentBuilder

Returns a copy of the value.

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.

impl Debug for CreateEnvironmentFluentBuilder

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter.

Auto Trait Implementations

Blanket Implementations

impl<T> Any for T where T: 'static + ?Sized

fn type_id(&self) -> TypeId

Gets the TypeId of self.

impl<T> Borrow<T> for T where T: ?Sized

fn borrow(&self) -> &T

Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>

Instruments this type with the provided Span, returning an Instrumented wrapper.

fn in_current_span(self) -> Instrumented<Self>

Instruments this type with the current Span, returning an Instrumented wrapper.

impl<T, U> Into<U> for T where U: From<T>

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T> Same<T> for T

type Output = T

Should always be Self

impl<T> ToOwned for T where T: Clone

type Owned = T

The resulting type after obtaining ownership.

fn to_owned(&self) -> T

Creates owned data from borrowed data, usually by cloning.

fn clone_into(&self, target: &mut T)

Uses borrowed data to replace owned data, usually by cloning.

impl<T, U> TryFrom<U> for T where U: Into<T>

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.

impl<T, U> TryInto<U> for T where U: TryFrom<T>

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.

impl<T> WithSubscriber for T

fn with_subscriber<S>(self, subscriber: S) -> WithDispatch<Self> where S: Into<Dispatch>

Attaches the provided Subscriber to this type, returning a WithDispatch wrapper.

fn with_current_subscriber(self) -> WithDispatch<Self>

Attaches the current default Subscriber to this type, returning a WithDispatch wrapper.