#[non_exhaustive]
pub struct CreateEnvironmentInput {
    pub name: Option<String>,
    pub execution_role_arn: Option<String>,
    pub source_bucket_arn: Option<String>,
    pub dag_s3_path: Option<String>,
    pub network_configuration: Option<NetworkConfiguration>,
    pub plugins_s3_path: Option<String>,
    pub plugins_s3_object_version: Option<String>,
    pub requirements_s3_path: Option<String>,
    pub requirements_s3_object_version: Option<String>,
    pub startup_script_s3_path: Option<String>,
    pub startup_script_s3_object_version: Option<String>,
    pub airflow_configuration_options: Option<HashMap<String, String>>,
    pub environment_class: Option<String>,
    pub max_workers: Option<i32>,
    pub kms_key: Option<String>,
    pub airflow_version: Option<String>,
    pub logging_configuration: Option<LoggingConfigurationInput>,
    pub weekly_maintenance_window_start: Option<String>,
    pub tags: Option<HashMap<String, String>>,
    pub webserver_access_mode: Option<WebserverAccessMode>,
    pub min_workers: Option<i32>,
    pub schedulers: Option<i32>,
}

This section contains the Amazon Managed Workflows for Apache Airflow (MWAA) API reference documentation to create an environment. For more information, see Get started with Amazon Managed Workflows for Apache Airflow.

Fields (Non-exhaustive)

This struct is marked as non-exhaustive.
Non-exhaustive structs could have additional fields added in the future. Therefore, non-exhaustive structs cannot be constructed in external crates using the traditional Struct { .. } syntax; cannot be matched against without a wildcard ..; and struct update syntax will not work.
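For code outside this crate, this means values are read through the accessors or destructured with a trailing .. rest pattern, never built with a struct literal. A minimal sketch of the pattern (the dag_location helper below is hypothetical, not part of the crate):

// Assumes CreateEnvironmentInput is in scope from this crate.
// The `..` rest pattern is required: later releases may add fields to this
// non-exhaustive struct, and the match must keep compiling when they do.
fn dag_location(input: &CreateEnvironmentInput) -> Option<&str> {
    let CreateEnvironmentInput { dag_s3_path, .. } = input;
    dag_s3_path.as_deref()
}
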
name: Option<String>

The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

execution_role_arn: Option<String>

The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.

source_bucket_arn: Option<String>

The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

dag_s3_path: Option<String>

The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

network_configuration: Option<NetworkConfiguration>

The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.

plugins_s3_path: Option<String>

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

plugins_s3_object_version: Option<String>

The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

requirements_s3_path: Option<String>

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.

requirements_s3_object_version: Option<String>

The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

startup_script_s3_path: Option<String>

The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.

startup_script_s3_object_version: Option<String>

The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:

3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo

For more information, see Using a startup script.

airflow_configuration_options: Option<HashMap<String, String>>

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
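A short sketch of the map this field carries; the option keys follow Airflow's section.option naming, and the specific keys and values below are illustrative placeholders rather than defaults taken from this crate:

use std::collections::HashMap;

// Keys use the <section>.<option> form from the Airflow configuration file;
// these particular overrides are examples only.
let mut airflow_configuration_options: HashMap<String, String> = HashMap::new();
airflow_configuration_options.insert("core.default_task_retries".to_string(), "3".to_string());
airflow_configuration_options.insert("webserver.default_ui_timezone".to_string(), "UTC".to_string());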

environment_class: Option<String>

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. For more information, see Amazon MWAA environment class.

max_workers: Option<i32>

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

kms_key: Option<String>

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.

airflow_version: Option<String>

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, and 2.5.1. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

logging_configuration: Option<LoggingConfigurationInput>

Defines the Apache Airflow logs to send to CloudWatch Logs.

weekly_maintenance_window_start: Option<String>

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.

tags: Option<HashMap<String, String>>

The key-value tag pairs you want to associate with your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.

webserver_access_mode: Option<WebserverAccessMode>

The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.

min_workers: Option<i32>

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.

schedulers: Option<i32>

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2 - Accepts between 2 and 5. Defaults to 2.

  • v1 - Accepts 1.

Implementations

impl CreateEnvironmentInput

pub fn name(&self) -> Option<&str>

The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

pub fn execution_role_arn(&self) -> Option<&str>

The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.

pub fn source_bucket_arn(&self) -> Option<&str>

The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

pub fn dag_s3_path(&self) -> Option<&str>

The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

pub fn network_configuration(&self) -> Option<&NetworkConfiguration>

The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.

pub fn plugins_s3_path(&self) -> Option<&str>

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

pub fn plugins_s3_object_version(&self) -> Option<&str>

The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

pub fn requirements_s3_path(&self) -> Option<&str>

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.

pub fn requirements_s3_object_version(&self) -> Option<&str>

The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

pub fn startup_script_s3_path(&self) -> Option<&str>

The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.

pub fn startup_script_s3_object_version(&self) -> Option<&str>

The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:

3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo

For more information, see Using a startup script.

pub fn airflow_configuration_options(&self) -> Option<&HashMap<String, String>>

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.

pub fn environment_class(&self) -> Option<&str>

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. For more information, see Amazon MWAA environment class.

pub fn max_workers(&self) -> Option<i32>

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

pub fn kms_key(&self) -> Option<&str>

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.

pub fn airflow_version(&self) -> Option<&str>

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, and 2.5.1. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

pub fn logging_configuration(&self) -> Option<&LoggingConfigurationInput>

Defines the Apache Airflow logs to send to CloudWatch Logs.

pub fn weekly_maintenance_window_start(&self) -> Option<&str>

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.

pub fn tags(&self) -> Option<&HashMap<String, String>>

The key-value tag pairs you want to associate with your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.

pub fn webserver_access_mode(&self) -> Option<&WebserverAccessMode>

The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.

pub fn min_workers(&self) -> Option<i32>

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.

pub fn schedulers(&self) -> Option<i32>

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2 - Accepts between 2 and 5. Defaults to 2.

  • v1 - Accepts 1.

impl CreateEnvironmentInput

pub fn builder() -> CreateEnvironmentInputBuilder

Creates a new builder-style object to manufacture CreateEnvironmentInput.
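A rough sketch of building an input with this builder, assuming the generated setters mirror the field names above and that build() finalizes the input (it may return a Result in this SDK); all values are placeholders taken from the field examples:

// Placeholder values; substitute your own account, role, and bucket details.
let input = CreateEnvironmentInput::builder()
    .name("MyMWAAEnvironment")
    .airflow_version("2.5.1")
    .environment_class("mw1.small")
    .execution_role_arn("arn:aws:iam::123456789:role/my-execution-role")
    .source_bucket_arn("arn:aws:s3:::my-airflow-bucket-unique-name")
    .dag_s3_path("dags")
    .min_workers(1)
    .max_workers(10)
    .schedulers(2)
    .build(); // build() may return a Result here; handle the error as needed.

In application code these setters are usually reached through the service client's fluent create_environment() call rather than by constructing this input struct by hand.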

Trait Implementations

impl Clone for CreateEnvironmentInput

fn clone(&self) -> CreateEnvironmentInput

Returns a copy of the value.

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.

impl Debug for CreateEnvironmentInput

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter.

impl PartialEq<CreateEnvironmentInput> for CreateEnvironmentInput

fn eq(&self, other: &CreateEnvironmentInput) -> bool

This method tests for self and other values to be equal, and is used by ==.

fn ne(&self, other: &Rhs) -> bool

This method tests for !=. The default implementation is almost always sufficient, and should not be overridden without very good reason.

impl StructuralPartialEq for CreateEnvironmentInput

Auto Trait Implementations

Blanket Implementations

impl<T> Any for T where T: 'static + ?Sized

fn type_id(&self) -> TypeId

Gets the TypeId of self.

impl<T> Borrow<T> for T where T: ?Sized

fn borrow(&self) -> &T

Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>

Instruments this type with the provided Span, returning an Instrumented wrapper.

fn in_current_span(self) -> Instrumented<Self>

Instruments this type with the current Span, returning an Instrumented wrapper.

impl<T, U> Into<U> for T where U: From<T>

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T> Same<T> for T

type Output = T

Should always be Self.

impl<T> ToOwned for T where T: Clone

type Owned = T

The resulting type after obtaining ownership.

fn to_owned(&self) -> T

Creates owned data from borrowed data, usually by cloning.

fn clone_into(&self, target: &mut T)

Uses borrowed data to replace owned data, usually by cloning.

impl<T, U> TryFrom<U> for T where U: Into<T>

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.

impl<T, U> TryInto<U> for T where U: TryFrom<T>

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.

impl<T> WithSubscriber for T

fn with_subscriber<S>(self, subscriber: S) -> WithDispatch<Self> where S: Into<Dispatch>

Attaches the provided Subscriber to this type, returning a WithDispatch wrapper.

fn with_current_subscriber(self) -> WithDispatch<Self>

Attaches the current default Subscriber to this type, returning a WithDispatch wrapper.