#[non_exhaustive]
pub struct CreateEnvironmentInput {
    pub name: Option<String>,
    pub execution_role_arn: Option<String>,
    pub source_bucket_arn: Option<String>,
    pub dag_s3_path: Option<String>,
    pub network_configuration: Option<NetworkConfiguration>,
    pub plugins_s3_path: Option<String>,
    pub plugins_s3_object_version: Option<String>,
    pub requirements_s3_path: Option<String>,
    pub requirements_s3_object_version: Option<String>,
    pub startup_script_s3_path: Option<String>,
    pub startup_script_s3_object_version: Option<String>,
    pub airflow_configuration_options: Option<HashMap<String, String>>,
    pub environment_class: Option<String>,
    pub max_workers: Option<i32>,
    pub kms_key: Option<String>,
    pub airflow_version: Option<String>,
    pub logging_configuration: Option<LoggingConfigurationInput>,
    pub weekly_maintenance_window_start: Option<String>,
    pub tags: Option<HashMap<String, String>>,
    pub webserver_access_mode: Option<WebserverAccessMode>,
    pub min_workers: Option<i32>,
    pub schedulers: Option<i32>,
    pub endpoint_management: Option<EndpointManagement>,
    pub min_webservers: Option<i32>,
    pub max_webservers: Option<i32>,
}
This section contains the Amazon Managed Workflows for Apache Airflow (Amazon MWAA) API reference documentation to create an environment. For more information, see Get started with Amazon Managed Workflows for Apache Airflow.
Fields (Non-exhaustive)
This struct is marked as non-exhaustive. Non-exhaustive structs cannot be constructed with the Struct { .. } syntax, cannot be matched against without a wildcard .., and struct update syntax will not work.
name: Option<String>
The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.
execution_role_arn: Option<String>
The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.
source_bucket_arn: Option<String>
The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.
dag_s3_path: Option<String>
The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.
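The four fields above form the core of a request. Below is a minimal sketch, assuming the builder setters documented on this page and placeholder values throughout; in older SDK releases the input type lives under aws_sdk_mwaa::input rather than aws_sdk_mwaa::operation::create_environment.

```rust
// Placeholder values only; adjust the import path to match your SDK version.
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn core_input() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    let input = CreateEnvironmentInput::builder()
        .name("MyMWAAEnvironment")
        .execution_role_arn("arn:aws:iam::123456789:role/my-execution-role")
        .source_bucket_arn("arn:aws:s3:::my-airflow-bucket-unique-name")
        .dag_s3_path("dags")
        // In current SDK versions build() returns a Result, hence the `?`.
        .build()?;
    Ok(input)
}
```

A complete request also needs a NetworkConfiguration, covered next.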
network_configuration: Option<NetworkConfiguration>
The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.
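A sketch of building the networking piece. It assumes NetworkConfiguration exposes the usual generated builder whose list setters append one value per call; the subnet and security-group IDs are placeholders.

```rust
// Hypothetical IDs; the types module was named aws_sdk_mwaa::model in older releases.
use aws_sdk_mwaa::types::NetworkConfiguration;

fn network() -> NetworkConfiguration {
    NetworkConfiguration::builder()
        .subnet_ids("subnet-0123456789abcdef0")    // each call appends one subnet ID
        .subnet_ids("subnet-0fedcba9876543210")
        .security_group_ids("sg-0123456789abcdef0")
        // build() is assumed to return the struct directly here, since the
        // members of this shape are not modeled as required.
        .build()
}
```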
plugins_s3_path: Option<String>
The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.
plugins_s3_object_version: Option<String>
The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.
requirements_s3_path: Option<String>
The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.
requirements_s3_object_version: Option<String>
The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.
startup_script_s3_path: Option<String>
The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.
Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
startup_script_s3_object_version: Option<String>
The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.
Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:
3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo
For more information, see Using a startup script.
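The plugins, requirements, and startup-script fields above follow the same path-plus-object-version pattern. A sketch with placeholder version IDs; a real request would also include the core fields shown earlier.

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn versioned_artifacts() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        .plugins_s3_path("plugins.zip")
        .plugins_s3_object_version("EXAMPLE-PLUGINS-VERSION-ID")            // placeholder
        .requirements_s3_path("requirements.txt")
        .requirements_s3_object_version("EXAMPLE-REQUIREMENTS-VERSION-ID")  // placeholder
        .startup_script_s3_path("startup.sh")
        .startup_script_s3_object_version("EXAMPLE-STARTUP-VERSION-ID")     // placeholder
        .build()?)
}
```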
airflow_configuration_options: Option<HashMap<String, String>>
A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
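For map-valued fields the generated builder typically adds one key-value pair per call (with a set_airflow_configuration_options variant that replaces the whole map). A sketch using two standard Apache Airflow option keys:

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn with_airflow_options() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        // Each call inserts one key-value pair into the map.
        .airflow_configuration_options("core.default_task_retries", "3")
        .airflow_configuration_options("webserver.default_ui_timezone", "UTC")
        .build()?)
}
```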
environment_class: Option<String>
The environment class type. Valid values: mw1.micro, mw1.small, mw1.medium, mw1.large, mw1.xlarge, and mw1.2xlarge. For more information, see Amazon MWAA environment class.
max_workers: Option<i32>
The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
kms_key: Option<String>
The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.
airflow_version: Option<String>
The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (Amazon MWAA).
Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, 2.5.1, 2.6.3, 2.7.2, 2.8.1, 2.9.2, 2.10.1, and 2.10.3.
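Both the environment class and the Airflow version are plain strings taken from the lists above; a brief sketch:

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn class_and_version() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        .environment_class("mw1.small")
        .airflow_version("2.10.3")
        .build()?)
}
```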
logging_configuration: Option<LoggingConfigurationInput>
Defines the Apache Airflow logs to send to CloudWatch Logs.
weekly_maintenance_window_start: Option<String>
The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.
tags: Option<HashMap<String, String>>
The key-value tag pairs you want to associate to your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.
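Tags are another map-valued field, so the builder again takes one key-value pair per call; a minimal sketch:

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn with_tags() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        .tags("Environment", "Staging") // one key-value pair per call
        .build()?)
}
```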
webserver_access_mode: Option<WebserverAccessMode>
Defines the access mode for the Apache Airflow web server. For more information, see Apache Airflow access modes.
min_workers: Option<i32>
The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.
schedulers: Option<i32>
The number of Apache Airflow schedulers to run in your environment. Valid values:
- v2 - For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- v1 - Accepts 1.
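A combined sizing sketch covering MinWorkers, MaxWorkers, and Schedulers for a v2 environment larger than mw1.micro; the numbers are illustrative only:

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn worker_sizing() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        .min_workers(2)   // workers kept when the task queue is empty
        .max_workers(20)  // ceiling for scale-out under load
        .schedulers(2)    // v2 accepts 2-5 here; mw1.micro accepts 1
        .build()?)
}
```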
endpoint_management: Option<EndpointManagement>
Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints for your VPC. If you choose to create an environment in a shared VPC, you must set this value to CUSTOMER. In a shared VPC deployment, the environment will remain in PENDING status until you create the VPC endpoints. If you do not take action to create the endpoints within 72 hours, the status will change to CREATE_FAILED. You can delete the failed environment and create a new one.
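A sketch combining the web server access mode with endpoint management for a shared-VPC deployment. The Rust enum variant names shown (PrivateOnly, Customer) are the usual code-generated forms of the PRIVATE_ONLY and CUSTOMER API values; treat them as assumptions to verify against your SDK version.

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;
// Variant names assumed from the PRIVATE_ONLY / CUSTOMER API values.
use aws_sdk_mwaa::types::{EndpointManagement, WebserverAccessMode};

fn shared_vpc_input() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        .webserver_access_mode(WebserverAccessMode::PrivateOnly)
        .endpoint_management(EndpointManagement::Customer) // required for shared VPCs
        .build()?)
}
```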
min_webservers: Option<i32>
The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.
Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
max_webservers: Option<i32>
The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.
Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
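Web server scaling mirrors worker scaling; a brief sketch for an environment larger than mw1.micro:

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn webserver_sizing() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    Ok(CreateEnvironmentInput::builder()
        .min_webservers(2) // floor during quiet periods
        .max_webservers(5) // ceiling under heavy REST API or CLI traffic
        .build()?)
}
```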
Implementations
impl CreateEnvironmentInput
pub fn name(&self) -> Option<&str>
The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.
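The accessors return borrowed views of the optional fields. A short sketch of reading a value back from an input built with the earlier builder sketches (the input parameter here is hypothetical):

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;

fn print_name(input: &CreateEnvironmentInput) {
    // name() yields Option<&str>, so a missing field is handled explicitly.
    if let Some(name) = input.name() {
        println!("environment name: {name}");
    }
}
```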
pub fn execution_role_arn(&self) -> Option<&str>
The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.
pub fn source_bucket_arn(&self) -> Option<&str>
The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.
pub fn dag_s3_path(&self) -> Option<&str>
The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.
pub fn network_configuration(&self) -> Option<&NetworkConfiguration>
The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.
pub fn plugins_s3_path(&self) -> Option<&str>
The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.
pub fn plugins_s3_object_version(&self) -> Option<&str>
The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.
pub fn requirements_s3_path(&self) -> Option<&str>
The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.
pub fn requirements_s3_object_version(&self) -> Option<&str>
The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.
pub fn startup_script_s3_path(&self) -> Option<&str>
The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.
Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
pub fn startup_script_s3_object_version(&self) -> Option<&str>
The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.
Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:
3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo
For more information, see Using a startup script.
pub fn airflow_configuration_options(&self) -> Option<&HashMap<String, String>>
A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
pub fn environment_class(&self) -> Option<&str>
The environment class type. Valid values: mw1.micro, mw1.small, mw1.medium, mw1.large, mw1.xlarge, and mw1.2xlarge. For more information, see Amazon MWAA environment class.
pub fn max_workers(&self) -> Option<i32>
The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
pub fn kms_key(&self) -> Option<&str>
The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.
pub fn airflow_version(&self) -> Option<&str>
The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (Amazon MWAA).
Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, 2.5.1, 2.6.3, 2.7.2, 2.8.1, 2.9.2, 2.10.1, and 2.10.3.
pub fn logging_configuration(&self) -> Option<&LoggingConfigurationInput>
Defines the Apache Airflow logs to send to CloudWatch Logs.
pub fn weekly_maintenance_window_start(&self) -> Option<&str>
The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only.
pub fn tags(&self) -> Option<&HashMap<String, String>>
The key-value tag pairs you want to associate to your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.
pub fn webserver_access_mode(&self) -> Option<&WebserverAccessMode>
Defines the access mode for the Apache Airflow web server. For more information, see Apache Airflow access modes.
pub fn min_workers(&self) -> Option<i32>
The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers leaving the worker count you specify in the MinWorkers field. For example, 2.
pub fn schedulers(&self) -> Option<i32>
The number of Apache Airflow schedulers to run in your environment. Valid values:
- v2 - For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- v1 - Accepts 1.
pub fn endpoint_management(&self) -> Option<&EndpointManagement>
Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints for your VPC. If you choose to create an environment in a shared VPC, you must set this value to CUSTOMER. In a shared VPC deployment, the environment will remain in PENDING status until you create the VPC endpoints. If you do not take action to create the endpoints within 72 hours, the status will change to CREATE_FAILED. You can delete the failed environment and create a new one.
pub fn min_webservers(&self) -> Option<i32>
The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.
Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
pub fn max_webservers(&self) -> Option<i32>
The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API, or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.
Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
impl CreateEnvironmentInput
pub fn builder() -> CreateEnvironmentInputBuilder
Creates a new builder-style object to manufacture CreateEnvironmentInput.
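Putting the pieces together, a hedged end-to-end sketch that builds a complete input from the fields documented above. Every literal is a placeholder, and module paths may differ between SDK versions; the same setter names are also available on the fluent client.create_environment() request builder, which is usually the more convenient entry point.

```rust
use aws_sdk_mwaa::operation::create_environment::CreateEnvironmentInput;
use aws_sdk_mwaa::types::NetworkConfiguration;

fn example_input() -> Result<CreateEnvironmentInput, Box<dyn std::error::Error>> {
    let network = NetworkConfiguration::builder()
        .subnet_ids("subnet-0123456789abcdef0")
        .subnet_ids("subnet-0fedcba9876543210")
        .security_group_ids("sg-0123456789abcdef0")
        .build();

    let input = CreateEnvironmentInput::builder()
        .name("MyMWAAEnvironment")
        .execution_role_arn("arn:aws:iam::123456789:role/my-execution-role")
        .source_bucket_arn("arn:aws:s3:::my-airflow-bucket-unique-name")
        .dag_s3_path("dags")
        .network_configuration(network)
        .environment_class("mw1.small")
        .airflow_version("2.10.3")
        .min_workers(2)
        .max_workers(10)
        .tags("Environment", "Staging")
        .build()?; // current SDKs return Result<_, BuildError> from build()
    Ok(input)
}
```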
Trait Implementations
impl Clone for CreateEnvironmentInput
fn clone(&self) -> CreateEnvironmentInput
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
impl Debug for CreateEnvironmentInput
impl PartialEq for CreateEnvironmentInput
impl StructuralPartialEq for CreateEnvironmentInput
Auto Trait Implementations
impl Freeze for CreateEnvironmentInput
impl RefUnwindSafe for CreateEnvironmentInput
impl Send for CreateEnvironmentInput
impl Sync for CreateEnvironmentInput
impl Unpin for CreateEnvironmentInput
impl UnwindSafe for CreateEnvironmentInput
Blanket Implementations
impl<T> BorrowMut<T> for T where T: ?Sized
impl<T> CloneToUninit for T where T: Clone
impl<T> Instrument for T
impl<T> IntoEither for T
impl<T> Paint for T where T: ?Sized