Data structures used by operation inputs/outputs.
Structs§
- AwsVpcConfiguration: This structure specifies the VPC subnets and security groups for the task, and whether a public IP address is to be used. This structure is relevant only for ECS tasks that use the `awsvpc` network mode.
- BatchArrayProperties: The array properties for the submitted job, such as the size of the array. The array size can be between 2 and 10,000. If you specify array properties for a job, it becomes an array job. This parameter is used only if the target is a Batch job.
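The 2-to-10,000 bound on the array size is enforced by the service; a hypothetical client-side check (not part of the SDK) mirroring that documented constraint might look like:

```rust
// Hypothetical helper mirroring the documented BatchArrayProperties
// constraint: an array job's size must be between 2 and 10,000.
fn validate_array_size(size: i32) -> Result<i32, String> {
    if (2..=10_000).contains(&size) {
        Ok(size)
    } else {
        Err(format!(
            "array size {size} is outside the allowed range 2..=10,000"
        ))
    }
}
```

Validating before the call lets you fail fast instead of waiting for a service-side validation error.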
- BatchContainerOverrides: The overrides that are sent to a container.
- BatchEnvironmentVariable: The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition. Environment variables cannot start with "Batch"; this naming convention is reserved for variables that Batch sets.
- BatchJobDependency: An object that represents a Batch job dependency.
- BatchResourceRequirement: The type and amount of a resource to assign to a container. The supported resources include `GPU`, `MEMORY`, and `VCPU`.
- BatchRetryStrategy: The retry strategy that's associated with a job. For more information, see Automated job retries in the Batch User Guide.
- CapacityProviderStrategyItem: The details of a capacity provider strategy. To learn more, see CapacityProviderStrategyItem in the Amazon ECS API Reference.
- CloudwatchLogsLogDestination: The Amazon CloudWatch Logs logging configuration settings for the pipe.
- CloudwatchLogsLogDestinationParameters: The Amazon CloudWatch Logs logging configuration settings for the pipe.
- DeadLetterConfig: A `DeadLetterConfig` object that contains information about a dead-letter queue configuration.
- DimensionMapping: Maps source data to a dimension in the target Timestream for LiveAnalytics table. For more information, see Amazon Timestream for LiveAnalytics concepts.
- EcsContainerOverride: The overrides that are sent to a container. An empty container override can be passed in; an example of an empty container override is `{"containerOverrides": [ ]}`. If a non-empty container override is specified, the `name` parameter must be included.
- EcsEnvironmentFile: A list of files containing the environment variables to pass to a container. You can specify up to ten environment files. The file must have a `.env` file extension. Each line in an environment file should contain an environment variable in `VARIABLE=VALUE` format. Lines beginning with `#` are treated as comments and are ignored. For more information about the environment variable file syntax, see Declare default environment variables in file. If there are environment variables specified using the `environment` parameter in a container definition, they take precedence over the variables contained within an environment file. If multiple environment files are specified that contain the same variable, they're processed from the top down. We recommend that you use unique variable names. For more information, see Specifying environment variables in the Amazon Elastic Container Service Developer Guide. This parameter is only supported for tasks hosted on Fargate using Linux platform version `1.4.0` or later, or Windows platform version `1.0.0` or later.
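The documented `.env` syntax (one `VARIABLE=VALUE` per line, `#` lines ignored) can be sketched as a small parser; this is an illustration of the file format, not SDK code:

```rust
// Minimal sketch of the documented environment-file syntax:
// one VARIABLE=VALUE per line, lines starting with '#' are comments.
fn parse_env_line(line: &str) -> Option<(&str, &str)> {
    let line = line.trim();
    if line.is_empty() || line.starts_with('#') {
        return None; // comments and blank lines are ignored
    }
    // Split on the first '=' only; the value may itself contain '='.
    line.split_once('=')
}
```

Note that splitting on the first `=` only is what lets values such as base64 strings (which often end in `=`) survive intact.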
- EcsEnvironmentVariable: The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition. You must also specify a container name.
- EcsEphemeralStorage: The amount of ephemeral storage to allocate for the task. This parameter is used to expand the total amount of ephemeral storage available, beyond the default amount, for tasks hosted on Fargate. For more information, see Fargate task storage in the Amazon ECS User Guide for Fargate. This parameter is only supported for tasks hosted on Fargate using Linux platform version `1.4.0` or later; it is not supported for Windows containers on Fargate.
- EcsInferenceAcceleratorOverride: Details on an Elastic Inference accelerator task override. This parameter is used to override the Elastic Inference accelerator specified in the task definition. For more information, see Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.
- EcsResourceRequirement: The type and amount of a resource to assign to a container. The supported resource types are GPUs and Elastic Inference accelerators. For more information, see Working with GPUs on Amazon ECS or Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.
- EcsTaskOverride: The overrides that are associated with a task.
- Filter: Filter events using an event pattern. For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
- FilterCriteria: The collection of event patterns used to filter events. To remove a filter, specify a `FilterCriteria` object with an empty array of `Filter` objects. For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
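As an illustrative sketch of the shapes involved (the pattern content here is hypothetical), a `FilterCriteria` carrying one event pattern looks like:

```json
{
  "Filters": [
    { "Pattern": "{ \"source\": [\"aws.s3\"] }" }
  ]
}
```

and, per the note above, the form that removes filtering is the same object with an empty array:

```json
{ "Filters": [] }
```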
- FirehoseLogDestination: The Amazon Data Firehose logging configuration settings for the pipe.
- FirehoseLogDestinationParameters: The Amazon Data Firehose logging configuration settings for the pipe.
- MultiMeasureAttributeMapping: A mapping of a source event data field to a measure in a Timestream for LiveAnalytics record.
- MultiMeasureMapping: Maps multiple measures from the source event to the same Timestream for LiveAnalytics record. For more information, see Amazon Timestream for LiveAnalytics concepts.
- NetworkConfiguration: This structure specifies the network configuration for an Amazon ECS task.
- Pipe: An object that represents a pipe. Amazon EventBridge Pipes connect event sources to targets and reduce the need for specialized knowledge and integration code.
- PipeEnrichmentHttpParameters: These are custom parameters to be used when the target is an API Gateway REST API or an EventBridge ApiDestination. In the latter case, these are merged with any InvocationParameters specified on the Connection, with any values from the Connection taking precedence.
- PipeEnrichmentParameters: The parameters required to set up enrichment on your pipe.
- PipeLogConfiguration: The logging configuration settings for the pipe.
- PipeLogConfigurationParameters: Specifies the logging configuration settings for the pipe. When you call `UpdatePipe`, EventBridge updates the fields in the `PipeLogConfigurationParameters` object atomically, as one, and overrides existing values. This is by design. If you don't specify an optional field in any of the Amazon Web Services service parameters objects (`CloudwatchLogsLogDestinationParameters`, `FirehoseLogDestinationParameters`, or `S3LogDestinationParameters`), EventBridge sets that field to its system-default value during the update. For example, suppose when you created the pipe you specified a Firehose stream log destination, and you then update the pipe to add an Amazon S3 log destination. In addition to specifying the `S3LogDestinationParameters` for the new log destination, you must also specify the fields in the `FirehoseLogDestinationParameters` object in order to retain the Firehose stream log destination. For more information on generating pipe log records, see Log EventBridge Pipes in the Amazon EventBridge User Guide.
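The replace-as-a-whole semantics of `UpdatePipe` are easy to trip over. A self-contained sketch (these are stand-in types, not the SDK's) of why an omitted destination is dropped rather than preserved:

```rust
// Stand-in for the pipe's log configuration; the real SDK types differ.
#[derive(Debug, Clone, PartialEq, Default)]
struct LogConfig {
    firehose_stream: Option<String>,
    s3_bucket: Option<String>,
}

// Models how EventBridge applies an UpdatePipe log-configuration change:
// the new object wins wholesale; there is no field-by-field merge, so any
// field left unspecified falls back to its default.
fn apply_update(_current: LogConfig, update: LogConfig) -> LogConfig {
    update
}
```

An update that names only the new S3 destination silently drops the Firehose destination; to keep it, the update must restate the Firehose fields alongside the S3 ones.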
- PipeSourceActiveMqBrokerParameters: The parameters for using an Active MQ broker as a source.
- PipeSourceDynamoDbStreamParameters: The parameters for using a DynamoDB stream as a source.
- PipeSourceKinesisStreamParameters: The parameters for using a Kinesis stream as a source.
- PipeSourceManagedStreamingKafkaParameters: The parameters for using an MSK stream as a source.
- PipeSourceParameters: The parameters required to set up a source for your pipe.
- PipeSourceRabbitMqBrokerParameters: The parameters for using a Rabbit MQ broker as a source.
- PipeSourceSelfManagedKafkaParameters: The parameters for using a self-managed Apache Kafka stream as a source. A self-managed cluster is any Apache Kafka cluster not hosted by Amazon Web Services: this includes both clusters you manage yourself and those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide.
- PipeSourceSqsQueueParameters: The parameters for using an Amazon SQS queue as a source.
- PipeTargetBatchJobParameters: The parameters for using a Batch job as a target.
- PipeTargetCloudWatchLogsParameters: The parameters for using a CloudWatch Logs log stream as a target.
- PipeTargetEcsTaskParameters: The parameters for using an Amazon ECS task as a target.
- PipeTargetEventBridgeEventBusParameters: The parameters for using an EventBridge event bus as a target.
- PipeTargetHttpParameters: These are custom parameters to be used when the target is an API Gateway REST API or an EventBridge ApiDestination.
- PipeTargetKinesisStreamParameters: The parameters for using a Kinesis stream as a target.
- PipeTargetLambdaFunctionParameters: The parameters for using a Lambda function as a target.
- PipeTargetParameters: The parameters required to set up a target for your pipe. For more information about pipe target parameters, including how to use dynamic path parameters, see Target parameters in the Amazon EventBridge User Guide.
- PipeTargetRedshiftDataParameters: These are custom parameters to be used when the target is an Amazon Redshift cluster, to invoke the Amazon Redshift Data API `BatchExecuteStatement` operation.
- PipeTargetSageMakerPipelineParameters: The parameters for using a SageMaker pipeline as a target.
- PipeTargetSqsQueueParameters: The parameters for using an Amazon SQS queue as a target.
- PipeTargetStateMachineParameters: The parameters for using a Step Functions state machine as a target.
- PipeTargetTimestreamParameters: The parameters for using a Timestream for LiveAnalytics table as a target.
- PlacementConstraint: An object representing a constraint on task placement. To learn more, see Task Placement Constraints in the Amazon Elastic Container Service Developer Guide.
- PlacementStrategy: The task placement strategy for a task or service. To learn more, see Task Placement Strategies in the Amazon Elastic Container Service Developer Guide.
- S3LogDestination: The Amazon S3 logging configuration settings for the pipe.
- S3LogDestinationParameters: The Amazon S3 logging configuration settings for the pipe.
- SageMakerPipelineParameter: A name/value pair of a parameter to start execution of a SageMaker Model Building Pipeline.
- SelfManagedKafkaAccessConfigurationVpc: This structure specifies the VPC subnets and security groups for the stream, and whether a public IP address is to be used.
- SingleMeasureMapping: Maps a single source data field to a single record in the specified Timestream for LiveAnalytics table. For more information, see Amazon Timestream for LiveAnalytics concepts.
- Tag: A key-value pair associated with an Amazon Web Services resource. In EventBridge, rules and event buses support tagging.
- UpdatePipeSourceActiveMqBrokerParameters: The parameters for using an Active MQ broker as a source.
- UpdatePipeSourceDynamoDbStreamParameters: The parameters for using a DynamoDB stream as a source.
- UpdatePipeSourceKinesisStreamParameters: The parameters for using a Kinesis stream as a source.
- UpdatePipeSourceManagedStreamingKafkaParameters: The parameters for using an MSK stream as a source.
- UpdatePipeSourceParameters: The parameters required to set up a source for your pipe.
- UpdatePipeSourceRabbitMqBrokerParameters: The parameters for using a Rabbit MQ broker as a source.
- UpdatePipeSourceSelfManagedKafkaParameters: The parameters for using a self-managed Apache Kafka stream as a source. A self-managed cluster is any Apache Kafka cluster not hosted by Amazon Web Services: this includes both clusters you manage yourself and those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide.
- UpdatePipeSourceSqsQueueParameters: The parameters for using an Amazon SQS queue as a source.
- ValidationExceptionField: Indicates that an error has occurred while performing a validate operation.
Enums§

Except for the credentials union types noted below, each of these enums carries the same guidance in the generated documentation: when writing a match expression against it, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in the current version of the SDK, your code should continue to work when you upgrade to a future SDK version in which the enum does include a variant for that feature.

- AssignPublicIp
- BatchJobDependencyType
- BatchResourceRequirementType
- DimensionValueType
- DynamoDbStreamStartPosition
- EcsEnvironmentFileType
- EcsResourceRequirementType
- EpochTimeUnit
- IncludeExecutionDataOption
- KinesisStreamStartPosition
- LaunchType
- LogLevel
- MeasureValueType
- MqBrokerAccessCredentials: The Secrets Manager secret that stores your broker credentials.
- MskAccessCredentials: The Secrets Manager secret that stores your stream credentials.
- MskStartPosition
- OnPartialBatchItemFailureStreams
- PipeState
- PipeTargetInvocationType
- PlacementConstraintType
- PlacementStrategyType
- PropagateTags
- RequestedPipeState
- RequestedPipeStateDescribeResponse
- S3OutputFormat
- SelfManagedKafkaAccessConfigurationCredentials: The Secrets Manager secret that stores your stream credentials.
- SelfManagedKafkaStartPosition
- TimeFieldType
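In practice, forward compatibility means always including a wildcard arm. The sketch below uses a self-contained stand-in enum that mirrors the shape of the SDK's non-exhaustive enums (the real types live in the SDK's `types` module and carry their own unknown-variant mechanism):

```rust
// Stand-in mirroring the shape of the SDK's non-exhaustive enums; the real
// AssignPublicIp also models values the service adds after your SDK version.
#[derive(Debug, PartialEq)]
enum AssignPublicIp {
    Enabled,
    Disabled,
    Unknown(String), // stands in for the SDK's unknown-variant case
}

// Forward-compatible match: the wildcard arm absorbs values the service
// introduces before your SDK version grows a named variant for them.
fn needs_public_ip(v: &AssignPublicIp) -> bool {
    match v {
        AssignPublicIp::Enabled => true,
        AssignPublicIp::Disabled => false,
        _ => false, // pick a safe default for unrecognized values
    }
}
```

Without the wildcard arm, upgrading the SDK to a version that adds a variant would turn a previously exhaustive match into a compile error (or, for values arriving at runtime, an unhandled case).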