Module types

Data structures used by operation inputs/outputs.

Modules§

builders
Builders
error
Error types that Amazon EventBridge Pipes can respond with.

Structs§

AwsVpcConfiguration

This structure specifies the VPC subnets and security groups for the task, and whether a public IP address is to be used. This structure is relevant only for ECS tasks that use the awsvpc network mode.

BatchArrayProperties

The array properties for the submitted job, such as the size of the array. The array size can be between 2 and 10,000. If you specify array properties for a job, it becomes an array job. This parameter is used only if the target is a Batch job.

BatchContainerOverrides

The overrides that are sent to a container.

BatchEnvironmentVariable

The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition.

Environment variables cannot start with "Batch". This naming convention is reserved for variables that Batch sets.

BatchJobDependency

An object that represents a Batch job dependency.

BatchResourceRequirement

The type and amount of a resource to assign to a container. The supported resources include GPU, MEMORY, and VCPU.

BatchRetryStrategy

The retry strategy that's associated with a job. For more information, see Automated job retries in the Batch User Guide.

CapacityProviderStrategyItem

The details of a capacity provider strategy. To learn more, see CapacityProviderStrategyItem in the Amazon ECS API Reference.

CloudwatchLogsLogDestination

The Amazon CloudWatch Logs logging configuration settings for the pipe.

CloudwatchLogsLogDestinationParameters

The Amazon CloudWatch Logs logging configuration settings for the pipe.

DeadLetterConfig

A DeadLetterConfig object that contains information about a dead-letter queue configuration.

DimensionMapping

Maps source data to a dimension in the target Timestream for LiveAnalytics table.

For more information, see Amazon Timestream for LiveAnalytics concepts.

EcsContainerOverride

The overrides that are sent to a container. An empty container override can be passed in. An example of an empty container override is {"containerOverrides": [ ] }. If a non-empty container override is specified, the name parameter must be included.

EcsEnvironmentFile

A list of files containing the environment variables to pass to a container. You can specify up to ten environment files. The file must have a .env file extension. Each line in an environment file should contain an environment variable in VARIABLE=VALUE format. Lines beginning with # are treated as comments and are ignored. For more information about the environment variable file syntax, see Declare default environment variables in file.

If there are environment variables specified using the environment parameter in a container definition, they take precedence over the variables contained within an environment file. If multiple environment files are specified that contain the same variable, they're processed from the top down. We recommend that you use unique variable names. For more information, see Specifying environment variables in the Amazon Elastic Container Service Developer Guide.

This parameter is only supported for tasks hosted on Fargate using the following platform versions:

  • Linux platform version 1.4.0 or later.

  • Windows platform version 1.0.0 or later.
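As described above, an environment file is plain text with one VARIABLE=VALUE pair per line. A minimal hypothetical example (the file name and values are illustrative, not from the source):

```
# app.env — hypothetical environment file (must use the .env extension)
# Lines beginning with # are treated as comments and are ignored.
LOG_LEVEL=debug
API_ENDPOINT=https://example.internal/api
```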

EcsEnvironmentVariable

The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition. You must also specify a container name.

EcsEphemeralStorage

The amount of ephemeral storage to allocate for the task. This parameter is used to expand the total amount of ephemeral storage available, beyond the default amount, for tasks hosted on Fargate. For more information, see Fargate task storage in the Amazon ECS User Guide for Fargate.

This parameter is only supported for tasks hosted on Fargate using Linux platform version 1.4.0 or later. This parameter is not supported for Windows containers on Fargate.

EcsInferenceAcceleratorOverride

Details on an Elastic Inference accelerator task override. This parameter is used to override the Elastic Inference accelerator specified in the task definition. For more information, see Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.

EcsResourceRequirement

The type and amount of a resource to assign to a container. The supported resource types are GPUs and Elastic Inference accelerators. For more information, see Working with GPUs on Amazon ECS or Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.

EcsTaskOverride

The overrides that are associated with a task.

Filter

Filter events using an event pattern. For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.

FilterCriteria

The collection of event patterns used to filter events.

To remove a filter, specify a FilterCriteria object with an empty array of Filter objects.

For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
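For illustration, an event pattern is a JSON document matched against incoming events; each field listed must match for the event to pass the filter. A hypothetical pattern that passes only EC2 instance state-change events for running instances might look like:

```json
{
  "source": ["aws.ec2"],
  "detail-type": ["EC2 Instance State-change Notification"],
  "detail": {
    "state": ["running"]
  }
}
```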

FirehoseLogDestination

The Amazon Data Firehose logging configuration settings for the pipe.

FirehoseLogDestinationParameters

The Amazon Data Firehose logging configuration settings for the pipe.

MultiMeasureAttributeMapping

A mapping of a source event data field to a measure in a Timestream for LiveAnalytics record.

MultiMeasureMapping

Maps multiple measures from the source event to the same Timestream for LiveAnalytics record.

For more information, see Amazon Timestream for LiveAnalytics concepts.

NetworkConfiguration

This structure specifies the network configuration for an Amazon ECS task.

Pipe

An object that represents a pipe. Amazon EventBridge Pipes connects event sources to targets and reduces the need for specialized knowledge and integration code.

PipeEnrichmentHttpParameters

These are custom parameters to be used when the target is an API Gateway REST API or an EventBridge ApiDestination. In the latter case, these are merged with any InvocationParameters specified on the Connection, with any values from the Connection taking precedence.

PipeEnrichmentParameters

The parameters required to set up enrichment on your pipe.

PipeLogConfiguration

The logging configuration settings for the pipe.

PipeLogConfigurationParameters

Specifies the logging configuration settings for the pipe.

When you call UpdatePipe, EventBridge updates the fields in the PipeLogConfigurationParameters object atomically as one and overrides existing values. This is by design. If you don't specify an optional field in any of the Amazon Web Services service parameters objects (CloudwatchLogsLogDestinationParameters, FirehoseLogDestinationParameters, or S3LogDestinationParameters), EventBridge sets that field to its system-default value during the update.

For example, suppose when you created the pipe you specified a Firehose stream log destination. You then update the pipe to add an Amazon S3 log destination. In addition to specifying the S3LogDestinationParameters for the new log destination, you must also specify the fields in the FirehoseLogDestinationParameters object in order to retain the Firehose stream log destination.

For more information on generating pipe log records, see Log EventBridge Pipes in the Amazon EventBridge User Guide.

PipeSourceActiveMqBrokerParameters

The parameters for using an ActiveMQ broker as a source.

PipeSourceDynamoDbStreamParameters

The parameters for using a DynamoDB stream as a source.

PipeSourceKinesisStreamParameters

The parameters for using a Kinesis stream as a source.

PipeSourceManagedStreamingKafkaParameters

The parameters for using an MSK stream as a source.

PipeSourceParameters

The parameters required to set up a source for your pipe.

PipeSourceRabbitMqBrokerParameters

The parameters for using a RabbitMQ broker as a source.

PipeSourceSelfManagedKafkaParameters

The parameters for using a self-managed Apache Kafka stream as a source.

A self-managed cluster refers to any Apache Kafka cluster not hosted by Amazon Web Services. This includes both clusters you manage yourself and those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide.

PipeSourceSqsQueueParameters

The parameters for using an Amazon SQS queue as a source.

PipeTargetBatchJobParameters

The parameters for using a Batch job as a target.

PipeTargetCloudWatchLogsParameters

The parameters for using a CloudWatch Logs log stream as a target.

PipeTargetEcsTaskParameters

The parameters for using an Amazon ECS task as a target.

PipeTargetEventBridgeEventBusParameters

The parameters for using an EventBridge event bus as a target.

PipeTargetHttpParameters

These are custom parameters to be used when the target is an API Gateway REST API or an EventBridge ApiDestination.

PipeTargetKinesisStreamParameters

The parameters for using a Kinesis stream as a target.

PipeTargetLambdaFunctionParameters

The parameters for using a Lambda function as a target.

PipeTargetParameters

The parameters required to set up a target for your pipe.

For more information about pipe target parameters, including how to use dynamic path parameters, see Target parameters in the Amazon EventBridge User Guide.

PipeTargetRedshiftDataParameters

These are custom parameters to be used when the target is an Amazon Redshift cluster, to invoke the Amazon Redshift Data API BatchExecuteStatement operation.

PipeTargetSageMakerPipelineParameters

The parameters for using a SageMaker pipeline as a target.

PipeTargetSqsQueueParameters

The parameters for using an Amazon SQS queue as a target.

PipeTargetStateMachineParameters

The parameters for using a Step Functions state machine as a target.

PipeTargetTimestreamParameters

The parameters for using a Timestream for LiveAnalytics table as a target.

PlacementConstraint

An object representing a constraint on task placement. To learn more, see Task Placement Constraints in the Amazon Elastic Container Service Developer Guide.

PlacementStrategy

The task placement strategy for a task or service. To learn more, see Task Placement Strategies in the Amazon Elastic Container Service Developer Guide.

S3LogDestination

The Amazon S3 logging configuration settings for the pipe.

S3LogDestinationParameters

The Amazon S3 logging configuration settings for the pipe.

SageMakerPipelineParameter

Name/Value pair of a parameter to start execution of a SageMaker Model Building Pipeline.

SelfManagedKafkaAccessConfigurationVpc

This structure specifies the VPC subnets and security groups for the stream, and whether a public IP address is to be used.

SingleMeasureMapping

Maps a single source data field to a single record in the specified Timestream for LiveAnalytics table.

For more information, see Amazon Timestream for LiveAnalytics concepts.

Tag

A key-value pair associated with an Amazon Web Services resource. In EventBridge, rules and event buses support tagging.

UpdatePipeSourceActiveMqBrokerParameters

The parameters for using an ActiveMQ broker as a source.

UpdatePipeSourceDynamoDbStreamParameters

The parameters for using a DynamoDB stream as a source.

UpdatePipeSourceKinesisStreamParameters

The parameters for using a Kinesis stream as a source.

UpdatePipeSourceManagedStreamingKafkaParameters

The parameters for using an MSK stream as a source.

UpdatePipeSourceParameters

The parameters required to set up a source for your pipe.

UpdatePipeSourceRabbitMqBrokerParameters

The parameters for using a RabbitMQ broker as a source.

UpdatePipeSourceSelfManagedKafkaParameters

The parameters for using a self-managed Apache Kafka stream as a source.

A self-managed cluster refers to any Apache Kafka cluster not hosted by Amazon Web Services. This includes both clusters you manage yourself and those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide.

UpdatePipeSourceSqsQueueParameters

The parameters for using an Amazon SQS queue as a source.

ValidationExceptionField

Indicates that an error has occurred while performing a validation operation.

Enums§

AssignPublicIp
When writing a match expression against AssignPublicIp, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
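In practice this means every match on an SDK enum needs a catch-all arm. The sketch below shows the pattern with a local stand-in enum (the real AssignPublicIp lives in the generated crate and carries an unknown-value variant rather than relying on `#[non_exhaustive]` alone):

```rust
// Stand-in for an SDK enum such as AssignPublicIp. Marking it
// #[non_exhaustive] mimics how downstream code must treat SDK enums:
// new variants may appear in a future release.
#[non_exhaustive]
#[derive(Debug, PartialEq)]
enum AssignPublicIp {
    Enabled,
    Disabled,
}

// The wildcard arm keeps this function compiling and behaving sensibly
// even if the enum gains a variant in a later SDK version.
fn describe(value: &AssignPublicIp) -> &'static str {
    match value {
        AssignPublicIp::Enabled => "public IP assigned",
        AssignPublicIp::Disabled => "no public IP",
        _ => "unrecognized setting; handle conservatively",
    }
}

fn main() {
    println!("{}", describe(&AssignPublicIp::Enabled));
}
```

The same wildcard-arm discipline applies to every enum listed below.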
BatchJobDependencyType
When writing a match expression against BatchJobDependencyType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
BatchResourceRequirementType
When writing a match expression against BatchResourceRequirementType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
DimensionValueType
When writing a match expression against DimensionValueType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
DynamoDbStreamStartPosition
When writing a match expression against DynamoDbStreamStartPosition, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
EcsEnvironmentFileType
When writing a match expression against EcsEnvironmentFileType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
EcsResourceRequirementType
When writing a match expression against EcsResourceRequirementType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
EpochTimeUnit
When writing a match expression against EpochTimeUnit, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
IncludeExecutionDataOption
When writing a match expression against IncludeExecutionDataOption, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
KinesisStreamStartPosition
When writing a match expression against KinesisStreamStartPosition, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
LaunchType
When writing a match expression against LaunchType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
LogLevel
When writing a match expression against LogLevel, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
MeasureValueType
When writing a match expression against MeasureValueType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
MqBrokerAccessCredentials

The Secrets Manager secret that stores your broker credentials.

MskAccessCredentials

The Secrets Manager secret that stores your stream credentials.

MskStartPosition
When writing a match expression against MskStartPosition, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
OnPartialBatchItemFailureStreams
When writing a match expression against OnPartialBatchItemFailureStreams, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
PipeState
When writing a match expression against PipeState, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
PipeTargetInvocationType
When writing a match expression against PipeTargetInvocationType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
PlacementConstraintType
When writing a match expression against PlacementConstraintType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
PlacementStrategyType
When writing a match expression against PlacementStrategyType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
PropagateTags
When writing a match expression against PropagateTags, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
RequestedPipeState
When writing a match expression against RequestedPipeState, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
RequestedPipeStateDescribeResponse
When writing a match expression against RequestedPipeStateDescribeResponse, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
S3OutputFormat
When writing a match expression against S3OutputFormat, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
SelfManagedKafkaAccessConfigurationCredentials

The Secrets Manager secret that stores your stream credentials.

SelfManagedKafkaStartPosition
When writing a match expression against SelfManagedKafkaStartPosition, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
TimeFieldType
When writing a match expression against TimeFieldType, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.