Struct aws_sdk_lambda::input::create_event_source_mapping_input::Builder

#[non_exhaustive]
pub struct Builder { /* fields omitted */ }
A builder for CreateEventSourceMappingInput.

Implementations

The Amazon Resource Name (ARN) of the event source.

  • Amazon Kinesis - The ARN of the data stream or a stream consumer.

  • Amazon DynamoDB Streams - The ARN of the stream.

  • Amazon Simple Queue Service - The ARN of the queue.

  • Amazon Managed Streaming for Apache Kafka - The ARN of the cluster.
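
For example, an SQS-backed mapping only needs the queue ARN plus a function identifier. A minimal sketch, assuming the crate's generated CreateEventSourceMappingInput::builder() constructor and the usual snake_case fluent setters; the account ID, queue name, and function name are placeholders:

    use aws_sdk_lambda::input::CreateEventSourceMappingInput;

    fn sqs_mapping_builder() -> aws_sdk_lambda::input::create_event_source_mapping_input::Builder {
        // Hypothetical ARN and function name, for illustration only.
        CreateEventSourceMappingInput::builder()
            .event_source_arn("arn:aws:sqs:us-west-2:123456789012:my-queue")
            .function_name("MyFunction")
    }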

The name of the Lambda function.

Name formats

  • Function name - MyFunction.

  • Function ARN - arn:aws:lambda:us-west-2:123456789012:function:MyFunction.

  • Version or Alias ARN - arn:aws:lambda:us-west-2:123456789012:function:MyFunction:PROD.

  • Partial ARN - 123456789012:function:MyFunction.

The length constraint applies only to the full ARN. If you specify only the function name, it's limited to 64 characters in length.
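
Any of the formats above can be passed to the same setter. A hedged sketch, assuming the standard function_name setter; the names and ARNs are placeholders:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn with_function_name(builder: Builder) -> Builder {
        // Equivalent ways to identify the same hypothetical function:
        //   builder.function_name("MyFunction")
        //   builder.function_name("123456789012:function:MyFunction")
        builder.function_name("arn:aws:lambda:us-west-2:123456789012:function:MyFunction:PROD")
    }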

When true, the event source mapping is active. When false, Lambda pauses polling and invocation.

Default: True
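
To create the mapping in a paused state, so that Lambda does not poll or invoke until you enable it later, the flag can be set explicitly. A sketch, assuming the standard enabled(bool) setter:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn paused(builder: Builder) -> Builder {
        // Defaults to true; false creates the mapping without polling or invocation.
        builder.enabled(false)
    }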

The maximum number of records in each batch that Lambda pulls from your stream or queue and sends to your function. Lambda passes all of the records in the batch to the function in a single call, up to the payload limit for synchronous invocation (6 MB).

  • Amazon Kinesis - Default 100. Max 10,000.

  • Amazon DynamoDB Streams - Default 100. Max 1,000.

  • Amazon Simple Queue Service - Default 10. For standard queues the max is 10,000. For FIFO queues the max is 10.

  • Amazon Managed Streaming for Apache Kafka - Default 100. Max 10,000.

  • Self-Managed Apache Kafka - Default 100. Max 10,000.
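
The batch size is an upper bound; an invocation may carry fewer records if the 6 MB synchronous payload limit or the batching window is reached first. A sketch for a Kinesis source, assuming the usual batch_size(i32) setter:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn kinesis_batching(builder: Builder) -> Builder {
        // Kinesis default is 100; 400 is still well under the 10,000 maximum.
        builder.batch_size(400)
    }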

(Streams and Amazon SQS standard queues) The maximum amount of time, in seconds, that Lambda spends gathering records before invoking the function.

Default: 0

Related setting: When you set BatchSize to a value greater than 10, you must set MaximumBatchingWindowInSeconds to at least 1.
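
The two settings interact: raising BatchSize above 10 is only valid when a batching window of at least one second is also configured. A sketch under that constraint, assuming the corresponding snake_case setters:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn large_sqs_batches(builder: Builder) -> Builder {
        builder
            .batch_size(1000)
            // Required once batch_size exceeds 10.
            .maximum_batching_window_in_seconds(5)
    }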

(Streams only) The number of batches to process from each shard concurrently.

The position in a stream from which to start reading. Required for Amazon Kinesis, Amazon DynamoDB, and Amazon MSK Streams sources. AT_TIMESTAMP is only supported for Amazon Kinesis streams.

With StartingPosition set to AT_TIMESTAMP, the time from which to start reading.
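
For stream sources the position is expressed with the EventSourcePosition enum (TRIM_HORIZON, LATEST, or AT_TIMESTAMP). A sketch, assuming the enum lives at aws_sdk_lambda::model::EventSourcePosition like the other generated model types:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;
    use aws_sdk_lambda::model::EventSourcePosition;

    fn read_from_start(builder: Builder) -> Builder {
        // AT_TIMESTAMP (Kinesis only) would additionally require
        // starting_position_timestamp to be set.
        builder.starting_position(EventSourcePosition::TrimHorizon)
    }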

(Streams only) An Amazon SQS queue or Amazon SNS topic destination for discarded records.
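
The destination is wrapped in a DestinationConfig whose OnFailure member carries the SQS or SNS ARN. A sketch, assuming the DestinationConfig and OnFailure model builders follow the same generated pattern; the queue ARN is a placeholder:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;
    use aws_sdk_lambda::model::{DestinationConfig, OnFailure};

    fn with_failure_destination(builder: Builder) -> Builder {
        let on_failure = OnFailure::builder()
            .destination("arn:aws:sqs:us-west-2:123456789012:discarded-records")
            .build();
        builder.destination_config(DestinationConfig::builder().on_failure(on_failure).build())
    }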

(Streams only) Discard records older than the specified age. The default value is infinite (-1).

(Streams only) If the function returns an error, split the batch in two and retry.

(Streams only) Discard records after the specified number of retries. The default value is infinite (-1). When set to infinite (-1), failed records will be retried until the record expires.

(Streams only) The duration in seconds of a processing window. The range is between 1 second and 900 seconds.
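
Taken together, the stream-only settings above control how aggressively Lambda retries and ages out failing records. A combined sketch, assuming the corresponding snake_case setters:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn stream_error_handling(builder: Builder) -> Builder {
        builder
            .maximum_record_age_in_seconds(3600)  // drop records older than one hour
            .bisect_batch_on_function_error(true) // split the batch to isolate a bad record
            .maximum_retry_attempts(2)            // instead of retrying until expiry (-1)
            .tumbling_window_in_seconds(60)       // one-minute aggregation windows
    }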

Appends an item to topics.

To override the contents of this collection use set_topics.

The name of the Kafka topic.
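
Because topics is a collection, calling the setter repeatedly appends, while set_topics replaces the whole list. A sketch for an MSK or self-managed Kafka source; the topic names are placeholders:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn kafka_topics(builder: Builder) -> Builder {
        // Each call appends one topic; set_topics(Some(vec![...])) would replace the list.
        builder.topics("orders").topics("refunds")
    }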

Appends an item to queues.

To override the contents of this collection use set_queues.

(MQ) The name of the Amazon MQ broker destination queue to consume.
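
An Amazon MQ mapping names a single destination queue on the broker. A sketch, using the appending queues setter; the queue name is a placeholder:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;

    fn mq_queue(builder: Builder) -> Builder {
        builder.queues("my-broker-queue")
    }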

Appends an item to source_access_configurations.

To override the contents of this collection use set_source_access_configurations.

An array of authentication protocols or VPC components required to secure your event source.
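
Each entry pairs an access type with a URI, for example a Secrets Manager secret for broker credentials or a VPC subnet/security group for a self-managed cluster. A sketch, assuming the SourceAccessConfiguration model builder and a SourceAccessType::BasicAuth variant; the secret ARN is a placeholder:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;
    use aws_sdk_lambda::model::{SourceAccessConfiguration, SourceAccessType};

    fn with_broker_credentials(builder: Builder) -> Builder {
        let auth = SourceAccessConfiguration::builder()
            .r#type(SourceAccessType::BasicAuth)
            .uri("arn:aws:secretsmanager:us-west-2:123456789012:secret:broker-credentials")
            .build();
        builder.source_access_configurations(auth) // appends one entry
    }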

The Self-Managed Apache Kafka cluster to receive records from.

Appends an item to function_response_types.

To override the contents of this collection use set_function_response_types.

(Streams only) A list of current response type enums applied to the event source mapping.
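
The response type of practical interest here is ReportBatchItemFailures, which enables partial-batch responses. A sketch, assuming the FunctionResponseType enum in the model module:

    use aws_sdk_lambda::input::create_event_source_mapping_input::Builder;
    use aws_sdk_lambda::model::FunctionResponseType;

    fn partial_batch_responses(builder: Builder) -> Builder {
        builder.function_response_types(FunctionResponseType::ReportBatchItemFailures)
    }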

Consumes the builder and constructs a CreateEventSourceMappingInput.
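
Putting it together, a complete input for an SQS mapping might look like the sketch below. It assumes build() returns a Result, as smithy-generated input builders typically report invalid or missing fields through a build error, and uses placeholder names throughout:

    use aws_sdk_lambda::input::CreateEventSourceMappingInput;

    fn build_sqs_mapping_input() -> Result<CreateEventSourceMappingInput, Box<dyn std::error::Error>> {
        let input = CreateEventSourceMappingInput::builder()
            .event_source_arn("arn:aws:sqs:us-west-2:123456789012:my-queue")
            .function_name("MyFunction")
            .batch_size(10)
            .enabled(true)
            .build()?;
        Ok(input)
    }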

Trait Implementations

Returns a copy of the value. Read more

Performs copy-assignment from source. Read more

Formats the value using the given formatter. Read more

Returns the “default value” for a type. Read more

This method tests for self and other values to be equal, and is used by ==. Read more

This method tests for !=.

Auto Trait Implementations

Blanket Implementations

Gets the TypeId of self. Read more

Immutably borrows from an owned value. Read more

Mutably borrows from an owned value. Read more

Performs the conversion.

Instruments this type with the provided Span, returning an Instrumented wrapper. Read more

Instruments this type with the current Span, returning an Instrumented wrapper. Read more

Performs the conversion.

The resulting type after obtaining ownership.

Creates owned data from borrowed data, usually by cloning. Read more

🔬 This is a nightly-only experimental API. (toowned_clone_into)

Uses borrowed data to replace owned data, usually by cloning. Read more

The type returned in the event of a conversion error.

Performs the conversion.

The type returned in the event of a conversion error.

Performs the conversion.

Attaches the provided Subscriber to this type, returning a WithDispatch wrapper. Read more

Attaches the current default Subscriber to this type, returning a WithDispatch wrapper. Read more