Struct aws_sdk_databasemigration::model::kafka_settings::Builder
pub struct Builder { /* private fields */ }
A builder for KafkaSettings.
Implementations

impl Builder
pub fn broker(self, input: impl Into<String>) -> Self
A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.
pub fn set_broker(self, input: Option<String>) -> Self
A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.
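The broker-hostname-or-ip:port form can be sketched in plain Rust. This is illustrative only: the broker_list helper and the second hostname are hypothetical, not part of this crate.

```rust
// Illustrative helper (not part of aws-sdk-databasemigration): builds the
// comma-separated broker string, each entry in broker-hostname-or-ip:port
// form, as the broker setter expects.
fn broker_list(brokers: &[(&str, u16)]) -> String {
    brokers
        .iter()
        .map(|(host, port)| format!("{host}:{port}"))
        .collect::<Vec<_>>()
        .join(",")
}

fn main() {
    let brokers = broker_list(&[
        ("ec2-12-345-678-901.compute-1.amazonaws.com", 2345),
        ("ec2-98-765-432-109.compute-1.amazonaws.com", 9092), // hypothetical second broker
    ]);
    // One string, entries separated by commas, suitable for broker(...):
    println!("{brokers}");
}
```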
pub fn topic(self, input: impl Into<String>) -> Self
The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.
pub fn set_topic(self, input: Option<String>) -> Self
The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.
pub fn message_format(self, input: MessageFormatValue) -> Self
The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).
pub fn set_message_format(self, input: Option<MessageFormatValue>) -> Self
The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).
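The difference between the two formats can be illustrated with plain string handling. This is a sketch only; the unformat helper is hypothetical and is not how DMS serializes records.

```rust
// Illustrative only: JSON_UNFORMATTED is the record collapsed to a single
// line with no tab, relative to the default pretty-printed JSON. This
// naive filter is for demonstration, not DMS's serializer (it would also
// strip newlines inside string values).
fn unformat(json: &str) -> String {
    json.chars().filter(|c| *c != '\n' && *c != '\t').collect()
}

fn main() {
    let pretty = "{\n\t\"op\": \"insert\",\n\t\"table\": \"orders\"\n}";
    // Prints the single-line form of the same record:
    println!("{}", unformat(pretty));
}
```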
pub fn include_transaction_details(self, input: bool) -> Self
Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.
pub fn set_include_transaction_details(self, input: Option<bool>) -> Self
Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.
pub fn include_partition_value(self, input: bool) -> Self
Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.
pub fn set_include_partition_value(self, input: Option<bool>) -> Self
Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.
pub fn partition_include_schema_table(self, input: bool) -> Self
Prefixes schema and table names to partition values when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.
pub fn set_partition_include_schema_table(self, input: Option<bool>) -> Self
Prefixes schema and table names to partition values when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.
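The throttling scenario above can be sketched with a toy hash partitioner. This is illustrative only: DMS and Kafka use their own partitioning logic, and the "schema.table.pk" key shape and the schema/table names here are hypothetical.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy hash partitioner, not DMS or Kafka internals: maps a partition key
// to one of `n` partitions.
fn partition_for(key: &str, n: u64) -> u64 {
    let mut h = DefaultHasher::new();
    key.hash(&mut h);
    h.finish() % n
}

fn main() {
    let pk = "1"; // the same primary-key value exists in thousands of tables
    // Without the prefix, every table's record with pk = "1" hashes to the
    // same partition, concentrating load there:
    let plain = partition_for(pk, 8);
    // With partition_include_schema_table(true), the key carries the schema
    // and table names, so different tables can land on different partitions:
    let t1 = partition_for(&format!("sbtest.t1.{pk}"), 8);
    let t2 = partition_for(&format!("sbtest.t2.{pk}"), 8);
    println!("plain={plain} t1={t1} t2={t2}");
}
```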
pub fn include_table_alter_operations(self, input: bool) -> Self
Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.
pub fn set_include_table_alter_operations(self, input: Option<bool>) -> Self
Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.
pub fn include_control_details(self, input: bool) -> Self
Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.
pub fn set_include_control_details(self, input: Option<bool>) -> Self
Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.
pub fn message_max_bytes(self, input: i32) -> Self
The maximum size in bytes for records created on the endpoint. The default is 1,000,000.
pub fn set_message_max_bytes(self, input: Option<i32>) -> Self
The maximum size in bytes for records created on the endpoint. The default is 1,000,000.
pub fn include_null_and_empty(self, input: bool) -> Self
Includes NULL and empty columns for records migrated to the endpoint. The default is false.
pub fn set_include_null_and_empty(self, input: Option<bool>) -> Self
Includes NULL and empty columns for records migrated to the endpoint. The default is false.
pub fn security_protocol(self, input: KafkaSecurityProtocol) -> Self
Sets a secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
pub fn set_security_protocol(self, input: Option<KafkaSecurityProtocol>) -> Self
Sets a secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
pub fn ssl_client_certificate_arn(self, input: impl Into<String>) -> Self
The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.
pub fn set_ssl_client_certificate_arn(self, input: Option<String>) -> Self
The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.
pub fn ssl_client_key_arn(self, input: impl Into<String>) -> Self
The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.
pub fn set_ssl_client_key_arn(self, input: Option<String>) -> Self
The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.
pub fn ssl_client_key_password(self, input: impl Into<String>) -> Self
The password for the client private key used to securely connect to a Kafka target endpoint.
pub fn set_ssl_client_key_password(self, input: Option<String>) -> Self
The password for the client private key used to securely connect to a Kafka target endpoint.
pub fn ssl_ca_certificate_arn(self, input: impl Into<String>) -> Self
The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.
pub fn set_ssl_ca_certificate_arn(self, input: Option<String>) -> Self
The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.
pub fn sasl_username(self, input: impl Into<String>) -> Self
The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
pub fn set_sasl_username(self, input: Option<String>) -> Self
The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
pub fn sasl_password(self, input: impl Into<String>) -> Self
The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
pub fn set_sasl_password(self, input: Option<String>) -> Self
The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
pub fn no_hex_prefix(self, input: bool) -> Self
Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format when migrating from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.
pub fn set_no_hex_prefix(self, input: Option<bool>) -> Self
Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format when migrating from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.
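What the '0x' prefix looks like can be shown with a small sketch. The to_hex helper is hypothetical, not a DMS API; it only demonstrates the two output shapes.

```rust
// Illustrative only: renders raw bytes as hexadecimal with and without
// the '0x' prefix that DMS adds by default (NoHexPrefix = true drops it).
fn to_hex(bytes: &[u8], no_hex_prefix: bool) -> String {
    let hex: String = bytes.iter().map(|b| format!("{b:02x}")).collect();
    if no_hex_prefix { hex } else { format!("0x{hex}") }
}

fn main() {
    let raw = [0x0a, 0x1b, 0xff];
    println!("{}", to_hex(&raw, false)); // default: "0x0a1bff"
    println!("{}", to_hex(&raw, true)); // NoHexPrefix: "0a1bff"
}
```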
pub fn build(self) -> KafkaSettings
Consumes the builder and constructs a KafkaSettings.
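Putting the setters together, a typical build might look like the following sketch. It assumes the aws-sdk-databasemigration crate is in scope; the broker, topic, and credential values are placeholders, and the enum variant names follow the SDK's usual naming, so check them against your crate version.

```rust
use aws_sdk_databasemigration::model::{
    KafkaSecurityProtocol, KafkaSettings, MessageFormatValue,
};

let settings: KafkaSettings = KafkaSettings::builder()
    .broker("ec2-12-345-678-901.compute-1.amazonaws.com:2345")
    .topic("dms-sample-topic") // placeholder topic name
    .message_format(MessageFormatValue::JsonUnformatted)
    .security_protocol(KafkaSecurityProtocol::SaslSsl)
    .sasl_username("msk-user") // required with sasl-ssl
    .sasl_password("msk-password") // required with sasl-ssl
    .build();
```

Unset fields are left as None and fall back to the service defaults described above (for example, the "kafka-default-topic" migration topic when no topic is given).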