#[non_exhaustive]
pub struct KafkaStreamingSourceOptions {
    pub bootstrap_servers: Option<String>,
    pub security_protocol: Option<String>,
    pub connection_name: Option<String>,
    pub topic_name: Option<String>,
    pub assign: Option<String>,
    pub subscribe_pattern: Option<String>,
    pub classification: Option<String>,
    pub delimiter: Option<String>,
    pub starting_offsets: Option<String>,
    pub ending_offsets: Option<String>,
    pub poll_timeout_ms: Option<i64>,
    pub num_retries: Option<i32>,
    pub retry_interval_ms: Option<i64>,
    pub max_offsets_per_trigger: Option<i64>,
    pub min_partitions: Option<i32>,
    pub include_headers: Option<bool>,
    pub add_record_timestamp: Option<String>,
    pub emit_consumer_lag_metrics: Option<String>,
    pub starting_timestamp: Option<DateTime>,
}
Additional options for streaming.

Fields (non-exhaustive)

This struct is marked as non-exhaustive: it cannot be constructed using the Struct { .. } literal syntax, cannot be matched against without a wildcard .., and struct update syntax will not work.

bootstrap_servers: Option<String>
A list of bootstrap server URLs, for example b-1.vpc-test-2.o4q88o.c6.kafka.us-east-1.amazonaws.com:9094. This option must be specified in the API call or defined in the table metadata in the Data Catalog.
security_protocol: Option<String>
The protocol used to communicate with brokers. The possible values are "SSL" or "PLAINTEXT".

connection_name: Option<String>
The name of the connection.

topic_name: Option<String>
The topic name as specified in Apache Kafka. You must specify at least one of "topicName", "assign" or "subscribePattern".

assign: Option<String>
The specific TopicPartitions to consume. You must specify at least one of "topicName", "assign" or "subscribePattern".

subscribe_pattern: Option<String>
A Java regex string that identifies the topic list to subscribe to. You must specify at least one of "topicName", "assign" or "subscribePattern".

classification: Option<String>
An optional classification.

delimiter: Option<String>
Specifies the delimiter character.

starting_offsets: Option<String>
The starting position in the Kafka topic to read data from. The possible values are "earliest" or "latest". The default value is "latest".

ending_offsets: Option<String>
The end point when a batch query is ended. Possible values are either "latest" or a JSON string that specifies an ending offset for each TopicPartition.

poll_timeout_ms: Option<i64>
The timeout in milliseconds to poll data from Kafka in Spark job executors. The default value is 512.

num_retries: Option<i32>
The number of times to retry before failing to fetch Kafka offsets. The default value is 3.

retry_interval_ms: Option<i64>
The time in milliseconds to wait before retrying to fetch Kafka offsets. The default value is 10.

max_offsets_per_trigger: Option<i64>
The rate limit on the maximum number of offsets that are processed per trigger interval. The specified total number of offsets is proportionally split across topicPartitions of different volumes. The default value is null, which means that the consumer reads all offsets until the known latest offset.

min_partitions: Option<i32>
The desired minimum number of partitions to read from Kafka. The default value is null, which means that the number of Spark partitions is equal to the number of Kafka partitions.

include_headers: Option<bool>
Whether to include the Kafka headers. When the option is set to "true", the data output contains an additional column named "glue_streaming_kafka_headers" with type Array[Struct(key: String, value: String)]. The default value is "false". This option is available in Glue version 3.0 or later only.

add_record_timestamp: Option<String>
When this option is set to 'true', the data output contains an additional column named "__src_timestamp" that indicates the time when the corresponding record was received by the topic. The default value is 'false'. This option is supported in Glue version 4.0 or later.

emit_consumer_lag_metrics: Option<String>
When this option is set to 'true', for each batch, it emits to CloudWatch a metric for the duration between the oldest record received by the topic and the time it arrives in Glue. The metric's name is "glue.driver.streaming.maxConsumerLagInMs". The default value is 'false'. This option is supported in Glue version 4.0 or later.

starting_timestamp: Option<DateTime>
The timestamp of the record in the Kafka topic to start reading data from. The possible value is a timestamp string in UTC format of the pattern yyyy-mm-ddTHH:MM:SSZ (where Z represents a UTC timezone offset with a +/-; for example, "2023-04-04T08:00:00+08:00").

Only one of StartingTimestamp or StartingOffsets must be set.
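Several of these string-typed options carry small JSON documents rather than plain values. The shapes below follow the conventions of Spark's Kafka source, which Glue streaming builds on; the helper function, topic name, and offsets are illustrative assumptions, not part of the Glue API:

```rust
// Build the per-partition JSON document that "endingOffsets" accepts, per
// Spark's Kafka source conventions (assumption: Glue uses the same shape).
fn ending_offsets_json(topic: &str, per_partition: &[(u32, i64)]) -> String {
    let parts: Vec<String> = per_partition
        .iter()
        .map(|(p, off)| format!("\"{p}\":{off}"))
        .collect();
    format!("{{\"{topic}\":{{{}}}}}", parts.join(","))
}

fn main() {
    // In Spark's convention, an ending offset of -1 means "latest".
    let ending = ending_offsets_json("orders", &[(0, 23), (1, -1)]);
    assert_eq!(ending, r#"{"orders":{"0":23,"1":-1}}"#);
    println!("{ending}");

    // "assign" takes a different shape: topic -> list of partitions,
    // e.g. {"orders":[0,1]} (also a Spark convention, shown for contrast).
}
```

Note that "latest" may also be passed as a bare string instead of a per-partition document, as described under ending_offsets above.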
Implementations

impl KafkaStreamingSourceOptions

Accessor methods return the value of the corresponding field; each accessor's documentation matches its field above:

pub fn bootstrap_servers(&self) -> Option<&str>
pub fn security_protocol(&self) -> Option<&str>
pub fn connection_name(&self) -> Option<&str>
pub fn topic_name(&self) -> Option<&str>
pub fn assign(&self) -> Option<&str>
pub fn subscribe_pattern(&self) -> Option<&str>
pub fn classification(&self) -> Option<&str>
pub fn starting_offsets(&self) -> Option<&str>
pub fn ending_offsets(&self) -> Option<&str>
pub fn poll_timeout_ms(&self) -> Option<i64>
pub fn num_retries(&self) -> Option<i32>
pub fn retry_interval_ms(&self) -> Option<i64>
pub fn max_offsets_per_trigger(&self) -> Option<i64>
pub fn min_partitions(&self) -> Option<i32>
pub fn include_headers(&self) -> Option<bool>
pub fn add_record_timestamp(&self) -> Option<&str>
pub fn emit_consumer_lag_metrics(&self) -> Option<&str>
pub fn starting_timestamp(&self) -> Option<&DateTime>

impl KafkaStreamingSourceOptions

pub fn builder() -> KafkaStreamingSourceOptionsBuilder
Creates a new builder-style object to manufacture KafkaStreamingSourceOptions.
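The builder is the intended way to construct this non-exhaustive struct. Since compiling against the real SDK is out of scope here, the sketch below uses a toy stand-in with the same shape: one consuming setter per field returning the builder, and build() producing the options. The type names and the trimmed three-field set are illustrative, not the SDK's:

```rust
// Minimal stand-in for the SDK's builder pattern: each setter takes an owned
// value, stores it as Some(..), and returns the builder; build() yields the
// options struct. The real type has 19 fields; three are shown for brevity.
#[derive(Debug, Default, Clone, PartialEq)]
struct KafkaOptions {
    bootstrap_servers: Option<String>,
    topic_name: Option<String>,
    starting_offsets: Option<String>,
}

#[derive(Default)]
struct KafkaOptionsBuilder {
    inner: KafkaOptions,
}

impl KafkaOptionsBuilder {
    fn bootstrap_servers(mut self, v: impl Into<String>) -> Self {
        self.inner.bootstrap_servers = Some(v.into());
        self
    }
    fn topic_name(mut self, v: impl Into<String>) -> Self {
        self.inner.topic_name = Some(v.into());
        self
    }
    fn starting_offsets(mut self, v: impl Into<String>) -> Self {
        self.inner.starting_offsets = Some(v.into());
        self
    }
    fn build(self) -> KafkaOptions {
        self.inner
    }
}

fn main() {
    let opts = KafkaOptionsBuilder::default()
        .bootstrap_servers("b-1.example.amazonaws.com:9094")
        .topic_name("orders")
        .starting_offsets("latest")
        .build();
    assert_eq!(opts.topic_name.as_deref(), Some("orders"));
    println!("{opts:?}");
}
```

Setters taking impl Into<String> let callers pass either &str or String, which is the usual ergonomic choice for fluent builders.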
Trait Implementations

impl Clone for KafkaStreamingSourceOptions
    fn clone(&self) -> KafkaStreamingSourceOptions
    fn clone_from(&mut self, source: &Self)

impl Debug for KafkaStreamingSourceOptions

impl StructuralPartialEq for KafkaStreamingSourceOptions

Auto Trait Implementations

impl Freeze for KafkaStreamingSourceOptions
impl RefUnwindSafe for KafkaStreamingSourceOptions
impl Send for KafkaStreamingSourceOptions
impl Sync for KafkaStreamingSourceOptions
impl Unpin for KafkaStreamingSourceOptions
impl UnwindSafe for KafkaStreamingSourceOptions
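As the non-exhaustive notice above states, downstream pattern matches against this struct must include a rest pattern. A self-contained sketch of the constraint, using a local toy struct (the restriction only bites across crate boundaries, so this single file compiles; the `..` shown is what downstream code is forced to write):

```rust
// What #[non_exhaustive] means for consumers: outside the defining crate the
// struct cannot be built with literal syntax, and every destructuring pattern
// must end with `..` so it stays valid if fields are added later.
#[non_exhaustive]
#[derive(Debug, Default)]
struct Opts {
    topic_name: Option<String>,
}

fn main() {
    let opts = Opts::default();
    // The `..` rest pattern is mandatory for downstream matches.
    let Opts { topic_name, .. } = &opts;
    assert!(topic_name.is_none());
    println!("{opts:?}");
}
```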