Struct async_nats::jetstream::consumer::pull::BatchBuilder
pub struct BatchBuilder<'a> { /* private fields */ }
Used for building the configuration for a Batch. Created by [Consumer::batch_builder] on a Consumer.
Examples
use futures::StreamExt;
use async_nats::jetstream::consumer::PullConsumer;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events").await?
.get_consumer("pull").await?;
let mut messages = consumer.batch()
.max_messages(100)
.max_bytes(1024)
.messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
Implementations
impl<'a> BatchBuilder<'a>
pub fn new(consumer: &'a Consumer<Config>) -> Self
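A minimal sketch of constructing the builder directly from a borrowed Consumer<Config>, assuming a Tokio runtime, a server at localhost:4222, and an existing pull consumer named "pull" on an "events" stream (as in the other examples on this page); the consumer.batch() chain used elsewhere on this page is the usual entry point.
use futures::StreamExt;
use async_nats::jetstream::consumer::pull::{BatchBuilder, Config};
use async_nats::jetstream::consumer::Consumer;

#[tokio::main]
async fn main() -> Result<(), async_nats::Error> {
    let client = async_nats::connect("localhost:4222").await?;
    let jetstream = async_nats::jetstream::new(client);
    // Fetch an existing pull consumer; `new` borrows it for the
    // lifetime of the builder.
    let consumer: Consumer<Config> = jetstream
        .get_stream("events").await?
        .get_consumer("pull").await?;
    let mut messages = BatchBuilder::new(&consumer)
        .max_messages(100)
        .messages().await?;
    while let Some(message) = messages.next().await {
        let message = message?;
        message.ack().await?;
    }
    Ok(())
}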
pub fn max_bytes(self, max_bytes: usize) -> Self
Sets the maximum number of bytes that can be buffered on the Client while already received messages are being processed. Higher values yield better performance, but can also increase memory usage if the application acknowledges messages much more slowly than they arrive.
The default value should provide a reasonable balance between performance and memory usage.
Examples
use futures::StreamExt;
use async_nats::jetstream::consumer::PullConsumer;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events").await?
.get_consumer("pull").await?;
let mut messages = consumer.batch()
.max_bytes(1024)
.messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub fn max_messages(self, batch: usize) -> Self
Sets the maximum number of messages that can be buffered on the Client while already received messages are being processed. Higher values yield better performance, but can also increase memory usage if the application acknowledges messages much more slowly than they arrive.
The default value should provide a reasonable balance between performance and memory usage.
Examples
use futures::StreamExt;
use async_nats::jetstream::consumer::PullConsumer;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events").await?
.get_consumer("pull").await?;
let mut messages = consumer.batch()
.max_messages(100)
.messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub fn hearbeat(self, hearbeat: Duration) -> Self
Sets the heartbeat, which will be sent by the server if there are no new messages pending for a given Consumer.
Examples
use futures::StreamExt;
use async_nats::jetstream::consumer::PullConsumer;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events").await?
.get_consumer("pull").await?;
let mut messages = consumer.batch()
.hearbeat(std::time::Duration::from_secs(10))
.messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub fn expires(self, expires: Duration) -> Self
A low-level API that does not need tweaking for most use cases. Sets how long each batch request waits for the whole batch of messages before timing out.
Examples
use futures::StreamExt;
use async_nats::jetstream::consumer::PullConsumer;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events").await?
.get_consumer("pull").await?;
let mut messages = consumer.batch()
.expires(std::time::Duration::from_secs(30))
.messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
pub async fn messages(self) -> Result<Batch, Error>
Creates the actual Stream of messages with the provided configuration.
Examples
use futures::StreamExt;
use async_nats::jetstream::consumer::PullConsumer;
let client = async_nats::connect("localhost:4222").await?;
let jetstream = async_nats::jetstream::new(client);
let consumer: PullConsumer = jetstream
.get_stream("events").await?
.get_consumer("pull").await?;
let mut messages = consumer.batch()
.max_messages(100)
.messages().await?;
while let Some(message) = messages.next().await {
let message = message?;
println!("message: {:?}", message);
message.ack().await?;
}
Auto Trait Implementations
impl<'a> !RefUnwindSafe for BatchBuilder<'a>
impl<'a> Send for BatchBuilder<'a>
impl<'a> Sync for BatchBuilder<'a>
impl<'a> Unpin for BatchBuilder<'a>
impl<'a> !UnwindSafe for BatchBuilder<'a>
Blanket Implementations
impl<T> BorrowMut<T> for T where
T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.