# jetstream-extra

A set of utilities and extensions for the NATS JetStream API of the async-nats crate.
## Features
- Batch Publishing - Atomic batch publishing ensuring all-or-nothing message storage
- Fast Ingest Batch Publishing - High-throughput, non-atomic batch publishing with server-driven flow control (requires nats-server 2.14+)
- Batch Fetching - Efficient multi-message retrieval using DIRECT.GET API
## Batch Publishing
Atomic batch publishing implementation for JetStream streams, ensuring that either all messages in a batch are stored or none are.
### Complete example
Connect to a NATS server with JetStream enabled, then extend the JetStream context with batch publishing capabilities.
```rust
use async_nats::jetstream;
use jetstream_extra::batch_publish::BatchPublishExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = async_nats::connect("demo.nats.io").await?;
    let jetstream = jetstream::new(client);

    let _stream = jetstream
        .get_or_create_stream(jetstream::stream::Config {
            name: "events".to_string(),
            subjects: vec!["events.*".to_string()],
            allow_atomic_publish: true,
            ..Default::default()
        })
        .await?;

    let mut batch = jetstream.batch_publish().build();
    batch.add("events.order", "order-123".into()).await?;
    batch.add("events.payment", "payment-456".into()).await?;
    batch.add("events.inventory", "item-789".into()).await?;

    let ack = batch
        .commit("events.notification", "notify-complete".into())
        .await?;
    println!("Batch published with sequence: {}", ack.sequence);

    Ok(())
}
```
## Fast Ingest Batch Publishing
High-throughput, non-atomic batch publishing using JetStream's fast-ingest feature (ADR-50, requires nats-server 2.14 or later). Unlike atomic batch publishing, messages are persisted as they arrive and the server uses a flow-control channel to coordinate throughput across concurrent publishers.
Use fast ingest when:
- You need to ship millions of messages per batch and don't need all-or-nothing semantics.
- Throughput matters more than atomicity.
- You want the server to dynamically tune ack frequency based on load.
The stream must be configured with `allow_batched: true`. The publisher owns a dedicated inbox subscription for the duration of the batch and drives ack handling inline: no background task, no locks.
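Enabling batching on the stream might look like the following sketch, modeled on the atomic example above. The `allow_batched` field name is an assumption taken from the note above; check the `Config` struct of your async-nats version for the exact field.

```rust
use async_nats::jetstream;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = async_nats::connect("nats://127.0.0.1:4222").await?;
    let jetstream = jetstream::new(client);

    // The fast-ingest publisher requires batching enabled on the stream.
    // `allow_batched` is assumed by analogy with `allow_atomic_publish`;
    // it requires nats-server 2.14 or later.
    let _stream = jetstream
        .get_or_create_stream(jetstream::stream::Config {
            name: "metrics".to_string(),
            subjects: vec!["metrics.*".to_string()],
            allow_batched: true,
            ..Default::default()
        })
        .await?;

    Ok(())
}
```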
### Complete example
```rust
use async_nats::jetstream;
use jetstream_extra::batch_publish_fast::{FastPublishExt, GapMode};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = async_nats::connect("nats://127.0.0.1:4222").await?;
    let jetstream = jetstream::new(client);

    let mut batch = jetstream
        .fast_publish()
        .flow(100)
        .max_outstanding_acks(2)
        .gap_mode(GapMode::Fail)
        .ack_timeout(Duration::from_secs(10))
        .on_error(|e| eprintln!("fast publish event: {e}"))
        .build()?;

    for i in 0..10_000 {
        batch.add("metrics.cpu", format!("sample {i}").into()).await?;
    }

    let ack = batch.close().await?;
    println!(
        "committed {} messages as batch {}",
        ack.batch_size, ack.batch_id
    );

    Ok(())
}
```
See `examples/fast_publisher.rs` for a runnable example.
## Batch Fetching
Efficient batch fetching of messages from JetStream streams using the DIRECT.GET API, supporting:
- Fetching multiple messages in a single request
- Subject filtering with wildcards
- Sequence and time-based ranges
- Multi-subject last message queries
### Fetch a batch of messages
```rust
use async_nats::jetstream;
use jetstream_extra::batch_fetch::BatchFetchExt;
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = async_nats::connect("demo.nats.io").await?;
    let context = jetstream::new(client);

    let mut messages = context
        .get_batch("my_stream", 100)
        .send()
        .await?;

    while let Some(msg) = messages.next().await {
        let msg = msg?;
        println!("Message at seq {}: {:?}", msg.sequence, msg.subject);
    }

    Ok(())
}
```
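The feature list above also mentions subject filtering and sequence-based ranges. A sketch of how that could combine with `get_batch` follows; the builder method names `subject` and `start_sequence` are assumptions for illustration, not confirmed API, so consult the crate's rustdoc for the actual methods.

```rust
use async_nats::jetstream;
use jetstream_extra::batch_fetch::BatchFetchExt;
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = async_nats::connect("demo.nats.io").await?;
    let context = jetstream::new(client);

    // Hypothetical builder options: a wildcard subject filter plus a
    // starting sequence. Method names are assumptions; the underlying
    // DIRECT.GET API supports both kinds of constraint.
    let mut messages = context
        .get_batch("my_stream", 100)
        .subject("events.*")
        .start_sequence(1_000)
        .send()
        .await?;

    while let Some(msg) = messages.next().await {
        println!("{:?}", msg?.subject);
    }

    Ok(())
}
```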
### Get last messages for multiple subjects
```rust
use async_nats::jetstream;
use jetstream_extra::batch_fetch::BatchFetchExt;
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = async_nats::connect("demo.nats.io").await?;
    let context = jetstream::new(client);

    let subjects = vec![
        "sensors.temp".to_string(),
        "sensors.humidity".to_string(),
        "sensors.pressure".to_string(),
    ];

    let mut messages = context
        .get_last_messages_for("sensor_stream")
        .subjects(subjects)
        .send()
        .await?;

    while let Some(msg) = messages.next().await {
        let msg = msg?;
        println!("Last value for {}: {:?}", msg.subject, msg.payload);
    }

    Ok(())
}
```