Trait BatchedQueueTrait

pub trait BatchedQueueTrait<T> {
    // Required methods
    fn len(&self) -> usize;
    fn capacity(&self) -> usize;
    fn is_empty(&self) -> bool;
    fn push(&self, item: T) -> Result<(), BatchedQueueError>;
    fn try_next_batch(&self) -> Result<Option<Vec<T>>, BatchedQueueError>;
    fn next_batch(&self) -> Result<Vec<T>, BatchedQueueError>;
    fn next_batch_timeout(
        &self,
        timeout: Duration,
    ) -> Result<Vec<T>, BatchedQueueError>;
    fn flush(&self) -> Result<(), BatchedQueueError>;
}

Defines the common interface for batched queue implementations.

This trait provides methods for adding items to a queue, retrieving batches of items, and checking queue status. Implementations buffer items until they form complete batches and provide a mechanism for flushing partial batches when needed.

§Examples

use batched_queue::{BatchedQueue, BatchedQueueTrait};

// Create a queue with batch size of 10
let queue = BatchedQueue::new(10).expect("Failed to create queue");

// Create a sender that can be shared across threads
let sender = queue.create_sender();

// Push items to the queue (in real usage, this would be in another thread)
for i in 0..25 {
    sender.push(i).expect("Failed to push item");
}

// Flush any remaining items that haven't formed a complete batch
sender.flush().expect("Failed to flush");

// Process batches
while let Some(batch) = queue.try_next_batch().expect("Failed to get batch") {
    println!("Processing batch of {} items", batch.len());
    for item in batch {
        // Process each item
        println!("  Item: {}", item);
    }
}

Required Methods§

fn len(&self) -> usize

Returns the current number of items in the queue.

fn capacity(&self) -> usize

Returns the maximum number of items a batch can hold.

fn is_empty(&self) -> bool

Returns true if the queue has no items waiting to be processed.
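§Examples

A minimal sketch of checking queue status; the constructor and push calls mirror the trait-level example above, and exactly how buffered (not yet batched) items are counted is implementation-specific:

use batched_queue::{BatchedQueue, BatchedQueueTrait};

let queue = BatchedQueue::new(4).expect("Failed to create queue");
assert!(queue.is_empty());
assert_eq!(queue.capacity(), 4);

// Two items are buffered but do not yet form a complete batch.
queue.push("a").expect("Failed to push item");
queue.push("b").expect("Failed to push item");
println!("len = {}, empty = {}", queue.len(), queue.is_empty());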

fn push(&self, item: T) -> Result<(), BatchedQueueError>

Adds an item to the queue.

If adding this item causes the current batch to reach the configured batch size, the batch will be automatically sent for processing.

§Arguments
  • item - The item to add to the queue
§Errors

Returns BatchedQueueError::Disconnected if the receiving end has been dropped, or other implementation-specific errors.
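§Examples

A minimal sketch of the automatic dispatch described above, using the constructor from the trait-level example:

use batched_queue::{BatchedQueue, BatchedQueueTrait};

let queue = BatchedQueue::new(3).expect("Failed to create queue");

// The third push completes a batch, which is handed off for processing.
for i in 0..3 {
    queue.push(i).expect("Failed to push item");
}

// A full batch should now be available without blocking.
if let Some(batch) = queue.try_next_batch().expect("Failed to get batch") {
    println!("Batch ready: {:?}", batch);
}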

fn try_next_batch(&self) -> Result<Option<Vec<T>>, BatchedQueueError>

Attempts to retrieve the next batch of items without blocking.

§Returns
  • Ok(Some(batch)) - A batch of items is available
  • Ok(None) - No batch is currently available
§Errors

Returns BatchedQueueError::Disconnected if the sending end has been dropped, or other implementation-specific errors.
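§Examples

A non-blocking polling sketch: full batches are drained, while leftover items stay buffered until more arrive or flush is called:

use batched_queue::{BatchedQueue, BatchedQueueTrait};

let queue = BatchedQueue::new(5).expect("Failed to create queue");
for i in 0..12 {
    queue.push(i).expect("Failed to push item");
}

// Two full batches of 5 are drained; the remaining 2 items stay buffered.
while let Some(batch) = queue.try_next_batch().expect("Failed to get batch") {
    println!("Got a batch of {} items", batch.len());
}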

fn next_batch(&self) -> Result<Vec<T>, BatchedQueueError>

Retrieves the next batch of items, blocking until one is available.

§Errors

Returns BatchedQueueError::Disconnected if the sending end has been dropped, or other implementation-specific errors.
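§Examples

A sketch of blocking consumption with a producer thread, relying on the shareable sender shown in the trait-level example:

use std::thread;

use batched_queue::{BatchedQueue, BatchedQueueTrait};

let queue = BatchedQueue::new(10).expect("Failed to create queue");
let sender = queue.create_sender();

// Producer thread fills exactly one batch.
let producer = thread::spawn(move || {
    for i in 0..10 {
        sender.push(i).expect("Failed to push item");
    }
});

// Blocks until the producer has completed a full batch.
let batch = queue.next_batch().expect("Failed to get batch");
println!("Received {} items", batch.len());

producer.join().unwrap();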

fn next_batch_timeout(&self, timeout: Duration) -> Result<Vec<T>, BatchedQueueError>

Retrieves the next batch of items, blocking until one is available or until the timeout expires.

§Arguments
  • timeout - Maximum time to wait for a batch to become available
§Errors

Returns:

  • BatchedQueueError::Timeout if no batch becomes available within the timeout period
  • BatchedQueueError::Disconnected if the sending end has been dropped
  • Other implementation-specific errors
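§Examples

A sketch of bounded waiting that distinguishes a timeout from other errors; it assumes BatchedQueueError is exported at the crate root and implements Debug:

use std::time::Duration;

use batched_queue::{BatchedQueue, BatchedQueueError, BatchedQueueTrait};

let queue = BatchedQueue::new(10).expect("Failed to create queue");
queue.push(1).expect("Failed to push item");

// Only one item is buffered, so no complete batch arrives within 50 ms.
match queue.next_batch_timeout(Duration::from_millis(50)) {
    Ok(batch) => println!("Got {} items", batch.len()),
    Err(BatchedQueueError::Timeout) => println!("No batch within the timeout"),
    Err(e) => eprintln!("Queue error: {:?}", e),
}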

fn flush(&self) -> Result<(), BatchedQueueError>

Flushes any pending items into a batch, even if the batch is not full.

This is useful for ensuring that all items are processed, especially during shutdown or when batches need to be processed on demand.

§Errors

Returns BatchedQueueError::Disconnected if the receiving end has been dropped, or other implementation-specific errors.
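§Examples

A sketch of the shutdown pattern mentioned above: flush the partial batch, then drain whatever is queued:

use batched_queue::{BatchedQueue, BatchedQueueTrait};

let queue = BatchedQueue::new(8).expect("Failed to create queue");
for i in 0..5 {
    queue.push(i).expect("Failed to push item");
}

// Only 5 of the 8 slots are filled; flush turns them into a batch anyway.
queue.flush().expect("Failed to flush");

while let Some(batch) = queue.try_next_batch().expect("Failed to get batch") {
    println!("Draining batch of {} items", batch.len());
}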

Implementors§

impl<T: Send + 'static> BatchedQueueTrait<T> for BatchedQueue<T>