# batched-queue

A high-performance, highly concurrent batched queue implementation for Rust.

## Overview

`batched-queue` provides an efficient way to collect individual items into batches for processing, which can significantly improve throughput in high-volume systems. The library offers both synchronous and asynchronous implementations, making it suitable for a wide range of applications.
## Features

- Automatic Batching: Collects individual items into batches of configurable size
- Multiple Implementations:
  - Synchronous (default) using `parking_lot` and `crossbeam-channel`
  - Asynchronous using `tokio` (via feature flag)
- Thread-safe: Designed for concurrent usage with multiple producers and consumers
- Backpressure Control: Optional bounded queue to manage memory usage
- Flexible Retrieval: Blocking, non-blocking, and timeout-based batch retrieval methods
- High Performance: Optimized for low contention and high throughput
## Installation

Add `batched-queue` to your `Cargo.toml`:

```toml
[dependencies]
batched-queue = "0.1.0"
```
### Feature Flags

- `sync` (default): Enables the synchronous implementation using `parking_lot` and `crossbeam-channel`
- `async`: Enables the asynchronous implementation using `tokio`

To use the async implementation:

```toml
[dependencies]
batched-queue = { version = "0.1.0", default-features = false, features = ["async"] }
```
## Usage Examples

### Basic Usage
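The original example code did not survive here, and the crate's exact API is not shown elsewhere in this README, so the following is a minimal stand-in built only on `std` primitives to illustrate the batching pattern: items accumulate in a shared buffer, and a full batch is handed back once the configured size is reached. `SketchQueue` and its methods are assumptions for illustration, not `batched-queue`'s real types.

```rust
// Illustrative stand-in, NOT the batched-queue crate's actual API:
// a shared buffer that releases a full batch every `batch_size` pushes.
use std::sync::Mutex;

struct SketchQueue<T> {
    batch_size: usize,
    buf: Mutex<Vec<T>>,
}

impl<T> SketchQueue<T> {
    fn new(batch_size: usize) -> Self {
        Self {
            batch_size,
            buf: Mutex::new(Vec::new()),
        }
    }

    /// Push one item; returns a completed batch once `batch_size`
    /// items have accumulated, and `None` otherwise.
    fn push(&self, item: T) -> Option<Vec<T>> {
        let mut buf = self.buf.lock().unwrap();
        buf.push(item);
        if buf.len() >= self.batch_size {
            Some(std::mem::take(&mut *buf))
        } else {
            None
        }
    }
}

fn main() {
    let queue = SketchQueue::new(3);
    assert_eq!(queue.push(1), None);
    assert_eq!(queue.push(2), None);
    // The third push completes the batch and drains the buffer.
    assert_eq!(queue.push(3), Some(vec![1, 2, 3]));
}
```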
### Multi-threaded Usage
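Again in place of the lost example, here is a hedged sketch of the multi-producer pattern using only `std::sync::mpsc` and `std::thread`: several producer threads feed a channel, and a single consumer drains it into fixed-size batches. The `collect_batches` helper is an illustration of the batching idea, not a function exported by `batched-queue`.

```rust
// Illustrative stand-in, NOT the batched-queue crate's actual API:
// four producers feed a channel; one consumer groups items into batches of 10.
use std::sync::mpsc;
use std::thread;

/// Drain a receiver into fixed-size batches; the final batch may be partial.
fn collect_batches(rx: mpsc::Receiver<u32>, batch_size: usize) -> Vec<Vec<u32>> {
    let mut batches = Vec::new();
    let mut current = Vec::with_capacity(batch_size);
    for item in rx {
        current.push(item);
        if current.len() == batch_size {
            batches.push(std::mem::take(&mut current));
        }
    }
    if !current.is_empty() {
        batches.push(current); // leftover items after all senders disconnect
    }
    batches
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let producers: Vec<_> = (0..4)
        .map(|p| {
            let tx = tx.clone();
            thread::spawn(move || {
                for i in 0..25 {
                    tx.send(p * 100 + i).unwrap();
                }
            })
        })
        .collect();
    drop(tx); // close the channel once the producer clones are done

    let consumer = thread::spawn(move || collect_batches(rx, 10));
    for producer in producers {
        producer.join().unwrap();
    }
    let batches = consumer.join().unwrap();

    // 4 producers x 25 items = 100 items, so 10 full batches of 10.
    assert_eq!(batches.iter().map(Vec::len).sum::<usize>(), 100);
    assert!(batches.iter().all(|b| b.len() <= 10));
}
```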
### Bounded Queue with Backpressure
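The backpressure idea can be sketched with `std::sync::mpsc::sync_channel`, whose bounded capacity blocks senders once the buffer is full, capping memory use the way a bounded batched queue would. This is a stand-in under that assumption; it does not show `batched-queue`'s actual bounded-queue constructor or methods.

```rust
// Illustrative stand-in, NOT the batched-queue crate's actual API:
// a bounded channel blocks the producer whenever the consumer falls behind.
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

fn main() {
    // At most 8 undelivered items may be buffered at any moment.
    let (tx, rx) = mpsc::sync_channel::<u64>(8);

    let producer = thread::spawn(move || {
        for i in 0..32 {
            // `send` blocks while the buffer holds 8 items: backpressure.
            tx.send(i).unwrap();
        }
    });

    let mut received = Vec::new();
    for item in rx {
        thread::sleep(Duration::from_millis(1)); // deliberately slow consumer
        received.push(item);
    }

    producer.join().unwrap();
    assert_eq!(received.len(), 32);
}
```

The capacity bound is the backpressure knob: a slow consumer throttles producers instead of letting the queue grow without limit.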
## Performance

`batched-queue` is designed for high performance in concurrent environments:

- Optimized for minimal lock contention
- Uses lock-free algorithms where possible, and fine-grained locking elsewhere
- Can achieve millions of items per second on modern hardware
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.