Flexible thread pool implementations for all your multithreaded processing needs.
Overview
This library includes two thread pool implementations, [Threadpool] and [OrderedThreadpool].

- [Threadpool] consumes jobs passed to it, processes them, and then yields them back out in the order they finish.
- [OrderedThreadpool] consumes jobs passed to it, processes them, and then yields them out in the same order they were received.
Aside from a few small details, they share many of the same features and properties.
Thread pools can be used for many purposes: they can be spawned momentarily to process a single iterator and then destroyed (although for that purpose, rayon is likely a better choice); they can be spawned for long-running tasks that require multiple workers to respond, like a web service or a database management system; they can be spun up and reused multiple times to avoid the overhead of repeatedly creating and destroying threads for repeated parallel work; the list goes on.
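For the single-shot "process one iterator and throw the pool away" case, rayon gets you there in a couple of lines. A minimal sketch, assuming the rayon crate and its parallel-iterator prelude:

```rust
use rayon::prelude::*;

// Process one iterator in parallel and collect the results; rayon manages
// its own worker pool behind the scenes.
let squares: Vec<u64> = (0..1_000u64).into_par_iter().map(|x| x * x).collect();
assert_eq!(squares[10], 100);
```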
Additionally, this library provides a little helper utility, [ReduceAsync::reduce_async], which implements a unique feedback-loop-style, fully parallelized reducing algorithm. It pairs nicely with the thread pools, so a suite of filtering, mapping, and reduction extensions is available to make its use in your code simple and straightforward.
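To give a rough feel for what "feedback-loop style" means, here is a conceptual sketch built only on std channels and scoped threads. It is not this crate's implementation; the function name, worker count, and element type are all arbitrary choices for illustration:

```rust
use std::sync::{mpsc, Mutex};
use std::thread;

// Conceptual sketch of a feedback-loop reduction: workers pull a pair of
// values off a queue, combine them, and feed the result back into the same
// queue, so independent combinations can run in parallel.
fn feedback_reduce(values: Vec<u64>, combine: fn(u64, u64) -> u64) -> Option<u64> {
    let n = values.len();
    if n == 0 {
        return None;
    }
    let (job_tx, job_rx) = mpsc::channel::<(u64, u64)>();
    let (val_tx, val_rx) = mpsc::channel::<u64>();
    for v in values {
        val_tx.send(v).unwrap(); // seed the queue with the inputs
    }
    let job_rx = Mutex::new(job_rx); // let the workers share one receiver
    thread::scope(|s| {
        for _ in 0..4 {
            let val_tx = val_tx.clone();
            let job_rx = &job_rx;
            s.spawn(move || loop {
                // Wait for the next job; the lock is released before combining,
                // so multiple combinations can run at once.
                let job = job_rx.lock().unwrap().recv();
                match job {
                    // Combine the pair and feed the result back into the queue.
                    Ok((a, b)) => val_tx.send(combine(a, b)).unwrap(),
                    // The job channel closed: the reduction is finished.
                    Err(_) => break,
                }
            });
        }
        // n values require exactly n - 1 pairwise combinations.
        for _ in 0..n - 1 {
            let a = val_rx.recv().unwrap();
            let b = val_rx.recv().unwrap();
            job_tx.send((a, b)).unwrap();
        }
        drop(job_tx); // closing the job channel lets the workers exit
        val_rx.recv().ok() // the single remaining value is the result
    })
}
```

For example, `feedback_reduce((1..=100).collect(), |a, b| a + b)` would yield `Some(5050)`.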
These are all the current iteration extensions implemented in this library:
- [FilterMapMultithread::filter_map_multithread]
- [FilterMapAsync::filter_map_async]
- [FilterMapAsyncUnordered::filter_map_async_unordered]
- [ReduceAsync::reduce_async]
- [ReduceAsyncCommutative::reduce_async_commutative]
- [FilterMapReduceAsync::filter_map_reduce_async]
- [FilterMapReduceAsyncCommutative::filter_map_reduce_async_commutative]
- [Pipe::pipe]
TL;DR: Spawn a thread pool, use it to asynchronously process data, then discard it once you finish. Or don't, I'm not your dad.
Examples
Synchronously process a sequence of elements:
```rust
// Illustrative sketch: the pool construction and closure bodies here are
// assumptions, not the crate's verbatim example.
use std::thread::scope;
use threadpools::*; // crate path assumed

scope(|s| {
    // ...construct a pool inside the scope, feed it the elements, and
    // collect the processed results as they finish...
});
```
Asynchronously process a sequence of elements:
```rust
// Illustrative sketch: the imports, accumulator type, and pool construction
// are assumptions.
use std::sync::atomic::{AtomicUsize, Ordering};
use std::thread::scope;
use std::time::Duration;
use threadpools::*;

let total = AtomicUsize::new(0);

scope(|s| {
    // ...spawn a pool in the scope, have each job sleep for a short
    // `Duration` to simulate slow work, and fold each result into `total`...
});
```
Filter, Map, & Reduce Asynchronously:
```rust
use threadpools::*;

let vals = 0..10000usize;

// Sequential baseline using the standard iterator adapters; the closures
// here are illustrative.
let sequential_result = vals
    .clone()
    .filter_map(|x| (x % 3 == 0).then_some(x * 2))
    .reduce(|a, b| a + b)
    .unwrap();

// Parallel version; the argument list of `filter_map_reduce_async_commutative`
// is assumed here to mirror the two closures above.
let parallel_result = vals
    .filter_map_reduce_async_commutative(
        |x| (x % 3 == 0).then_some(x * 2),
        |a, b| a + b,
    )
    .unwrap();

assert_eq!(sequential_result, parallel_result);
```
Chain multiple pools together:

```rust
// Illustrative sketch: pool construction and the piping details are assumed.
use threadpools::*;
use std::thread::scope;

scope(|s| {
    // ...build two pools inside the scope and pipe the output of the first
    // into the input of the second (see [Pipe::pipe])...
});
```
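As a rough std-only analogy for what chaining buys you, the sketch below wires two worker stages together with channels so that the first stage's output becomes the second stage's input. It is only an illustration, not the crate's [Pipe::pipe] mechanism:

```rust
use std::sync::mpsc;
use std::thread;

// Two stages connected by channels: the first doubles each value, the
// second sums everything it receives.
let (tx1, rx1) = mpsc::channel::<u64>();
let (tx2, rx2) = mpsc::channel::<u64>();

let stage_one = thread::spawn(move || {
    for x in rx1 {
        tx2.send(x * 2).unwrap();
    }
    // `tx2` is dropped here, which closes the second stage's input.
});
let stage_two = thread::spawn(move || rx2.into_iter().sum::<u64>());

for x in 0..100u64 {
    tx1.send(x).unwrap();
}
drop(tx1); // close the first stage's input so the pipeline can drain

stage_one.join().unwrap();
assert_eq!(stage_two.join().unwrap(), (0..100u64).map(|x| x * 2).sum::<u64>());
```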
See `/tests` for more usage examples.