Crate desync
Desync
This is a concurrency library for Rust that protects data by scheduling operations in order instead of locking and blocking threads. It provides a simple API that works well with Rust's notion of lifetimes, alongside a concurrency model with a dramatically reduced set of moving parts.
This approach has several advantages over the traditional method:
- It's simpler: almost the entire set of thread methods and synchronisation primitives can be replaced with the two fundamental scheduling functions, `sync()` and `desync()`.
- There's less boilerplate: code spends less time starting threads and sending messages, and expresses its intent more directly.
- It's easier to reason about: scheduled operations are always performed in the order they're queued so race conditions and similar issues due to out-of-order execution are both much rarer and easier to debug.
- Borrowing and asynchronous code can mix much more seamlessly than in other concurrency models.
- It makes it easier to write highly concurrent code: desync makes moving between performing operations synchronously and asynchronously trivial, with no need to write extra code to start threads or communicate between them.
In addition to the two fundamental methods, desync provides methods for generating futures and processing streams.
Quick start
There is a single new synchronisation object: `Desync`. You create one like this:

```rust
use desync::Desync;
let number = Desync::new(0);
```
It supports two main operations. `desync` will schedule a new job for the object that will run on a background thread. It's useful for deferring long-running operations and moving updates so they can run in parallel.

```rust
use std::thread;
use std::time::Duration;

let number = Desync::new(0);
number.desync(|val| {
    // Long update here
    thread::sleep(Duration::from_millis(100));
    *val = 42;
});
// We can carry on with what we're doing, with the update now running in the background
```
The other operation is `sync`, which schedules a job to run synchronously on the data structure. This is useful for retrieving values from a `Desync`.

```rust
let new_number = number.sync(|val| *val); // = 42
```
`Desync` objects always run operations in the order they are provided, so all operations are serialized from the point of view of the data that they contain. When combined with the ability to perform operations asynchronously, this provides a useful way to immediately parallelize long-running operations.
The `future_sync()` action returns a boxed `Future` that can be used with other libraries that use them. It's conceptually the same as `sync`, except that it doesn't wait for the operation to complete:

```rust
use futures::executor;
use futures::future;
use futures::prelude::*;

let future_number = number.future_sync(|val| future::ready(*val).boxed());
assert!(executor::block_on(async { future_number.await.unwrap() }) == 42);
```
Note that this is the equivalent of just `number.sync(|val| *val)`, so this is mainly useful for interacting with other code that's already using futures. The `after()` function is also provided for using the results of futures to update the contents of `Desync` data: these all preserve the strict order-of-operations semantics, so operations scheduled after an `after()` won't start until that operation has completed.
Pipes and streams
As well as support for futures, Desync provides support for streams. The `pipe_in()` and `pipe()` functions provide a way to process stream data in a `Desync` object as it arrives. `pipe_in()` just processes stream data as it arrives, and `pipe()` additionally provides an output stream of results.

`pipe()` is quite useful as a way to provide asynchronous access to synchronous code: it can be used to create a channel that sends requests to an asynchronous target and retrieves results back via its output. (Unlike the 'traditional' method, the actual scheduling and channel maintenance does not need to be explicitly implemented.)
Re-exports

- `pub use self::desync::*;`
- `pub use self::pipe::*;`

Modules

- `desync`: The main
- `pipe`: Desync pipes provide a way to generate and process streams via a
- `scheduler`: The scheduler provides the

Enums

- `TrySyncError`: Possible error conditions from a