Crate timely

Timely dataflow is a framework for managing and executing data-parallel dataflow computations.

The code is organized in crates and modules that are meant to depend as little as possible on each other.

Serialization: The abomonation crate contains simple and highly unsafe serialization routines.

Communication: The timely_communication crate defines several primitives for communicating between dataflow workers, and across machine boundaries.

Progress tracking: The timely::progress module defines core dataflow structures for tracking and reporting progress in a timely dataflow system, namely the number of outstanding dataflow messages and un-exercised message capabilities throughout the timely dataflow graph. It depends on timely_communication to exchange progress messages.

Dataflow construction: The timely::dataflow module defines an example dataflow system using communication and progress to both exchange data and progress information, in support of an actual data-parallel timely dataflow computation. It depends on timely_communication to move data, and timely::progress to provide correct operator notifications.

Examples

The following is a hello-world dataflow program.

use timely::*;
use timely::dataflow::operators::{Input, Inspect};

// construct and execute a timely dataflow
timely::execute_from_args(std::env::args(), |worker| {

    // add an input and base computation off of it
    let mut input = worker.dataflow(|scope| {
        let (input, stream) = scope.new_input();
        stream.inspect(|x| println!("hello {:?}", x));
        input
    });

    // introduce input, advance computation
    for round in 0..10 {
        input.send(round);
        input.advance_to(round + 1);
        worker.step();
    }
}).unwrap();

The program uses timely::execute_from_args to spin up a computation based on command line arguments and a closure specifying what each worker should do, in terms of a handle to that worker (in this case, worker). From the worker one builds dataflows, each rooted in a timely dataflow Scope, which allows you to define inputs, feedback cycles, and dataflow subgraphs, as part of building the dataflow graph of your dreams.

In this example, we construct a new dataflow on the worker using dataflow, add an exogenous input using new_input, and add an inspect operator to print each observed record. We then introduce input at increasing rounds, advance the input (promising that we will introduce no more input at prior rounds), and step the computation.
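The example above steps the worker once per round and relies on shutdown to drain any remaining work. A common refinement is to attach a probe to the dataflow and step until the probe reports that all data for the current round has been processed; the following is a minimal sketch of that pattern (the probe handle and the step loop are illustrative additions).

use timely::dataflow::operators::{Input, Inspect, Probe};

// construct and execute a timely dataflow, stepping until each round completes
timely::execute_from_args(std::env::args(), |worker| {

    // add an input, inspect records, and keep a probe on the output
    let (mut input, probe) = worker.dataflow(|scope| {
        let (input, stream) = scope.new_input();
        let probe = stream.inspect(|x| println!("hello {:?}", x))
                          .probe();
        (input, probe)
    });

    // introduce input, then step until the round has been fully processed
    for round in 0..10 {
        input.send(round);
        input.advance_to(round + 1);
        while probe.less_than(input.time()) {
            worker.step();
        }
    }
}).unwrap();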

Re-exports

pub use execute::execute;
pub use execute::execute_directly;
pub use execute::execute_from_args;
pub use execute::example;
pub use order::PartialOrder;
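Of these, execute_from_args appears in the example above; example runs a single-worker computation directly, which is convenient for tests and documentation. A small sketch of its use (the stream contents are illustrative):

use timely::dataflow::operators::{ToStream, Inspect};

// build and run a single-worker dataflow
timely::example(|scope| {
    (0..10).to_stream(scope)
           .inspect(|x| println!("seen: {:?}", x));
});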

Modules

bytes

Re-export of the timely_bytes crate.

communication

Re-export of the timely_communication crate.

dataflow

Abstractions for timely dataflow programming.

execute

Starts a timely dataflow execution from configuration information and per-worker logic.

logging

Traits, implementations, and macros related to logging timely events.

logging_core

Re-export of the timely_logging crate.

order

Traits and types for partially ordered sets (a small example appears after this module list).

progress

Progress tracking mechanisms to support notification in timely dataflow.

scheduling

Types and traits to activate and schedule fibers.

synchronization

Synchronization primitives implemented in timely dataflow.

worker

The root of each single-threaded worker.
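As an illustration of the order module listed above: its PartialOrder trait (also re-exported at the crate root) captures orders in which some elements are incomparable, such as the product order on pairs of timestamps. A small sketch, assuming Product::new and the usual coordinate-wise order:

use timely::PartialOrder;
use timely::order::Product;

let a = Product::new(0u64, 1u64);
let b = Product::new(1u64, 0u64);

// neither pair dominates the other in both coordinates, so they are incomparable
assert!(!a.less_equal(&b));
assert!(!b.less_equal(&a));

// (0, 1) is less or equal to (1, 1), since both coordinates are
assert!(a.less_equal(&Product::new(1u64, 1u64)));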

Enums

Configuration

Possible configurations for the communication infrastructure.
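A computation can also be launched with an explicit Configuration rather than parsing command line arguments. A sketch, assuming the Process variant spins up the given number of worker threads in this process:

use timely::Configuration;
use timely::dataflow::operators::{ToStream, Inspect};

// run four worker threads in this process (Configuration::Thread would run one)
timely::execute(Configuration::Process(4), |worker| {
    let index = worker.index();
    worker.dataflow::<u64,_,_>(|scope| {
        (0..3).to_stream(scope)
              .inspect(move |x| println!("worker {} saw {:?}", index, x));
    });
}).unwrap();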

Traits

Data

A composite trait for types usable as data in timely dataflow.

ExchangeData

A composite trait for types usable on exchange channels in timely dataflow.
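Data is required of records that flow along streams within a worker, while ExchangeData is additionally required when records must cross worker (and machine) boundaries, for example through the exchange operator. A sketch of a generic helper with such a bound (the helper and its hash-based routing are illustrative):

use std::fmt::Debug;
use std::hash::{Hash, Hasher};
use std::collections::hash_map::DefaultHasher;

use timely::ExchangeData;
use timely::dataflow::{Scope, Stream};
use timely::dataflow::operators::{Exchange, Inspect};

// route records to workers by their hash, then log whatever arrives
fn shuffle_and_log<G: Scope, D: ExchangeData + Hash + Debug>(stream: &Stream<G, D>) -> Stream<G, D> {
    stream
        .exchange(|record| {
            let mut hasher = DefaultHasher::new();
            record.hash(&mut hasher);
            hasher.finish()
        })
        .inspect(|record| println!("routed: {:?}", record))
}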