Module pipeline

Vortex crate containing vectorized pipeline processing.

This module contains experiments into pipelined data processing within Vortex.

Arrays (and eventually Layouts) will be convertible into a Kernel that can then be exported into a ViewMut, one chunk of N elements at a time. This allows us to keep compute largely within the L1 cache, and to write out canonical data into externally provided buffers.

Each chunk is represented in a canonical physical form, as determined by the logical vortex_dtype::DType of the array. This provides a predictable base on which to perform compute. Unlike DuckDB and other vectorized systems, we force a single canonical representation instead of supporting multiple encodings, because compute push-down is applied a priori to the logical representation.

It is a work-in-progress and is not yet used in production.
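To make the chunked model concrete, here is a minimal sketch of a push-based kernel driven one fixed-size chunk at a time. All names below (`ToyKernel`, `SliceKernel`, `step`, `export_all`) are made up for illustration and are not the actual `Kernel`/`ViewMut` API of this module; the point is only that a small `[T; N]` scratch chunk stays hot in cache while canonical output accumulates in a caller-provided buffer.

```rust
/// Illustrative chunk size; the real pipeline defines its own `N`.
const N: usize = 1024;

/// A toy "kernel" that produces canonical `i64` data one chunk at a time.
trait ToyKernel {
    /// Write up to `N` elements into `out`, returning how many were written.
    /// Repeated calls walk the source front to back; 0 means exhausted.
    fn step(&mut self, out: &mut [i64; N]) -> usize;
}

/// A kernel over a plain slice: each step copies the next chunk of elements.
struct SliceKernel<'a> {
    data: &'a [i64],
    pos: usize,
}

impl<'a> ToyKernel for SliceKernel<'a> {
    fn step(&mut self, out: &mut [i64; N]) -> usize {
        let remaining = &self.data[self.pos..];
        let len = remaining.len().min(N);
        out[..len].copy_from_slice(&remaining[..len]);
        self.pos += len;
        len
    }
}

/// Drain a kernel into an externally provided buffer, chunk by chunk.
/// The `[i64; N]` scratch chunk stays small enough to live in L1 while the
/// canonical output accumulates in the caller's buffer.
fn export_all(kernel: &mut dyn ToyKernel, out: &mut Vec<i64>) {
    let mut chunk = [0i64; N];
    loop {
        let len = kernel.step(&mut chunk);
        if len == 0 {
            break;
        }
        out.extend_from_slice(&chunk[..len]);
    }
}

fn main() {
    let data: Vec<i64> = (0..3000).collect();
    let mut kernel = SliceKernel { data: &data, pos: 0 };
    let mut out = Vec::with_capacity(data.len());
    export_all(&mut kernel, &mut out);
    assert_eq!(out, data);
}
```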

Re-exports

pub use operators::Operator;

Modules

bits
operators
Plan nodes represent the logical structure of a pipeline.
query
vec
Vectors contain owned fixed-size canonical arrays of elements.
view

Structs

KernelContext
Context passed to kernels during execution, providing access to vectors.

Enums

VType
Defines the “vector type”, a physical type describing the data that’s held in the vector.

Constants

N
The number of elements in each step of a Vortex evaluation pipeline.
N_WORDS

Traits

Element
A trait to identify canonical vector types.
Kernel
A pipeline provides a push-based way to emit a stream of canonical data.
PipelineVTable

Functions

export_canonical_pipeline
Export canonical data from a pipeline kernel with the given mask.
export_canonical_pipeline_expr
Export canonical data from an operator expression with the given mask.
export_canonical_pipeline_expr_offset
Export canonical data from an operator expression with a starting offset and mask.
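For a rough sense of what masked, offset-aware export means in this model, here is a self-contained sketch. The function name, the boolean-slice mask, and the signature below are placeholders for illustration only; they do not reflect the actual signatures of the `export_canonical_pipeline*` functions, which operate on pipeline kernels, operator expressions, and Vortex masks.

```rust
/// Illustrative only: export elements from `values`, starting at `offset`,
/// keeping only positions whose mask entry is set, into a caller-provided
/// output buffer.
fn export_masked(values: &[i64], mask: &[bool], offset: usize, out: &mut Vec<i64>) {
    for (v, keep) in values[offset..].iter().zip(&mask[offset..]) {
        if *keep {
            out.push(*v);
        }
    }
}

fn main() {
    let values = vec![10, 20, 30, 40, 50];
    let mask = vec![true, false, true, true, false];
    let mut out = Vec::new();
    // Skip the first element, then apply the mask to the rest.
    export_masked(&values, &mask, 1, &mut out);
    assert_eq!(out, vec![30, 40]);
}
```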