§Neuro-Divergent Core
High-performance neural forecasting library for Rust, built on the ruv-FANN foundation. This crate provides the core abstractions, traits, and data structures needed for advanced time series forecasting with neural networks.
§Features
- Type-safe Neural Networks: Built on ruv-FANN with generic floating-point support
- Time Series Data Structures: Efficient handling of temporal data with exogenous variables
- Memory-safe Operations: Rust’s ownership model ensures memory safety without garbage collection
- Parallel Processing: Built-in support for multi-threaded training and inference
- Comprehensive Error Handling: Detailed error types for robust error management
- Flexible Configuration: Builder patterns and configuration systems for easy setup
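The generic floating-point support can be sketched in miniature. The `FloatLike` trait and `mean` function below are illustrative stand-ins, not this crate's actual API: they only show how one routine, written against a float trait bound, serves both f32 and f64 models.

```rust
// Illustrative stand-in for a float abstraction (the real crate builds on
// ruv-FANN's generics); not part of neuro-divergent-core's API.
trait FloatLike: Copy {
    fn zero() -> Self;
    fn add(self, rhs: Self) -> Self;
    fn div_len(self, n: usize) -> Self;
}

impl FloatLike for f32 {
    fn zero() -> Self { 0.0 }
    fn add(self, rhs: Self) -> Self { self + rhs }
    fn div_len(self, n: usize) -> Self { self / n as f32 }
}

impl FloatLike for f64 {
    fn zero() -> Self { 0.0 }
    fn add(self, rhs: Self) -> Self { self + rhs }
    fn div_len(self, n: usize) -> Self { self / n as f64 }
}

// Mean of a slice, written once for any supported float width.
fn mean<T: FloatLike>(values: &[T]) -> T {
    values.iter().fold(T::zero(), |acc, &v| acc.add(v)).div_len(values.len())
}

fn main() {
    println!("{}", mean(&[1.0f32, 2.0, 3.0])); // 2
    println!("{}", mean(&[2.0f64, 4.0]));      // 3
}
```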
§Architecture
The library is organized into several key modules:
- traits: Core traits that define the forecasting model interface
- data: Time series data structures and preprocessing utilities
- error: Comprehensive error handling types
- integration: Integration layer with ruv-FANN neural networks
- config: Configuration management and builder patterns
§Quick Start
use neuro_divergent_core::prelude::*;
use chrono::{DateTime, Utc};

// Create a time series dataset
let mut dataset = TimeSeriesDatasetBuilder::new()
    .with_target_column("value")
    .with_time_column("timestamp")
    .with_unique_id_column("series_id")
    .build()?;

// Configure a forecasting model
let config = ModelConfigBuilder::new()
    .with_horizon(12)
    .with_input_size(24)
    .build()?;
§Integration with ruv-FANN
This library extends ruv-FANN’s neural network capabilities with time series-specific functionality:
use neuro_divergent_core::integration::NetworkAdapter;
use ruv_fann::Network;

// Create a ruv-FANN network
let network = Network::new(&[24, 64, 32, 12])?;

// Wrap it with our adapter for time series forecasting
let adapter = NetworkAdapter::from_network(network)
    .with_input_preprocessor(/* ... */)
    .with_output_postprocessor(/* ... */);
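The network layout above follows from the model configuration: a `[24, 64, 32, 12]` network consumes 24 lagged values (the input size) and emits 12 forecasts (the horizon). A minimal sketch of how a raw series is cut into such input/target pairs; `make_windows` is a hypothetical helper for illustration, not a function exported by this crate:

```rust
// Hypothetical helper: slide a window of `input_size` observations over the
// series, pairing each window with the `horizon` values that follow it.
fn make_windows(
    series: &[f64],
    input_size: usize,
    horizon: usize,
) -> Vec<(Vec<f64>, Vec<f64>)> {
    let step = input_size + horizon;
    let mut pairs = Vec::new();
    let mut start = 0;
    while start + step <= series.len() {
        let input = series[start..start + input_size].to_vec();
        let target = series[start + input_size..start + step].to_vec();
        pairs.push((input, target));
        start += 1; // slide by one observation
    }
    pairs
}

fn main() {
    let series: Vec<f64> = (0..40).map(|i| i as f64).collect();
    let pairs = make_windows(&series, 24, 12);
    // 40 observations yield 40 - (24 + 12) + 1 = 5 training pairs
    println!("{}", pairs.len()); // 5
}
```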
§Re-exports
pub use crate::config::*;
pub use crate::data::*;
pub use crate::error::*;
pub use crate::integration::*;
pub use crate::traits::*;
§Modules
- aggregations
- arity
- binary
- chunkedarray
- Traits and utilities for temporal data.
- cloud
- Interface with cloud storage through the object_store crate.
- config
- Configuration management and builder patterns for neuro-divergent.
- consts
- data
- Time series data structures and processing utilities.
- datatypes
- Data types supported by Polars.
- default_arrays
- dt
- error
- Comprehensive error handling for the neuro-divergent library.
- expr
- fill_null
- full
- gather
- gather_skip_nulls
- integration
- Integration layer with ruv-FANN neural networks.
- null
- predicates
- prelude
- Commonly used types and traits for convenient importing
- read_impl
- series
- slice
- sort
- sum_f64
- traits
- Core traits that define the forecasting model interface.
- udf
- utils
- zip
§Macros
- config_error
- Convenience macros for error creation
- data_error
- Create a data error with the given message
- df
- polars_bail
- polars_ensure
- polars_err
- polars_warn
- training_error
- Create a training error with the given message
§Structs
- AggregationContext
- AnonymousScanOptions
- Arc
- A thread-safe reference-counting pointer. ‘Arc’ stands for ‘Atomically Reference Counted’.
- ArrowField
- Represents Arrow’s metadata of a “column”.
- ArrowSchema
- An ordered sequence of Fields with associated Metadata.
- BatchedParquetReader
- BinaryChunkedBuilder
- BinaryType
- BooleanChunkedBuilder
- BooleanType
- Bounds
- BoundsIter
- BrotliLevel
- CategoricalType
- ChainedThen
- Utility struct for the when-then-otherwise expression.
- ChainedWhen
- Utility struct for the when-then-otherwise expression.
- ChunkedArray
- ChunkedArray
- CsvReader
- Create a new DataFrame by reading a csv file.
- CsvWriter
- Write a DataFrame to csv.
- CsvWriterOptions
- DataFrame
- A contiguous growable collection of Series that have the same length.
- DateTime
- ISO 8601 combined date and time with time zone.
- DateType
- DatetimeArgs
- Arguments used by datetime in order to produce an Expr of Datetime.
- DatetimeType
- Duration
- DurationArgs
- Arguments used by duration in order to produce an Expr of Duration.
- DurationType
- DynamicGroupOptions
- Field
- Characterizes the name and the DataType of a column.
- FieldsMapper
- FileMetaData
- Metadata for a Parquet file.
- Flat
- Float32Type
- Float64Type
- GroupBy
- Returned by a group_by operation on a DataFrame. This struct supports several aggregations.
- GroupsIdx
- Indexes of the groups; the first index is stored separately. This makes sorting fast.
- GroupsProxyIter
- GroupsProxyParIter
- GzipLevel
- Int8Type
- Int16Type
- Int32Type
- Int64Type
- JoinArgs
- JoinBuilder
- JoinOptions
- JsonLineReader
- JsonReader
- Reads JSON in one of the formats in JsonFormat into a DataFrame.
- JsonWriter
- Writes a DataFrame to JSON.
- LazyCsvReader
- LazyFrame
- Lazy abstraction over an eager DataFrame. It really is an abstraction over a logical plan. The methods of this struct will incrementally modify a logical plan until output is requested (via collect).
- LazyGroupBy
- Utility struct for lazy group_by operation.
- LazyJsonLineReader
- ListBinaryChunkedBuilder
- ListBooleanChunkedBuilder
- ListNameSpace
- Specialized expressions for Series of DataType::List.
- ListPrimitiveChunkedBuilder
- ListType
- ListUtf8ChunkedBuilder
- Logical
- Maps a logical type to a chunked array implementation of the physical type. This saves a lot of compiler bloat and allows us to reuse functionality.
- MeltArgs
- Arguments for the DataFrame::melt function.
- Nested
- Network
- A feedforward neural network
- NetworkBuilder
- Builder for creating neural networks with a fluent API
- NoNull
- Just a wrapper structure. Useful for certain impl specializations. This is for instance used to implement impl<T> FromIterator<T::Native> for NoNull<ChunkedArray<T>>, as Option<T::Native> was already implemented: impl<T> FromIterator<Option<T::Native>> for ChunkedArray<T>.
- Null
- The literal Null
- OptState
- State of the allowed optimizations
- ParquetReader
- Read Apache parquet format into a DataFrame.
- ParquetWriteOptions
- ParquetWriter
- Write a DataFrame to parquet format
- PhysicalIoHelper
- Wrapper struct that allows us to use a PhysicalExpr in polars-io.
- PrimitiveChunkedBuilder
- RollingCovOptions
- RollingGroupOptions
- RollingQuantileParams
- RollingVarParams
- ScanArgsAnonymous
- ScanArgsParquet
- Schema
- A map from field/column name (String) to the type of that field/column (DataType).
- SerializeOptions
- Options to serialize logical types to CSV.
- Series
- Series
- SlicedGroups
- SortMultipleOptions
- SortOptions
- SpecialEq
- Wrapper type that has special equality properties depending on the inner type specialization
- StrptimeOptions
- StructArray
- A StructArray is a nested Array with an optional validity representing multiple Array with the same number of rows.
- StructChunked
- This is the logical type StructChunked that dispatches most logic to the fields implementations.
- StructNameSpace
- Specialized expressions for Struct dtypes.
- Then
- Utility struct for the when-then-otherwise expression.
- TimeType
- TrainingData
- UInt8Type
- UInt16Type
- UInt32Type
- UInt64Type
- UnionArgs
- UserDefinedFunction
- Represents a user-defined function
- Utc
- The UTC time zone. This is the most efficient time zone when you don’t need the local time. It is also used as an offset (which is also a dummy type).
- Utf8ChunkedBuilder
- Utf8Type
- When
- Utility struct for the when-then-otherwise expression.
- Window
- Represents a window in time
- ZstdLevel
- Represents a valid zstd compression level.
§Enums
- ActivationFunction
- Activation functions available for neurons
- AggExpr
- AnyValue
- ArrowDataType
- The set of supported logical types in this crate.
- ArrowTimeUnit
- The time units defined in Arrow.
- BooleanFunction
- ClosedWindow
- CsvEncoding
- DataType
- Excluded
- Expr
- Expressions that can be used in various contexts. Queries consist of multiple expressions. When using the polars lazy API, don’t construct an Expr directly; instead, create one using the functions in the polars_lazy::dsl module. See that module’s docs for more info.
- FillNullStrategy
- FunctionExpr
- GroupByMethod
- GroupsIndicator
- GroupsProxy
- JoinType
- JoinValidation
- JsonFormat
- The format to use to write the DataFrame to JSON: Json (a JSON array) or JsonLines (each row output on a separate line). In either case, each row is serialized as a JSON object whose keys are the column names and whose values are the row’s corresponding values.
- Label
- LiteralValue
- LogicalPlan
- NullValues
- Operator
- ParallelStrategy
- ParquetCompression
- PolarsError
- QuantileInterpolOptions
- QuoteStyle
- StartBy
- TimeUnit
- UniqueKeepStrategy
- WindowMapping
- WindowType
§Constants
- DESCRIPTION
- Library description
- IDX_DTYPE
- NAME
- Library name
- NULL
- VERSION
- Library version information
§Statics
§Traits
- AnonymousScan
- ArgAgg
- Argmin/Argmax
- ArrayCollectIterExt
- ArrayFromIter
- ArrayFromIterDtype
- AsBinary
- AsList
- AsRefDataType
- AsUtf8
- BinaryNameSpaceImpl
- BinaryUdfOutputField
- ChunkAgg
- Aggregation operations.
- ChunkAggSeries
- Aggregations that return Series of unit length. Those can be used in broadcasting operations.
- ChunkAnyValue
- ChunkApply
- Fastest way to do elementwise operations on a ChunkedArray<T> when the operation is cheaper than branching due to null checking.
- ChunkApplyKernel
- Apply kernels on the arrow array chunks in a ChunkedArray.
- ChunkBytes
- ChunkCast
- Cast ChunkedArray<T> to ChunkedArray<N>.
- ChunkCompare
- Compare Series and ChunkedArray’s and get a boolean mask that can be used to filter rows.
- ChunkExpandAtIndex
- Create a new ChunkedArray filled with values at that index.
- ChunkExplode
- Explode/flatten a List or Utf8 Series.
- ChunkFillNullValue
- Replace None values with a value.
- ChunkFilter
- Filter values by a boolean mask.
- ChunkFull
- Fill a ChunkedArray with one value.
- ChunkFullNull
- ChunkQuantile
- Quantile and median aggregation.
- ChunkReverse
- Reverse a ChunkedArray<T>.
- ChunkSet
- Create a ChunkedArray with new values by index or by boolean mask. Note that these operations clone data. This is however the only way we can modify at mask or index level as the underlying Arrow arrays are immutable.
- ChunkShift
- ChunkShiftFill
- Shift the values of a ChunkedArray by a number of periods.
- ChunkSort
- Sort operations on ChunkedArray.
- ChunkTake
- ChunkTakeUnchecked
- ChunkUnique
- Get unique values in a ChunkedArray.
- ChunkVar
- Variance and standard deviation aggregation.
- ChunkZip
- Combine two ChunkedArray based on some predicate.
- ChunkedBuilder
- ChunkedCollectInferIterExt
- ChunkedCollectIterExt
- ChunkedSet
- CrossJoin
- DataFrameJoinOps
- DataFrameOps
- DateMethods
- DatetimeMethods
- Deserialize
- A data structure that can be deserialized from any data format supported by Serde.
- DurationMethods
- Float
- Generic trait for floating point numbers.
- FromData
- FromDataBinary
- FromDataUtf8
- FunctionOutputField
- GetAnyValue
- IndexOfSchema
- This trait exists to unify the API of polars Schema and arrow’s Schema.
- IndexToUsize
- InitHashMaps
- InitHashMaps2
- IntoGroupsProxy
- Used to create the tuples for a group_by operation.
- IntoLazy
- IntoSeries
- Used to convert a ChunkedArray, &dyn SeriesTrait and Series into a Series.
- IntoVec
- IsFloat
- Safety
- JoinDispatch
- LazyFileListReader
- Reads LazyFrame from a filesystem or a cloud storage. Supports glob patterns.
- LhsNumOps
- ListBuilderTrait
- ListFromIter
- ListNameSpaceImpl
- Literal
- LogicalType
- MutableBitmapExtension
- NamedFrom
- NamedFromOwned
- NewChunkedArray
- NumOpsDispatch
- NumericNative
- PartitionedAggregation
- PhysicalExpr
- Take a DataFrame and evaluate the expressions. Implement this for Column, lt, eq, etc.
- PolarsArray
- PolarsDataType
- Safety
- PolarsFloatType
- PolarsIntegerType
- PolarsIterator
- A PolarsIterator is an iterator over a ChunkedArray which contains polars types. A PolarsIterator must implement ExactSizeIterator and DoubleEndedIterator.
- PolarsMonthEnd
- PolarsMonthStart
- PolarsNumericType
- PolarsRound
- PolarsTemporalGroupby
- PolarsTruncate
- PolarsUpsample
- QuantileAggSeries
- RenameAliasFn
- SerReader
- SerWriter
- Serialize
- A data structure that can be serialized into any data format supported by Serde.
- SeriesBinaryUdf
- A wrapper trait for any binary closure Fn(Series, Series) -> PolarsResult<Series>.
- SeriesJoin
- SeriesMethods
- SeriesSealed
- SeriesTrait
- SeriesUdf
- A wrapper trait for any closure Fn(Vec<Series>) -> PolarsResult<Series>.
- SlicedArray
- Utility trait to slice concrete arrow arrays whilst keeping their concrete type. E.g. don’t return Box<dyn Array>.
- StaticArray
- TemporalMethods
- TimeMethods
- TrainingAlgorithm
- Main trait for training algorithms
- UdfSchema
- Utf8Methods
- ValueSize
- VarAggSeries
- VecHash
§Functions
- _join_suffix_name
- _sort_or_hash_inner
- all
- Selects all columns. Shorthand for col("*").
- apply_binary
- Like map_binary, but used in a group_by-aggregation context.
- apply_multiple
- Apply a function/closure over the groups of multiple columns. This should only be used in a group_by aggregation.
- as_struct
- Take several expressions and collect them into a StructChunked.
- avg
- Find the mean of all the values in the column named name. Alias for mean.
- binary_expr
- Compute op(l, r) (or equivalently l op r). l and r must have types compatible with the Operator.
- cast
- Casts the column given by Expr to a different type.
- check_projected_arrow_schema
- Checks if the projected columns are equal.
- check_projected_schema
- Checks if the projected columns are equal.
- check_projected_schema_impl
- clip
- Clamp underlying values to the min and max values.
- clip_max
- Clamp underlying values to the max value.
- clip_min
- Clamp underlying values to the min value.
- coalesce
- Folds the expressions from left to right, keeping the first non-null values.
- col
- Create a Column Expression based on a column name.
- collect_all
- Collect all LazyFrame computations.
- cols
- Select multiple columns by name.
- concat
- Concat multiple LazyFrames vertically.
- concat_expr
- concat_list
- Concat lists entries.
- count
- Count expression.
- cum_fold_exprs
- Accumulate over multiple columns horizontally / row wise.
- cum_reduce_exprs
- Accumulate over multiple columns horizontally / row wise.
- date_range
- Create a DatetimeChunked from a given start and end date and a given interval.
- datetime
- Construct a column of Datetime from the provided DatetimeArgs.
- datetime_to_timestamp_ms
- datetime_to_timestamp_ns
- datetime_to_timestamp_us
- default_join_ids
- dtype_col
- Select multiple columns by dtype.
- dtype_cols
- Select multiple columns by dtype.
- duration
- Construct a column of Duration from the provided DurationArgs.
- first
- First column in DataFrame.
- fmt_group_by_column
- fold_exprs
- Accumulate over multiple columns horizontally / row wise.
- get_reader_bytes
- get_sequential_row_statistics
- Compute remaining_rows_to_read to be taken per file up front, so we can actually read concurrently/in parallel.
- group_by_values
- Different from group_by_windows, which defines window buckets and searches for the values that fit each pre-defined bucket, this function defines every window based on: the timestamp (lower bound) and timestamp + period (upper bound), where the timestamps are the individual values in the array time.
- group_by_windows
- Based on the given Window, which has an
- in_nanoseconds_window
- indexes_to_usizes
- is_not_null
- A column which is false wherever expr is null, true elsewhere.
- is_null
- A column which is true wherever expr is null, false elsewhere.
- last
- Last column in DataFrame.
- lit
- Create a Literal Expression from L. A literal expression behaves like a column that contains a single distinct value.
- map_binary
- Apply a closure on the two columns that are evaluated from Expr a and Expr b.
- map_list_multiple
- Apply a function/closure over multiple columns once the logical plan gets executed.
- map_multiple
- Apply a function/closure over multiple columns once the logical plan gets executed.
- materialize_projection
- max
- Find the maximum of all the values in the column named name. Shorthand for col(name).max().
- mean
- Find the mean of all the values in the column named name. Shorthand for col(name).mean().
- median
- Find the median of all the values in the column named name. Shorthand for col(name).median().
- merge_dtypes
- min
- Find the minimum of all the values in the column named name. Shorthand for col(name).min().
- not
- Negates a boolean column.
- private_left_join_multiple_keys
- quantile
- Find a specific quantile of all the values in the column named name.
- reduce_exprs
- Analogous to Iterator::reduce.
- repeat
- Create a column of length n containing n copies of the literal value. Generally you won’t need this function, as lit(value) already represents a column containing only value whose length is automatically set to the correct number of rows.
- resolve_homedir
- sum
- Sum all the values in the column named name. Shorthand for col(name).sum().
- ternary_expr
- time_range
- Create a TimeChunked from a given start and end date and a given interval.
- unix_time
- when
- Start a when-then-otherwise expression.
§Type Aliases
- AllowedOptimizations
- AllowedOptimizations
- Array1
- one-dimensional array
- Array2
- two-dimensional array
- ArrayRef
- ArrayView1
- one-dimensional array view
- ArrayView2
- two-dimensional array view
- BinaryChunked
- BooleanChunked
- BorrowIdxItem
- ChunkId
- [ChunkIdx, DfIdx]
- ChunkJoinIds
- ChunkJoinOptIds
- DateChunked
- DatetimeChunked
- DurationChunked
- DynArgs
- FileMetaDataRef
- FillNullLimit
- Float32Chunked
- Float64Chunked
- GetOutput
- GroupsSlice
- Every group is indicated by an array where the
- IdxArr
- IdxCa
- IdxItem
- IdxSize
- The type used by polars to index data.
- IdxType
- InnerJoinIds
- Int8Chunked
- Int16Chunked
- Int32Chunked
- Int64Chunked
- LargeBinaryArray
- LargeListArray
- LargeStringArray
- LeftJoinIds
- ListChunked
- PathIterator
- PlHashMap
- PlHashSet
- PlIdHashMap
- This hashmap uses an IdHasher
- PlIndexMap
- PlIndexSet
- PolarsResult
- SchemaRef
- TimeChunked
- TimeZone
- UInt8Chunked
- UInt16Chunked
- UInt32Chunked
- UInt64Chunked
- Utf8Chunked