Crate candle_nn



§Other Crates

Candle consists of a number of crates. This crate holds the structs and functions that allow you to build and train neural nets. You may also wish to look at the docs for the other crates in the Candle ecosystem.
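As a quick orientation, here is a minimal sketch of building a single layer with this crate. The prefix "layer" and the dimensions are illustrative choices, and the example assumes candle-core is available as a dependency.

```rust
use candle_core::{DType, Device, Result, Tensor};
use candle_nn::{linear, Linear, Module, VarBuilder, VarMap};

fn main() -> Result<()> {
    let device = Device::Cpu;
    // A VarMap stores the trainable variables; a VarBuilder hands them
    // out to layer constructors under hierarchical names.
    let varmap = VarMap::new();
    let vb = VarBuilder::from_varmap(&varmap, DType::F32, &device);
    // A linear layer mapping 4 input features to 2 outputs.
    let layer: Linear = linear(4, 2, vb.pp("layer"))?;
    let xs = Tensor::randn(0f32, 1f32, (3, 4), &device)?;
    let ys = layer.forward(&xs)?;
    println!("{:?}", ys.dims()); // [3, 2]
    Ok(())
}
```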

Re-exports§

pub use activation::prelu;
pub use activation::Activation;
pub use activation::PReLU;
pub use batch_norm::batch_norm;
pub use batch_norm::BatchNorm;
pub use batch_norm::BatchNormConfig;
pub use conv::conv1d;
pub use conv::conv1d_no_bias;
pub use conv::conv2d;
pub use conv::conv2d_no_bias;
pub use conv::conv_transpose1d;
pub use conv::conv_transpose1d_no_bias;
pub use conv::conv_transpose2d;
pub use conv::conv_transpose2d_no_bias;
pub use conv::Conv1d;
pub use conv::Conv1dConfig;
pub use conv::Conv2d;
pub use conv::Conv2dConfig;
pub use conv::ConvTranspose1d;
pub use conv::ConvTranspose1dConfig;
pub use conv::ConvTranspose2d;
pub use conv::ConvTranspose2dConfig;
pub use embedding::embedding;
pub use embedding::Embedding;
pub use func::func;
pub use func::func_t;
pub use func::Func;
pub use func::FuncT;
pub use group_norm::group_norm;
pub use group_norm::GroupNorm;
pub use init::Init;
pub use layer_norm::layer_norm;
pub use layer_norm::layer_norm_no_bias;
pub use layer_norm::rms_norm;
pub use layer_norm::LayerNorm;
pub use layer_norm::LayerNormConfig;
pub use layer_norm::RmsNorm;
pub use linear::linear;
pub use linear::linear_b;
pub use linear::linear_no_bias;
pub use linear::Linear;
pub use ops::Dropout;
pub use optim::AdamW;
pub use optim::Optimizer;
pub use optim::ParamsAdamW;
pub use optim::SGD;
pub use rnn::gru;
pub use rnn::lstm;
pub use rnn::GRUConfig;
pub use rnn::LSTMConfig;
pub use rnn::GRU;
pub use rnn::LSTM;
pub use rnn::RNN;
pub use sequential::seq;
pub use sequential::Sequential;
pub use var_builder::VarBuilder;
pub use var_map::VarMap;
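The re-exports above cover the most common building blocks. As an illustration of how they compose, here is a hedged sketch of a small multi-layer perceptron assembled with seq; the prefixes "fc1"/"fc2" and the layer sizes are arbitrary.

```rust
use candle_core::{DType, Device, Result, Tensor};
use candle_nn::{linear, seq, Activation, Module, VarBuilder, VarMap};

fn main() -> Result<()> {
    let device = Device::Cpu;
    let varmap = VarMap::new();
    let vb = VarBuilder::from_varmap(&varmap, DType::F32, &device);
    // Chain layers and activations into a Sequential module.
    let model = seq()
        .add(linear(8, 16, vb.pp("fc1"))?)
        .add(Activation::Relu)
        .add(linear(16, 1, vb.pp("fc2"))?);
    let xs = Tensor::randn(0f32, 1f32, (2, 8), &device)?;
    let ys = model.forward(&xs)?;
    println!("{:?}", ys.dims()); // [2, 1]
    Ok(())
}
```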

Modules§

activation
Activation functions.
batch_norm
Batch normalization.
conv
Convolution layers.
embedding
Embedding layer.
encoding
Encoding utilities (e.g., one-hot/cold encoding).
func
Layers defined by closures.
group_norm
Group normalization.
init
Variable initialization.
kv_cache
Cache implementations.
layer_norm
Layer normalization.
linear
Linear layer.
loss
Loss calculations.
ops
Tensor ops.
optim
Various optimization algorithms.
rnn
Recurrent neural networks.
rotary_emb
Rotary embeddings.
sequential
Sequential layer.
var_builder
A VarBuilder for variable retrieval from models.
var_map
A VarMap is a store that holds named variables.
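To show how the loss and optim modules fit together, here is a sketch of a training loop fitting a linear layer to a toy target with SGD. The data, learning rate, and step count are made up for illustration.

```rust
use candle_core::{DType, Device, Result, Tensor};
use candle_nn::{linear, loss, Module, Optimizer, VarBuilder, VarMap, SGD};

fn main() -> Result<()> {
    let device = Device::Cpu;
    let varmap = VarMap::new();
    let vb = VarBuilder::from_varmap(&varmap, DType::F32, &device);
    let model = linear(2, 1, vb.pp("model"))?;
    // SGD optimizes every variable registered in the VarMap.
    let mut opt = SGD::new(varmap.all_vars(), 0.05)?;
    let xs = Tensor::new(&[[1f32, 2.], [3., 4.], [5., 6.]], &device)?;
    let ys = Tensor::new(&[[3f32], [7.], [11.]], &device)?;
    for step in 0..100 {
        let preds = model.forward(&xs)?;
        let l = loss::mse(&preds, &ys)?;
        // Compute gradients and apply one update step.
        opt.backward_step(&l)?;
        if step % 20 == 0 {
            println!("step {step}: loss {}", l.to_scalar::<f32>()?);
        }
    }
    Ok(())
}
```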

Traits§

Module
Defines a module with a forward method that takes a single tensor argument.
ModuleT
Defines a module with a single forward method that takes a tensor argument and a flag to separate the training and evaluation behaviors.
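The difference between the two traits is easiest to see with Dropout, which behaves differently in training and evaluation. A minimal sketch (the tensor contents are arbitrary):

```rust
use candle_core::{DType, Device, Result, Tensor};
use candle_nn::{Dropout, ModuleT};

fn main() -> Result<()> {
    let device = Device::Cpu;
    let dropout = Dropout::new(0.5);
    let xs = Tensor::ones((2, 4), DType::F32, &device)?;
    // ModuleT::forward_t takes a `train` flag: with train = true,
    // dropout zeroes elements at random and rescales the rest; with
    // train = false it is the identity.
    let train_out = dropout.forward_t(&xs, true)?;
    let eval_out = dropout.forward_t(&xs, false)?;
    println!("train: {train_out}");
    println!("eval:  {eval_out}");
    Ok(())
}
```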