candle-nn
§Other Crates
Candle consists of a number of crates. This crate holds structs and functions that allow you to build and train neural nets. You may wish to look at the docs for the other crates which can be found here:
- candle-core. Core data structures and data types.
- candle-nn. Building blocks for Neural Nets.
- candle-datasets. Rust access to commonly used Datasets like MNIST.
- candle-examples. Examples of Candle in Use.
- candle-onnx. Loading and using ONNX models.
- candle-pyo3. Access to Candle from Python.
- candle-transformers. Candle implementations of many published transformer models.
Re-exports§
pub use activation::prelu;
pub use activation::Activation;
pub use activation::PReLU;
pub use batch_norm::batch_norm;
pub use batch_norm::BatchNorm;
pub use batch_norm::BatchNormConfig;
pub use conv::conv1d;
pub use conv::conv1d_no_bias;
pub use conv::conv2d;
pub use conv::conv2d_no_bias;
pub use conv::conv_transpose1d;
pub use conv::conv_transpose1d_no_bias;
pub use conv::conv_transpose2d;
pub use conv::conv_transpose2d_no_bias;
pub use conv::Conv1d;
pub use conv::Conv1dConfig;
pub use conv::Conv2d;
pub use conv::Conv2dConfig;
pub use conv::ConvTranspose1d;
pub use conv::ConvTranspose1dConfig;
pub use conv::ConvTranspose2d;
pub use conv::ConvTranspose2dConfig;
pub use embedding::embedding;
pub use embedding::Embedding;
pub use func::func;
pub use func::func_t;
pub use func::Func;
pub use func::FuncT;
pub use group_norm::group_norm;
pub use group_norm::GroupNorm;
pub use init::Init;
pub use layer_norm::layer_norm;
pub use layer_norm::layer_norm_no_bias;
pub use layer_norm::rms_norm;
pub use layer_norm::LayerNorm;
pub use layer_norm::LayerNormConfig;
pub use layer_norm::RmsNorm;
pub use linear::linear;
pub use linear::linear_b;
pub use linear::linear_no_bias;
pub use linear::Linear;
pub use ops::Dropout;
pub use optim::AdamW;
pub use optim::Optimizer;
pub use optim::ParamsAdamW;
pub use optim::SGD;
pub use rnn::gru;
pub use rnn::lstm;
pub use rnn::GRUConfig;
pub use rnn::LSTMConfig;
pub use rnn::GRU;
pub use rnn::LSTM;
pub use rnn::RNN;
pub use sequential::seq;
pub use sequential::Sequential;
pub use var_builder::VarBuilder;
pub use var_map::VarMap;
Modules§
- activation
- Activation Functions
- batch_norm - Batch Normalization.
- conv
- Convolution Layers.
- embedding
- Embedding Layer.
- encoding
- Encoding Utilities. (e.g., one-hot/cold encoding)
- func
- Layers defined by closures.
- group_norm - Group Normalization.
- init
- Variable initialization.
- kv_cache - Cache Implementations.
- layer_norm - Layer Normalization.
- linear
- Linear layer
- loss
- Loss Calculations
- ops
- Tensor ops.
- optim
- Various optimization algorithms.
- rnn
- Recurrent Neural Networks
- rotary_emb - Rotary Embeddings.
- sequential
- Sequential Layer
- var_builder - A VarBuilder for variable retrieval from models.
- var_map - A VarMap is a store that holds named variables.