Neurox — minimalist numerical and ML building blocks in Rust.
This crate provides:
- A Tensor type for numeric data (row-major)
- Core tensor ops (matrix multiplication, element-wise add/mul, broadcasting)
- Activation functions (ReLU, Sigmoid, Tanh, Softmax)
- A Dense layer with backprop and cached activations
- A sequential Model with forward pass and training via SGD/Adam
- Losses (MSE, Cross-Entropy), data utilities, and error types
All operations currently run on CPU and are designed for clarity and extensibility.
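To make the row-major layout concrete: element (i, j) of an r x c matrix lives at flat index i * c + j, and matrix multiplication walks that layout directly. The following is a self-contained sketch in plain Rust of the standard algorithm, not the crate's actual implementation (the function name and signature here are illustrative):

```rust
/// Multiply an m x k matrix `a` by a k x n matrix `b`, both stored
/// row-major as flat slices, producing an m x n row-major result.
fn matmul(a: &[f64], b: &[f64], m: usize, k: usize, n: usize) -> Vec<f64> {
    assert_eq!(a.len(), m * k);
    assert_eq!(b.len(), k * n);
    let mut out = vec![0.0; m * n];
    for i in 0..m {
        for p in 0..k {
            // Hoisting a[i][p] and sweeping j innermost keeps the
            // accesses to `b` and `out` sequential in memory.
            let a_ip = a[i * k + p];
            for j in 0..n {
                out[i * n + j] += a_ip * b[p * n + j];
            }
        }
    }
    out
}

fn main() {
    // 2x3 times 3x2 -> 2x2
    let a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0];
    let b = [7.0, 8.0, 9.0, 10.0, 11.0, 12.0];
    let c = matmul(&a, &b, 2, 3, 2);
    assert_eq!(c, vec![58.0, 64.0, 139.0, 154.0]);
    println!("{:?}", c);
}
```

The i-p-j loop order is a common cache-friendly choice for row-major data; the naive i-j-p order computes the same result but strides down columns of `b`.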
Example: small MLP with ReLU
```rust
use neurox::{Model, Tensor, Activation};

// 3 -> 4 -> 2 MLP
let mut model = Model::new(&[3, 4, 2], Activation::ReLU);

// batch of 8 samples with 3 features
let x = Tensor::random(8, 3);

let logits = model.forward(&x).unwrap();
let probs = neurox::activations::softmax(&logits);
println!("probs: {:?}", probs);
```

Re-exports§
pub use crate::model::Model;
pub use crate::tensor::Tensor;
pub use crate::layers::Dense;
pub use crate::layers::Activation;
pub use crate::optimizer::SGD;
pub use crate::optimizer::Adam;
pub use crate::errors::NeuroxError;
pub use crate::errors::NeuroxResult;
Modules§
- activations
- Provides activation functions and their derivatives for neural networks.
- data
- Provides utilities for data loading and manipulation.
- errors
- layers
- Defines the layers of a neural network, such as the Dense layer.
- loss
- model
- Defines the main Model struct, its training loops, and evaluation utilities.
- ops
- Provides basic mathematical operations for Tensors.
- optimizer
- Provides optimization algorithms for updating model parameters.
- prelude
- Prelude with the most commonly used items.
- tensor
- Defines the core Tensor struct and its associated methods.
- utils
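What the loss and optimizer modules do can be illustrated with a minimal SGD step on a single linear neuron y = w * x + b under MSE. This is a conceptual, self-contained sketch in plain Rust; it does not use the crate's Dense, loss, or SGD types, and `sgd_step` is a hypothetical helper:

```rust
/// One SGD update for y = w * x + b on one (x, target) sample with
/// squared-error loss. Returns the loss before the update.
fn sgd_step(w: &mut f64, b: &mut f64, x: f64, target: f64, lr: f64) -> f64 {
    let pred = *w * x + *b;
    let err = pred - target;
    let loss = err * err; // per-sample MSE
    // Gradients of (pred - target)^2: dL/dw = 2 * err * x, dL/db = 2 * err
    *w -= lr * 2.0 * err * x;
    *b -= lr * 2.0 * err;
    loss
}

fn main() {
    let (mut w, mut b) = (0.0, 0.0);
    // Fit y = 2x + 1 on three fixed samples.
    let data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)];
    let mut last_epoch_loss = f64::MAX;
    for _ in 0..200 {
        let mut total = 0.0;
        for &(x, y) in &data {
            total += sgd_step(&mut w, &mut b, x, y, 0.05);
        }
        last_epoch_loss = total;
    }
    assert!(last_epoch_loss < 1e-3);
    println!("w = {w:.3}, b = {b:.3}, loss = {last_epoch_loss:.6}");
}
```

Adam follows the same pattern but rescales each gradient by running first- and second-moment estimates before applying it; a batched Dense layer does the same arithmetic with matrices instead of scalars.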