Module functional


Functional interface for neural network operations.

This module provides stateless functions that mirror the module-based activations and operations. Use these when you don’t need a module wrapper (e.g., in custom forward passes).

§Example

```rust
use aprender::nn::F;
use aprender::autograd::Tensor;

let x = Tensor::randn(&[32, 10]);
let y = F::relu(&x);
let probs = F::softmax(&y, -1); // softmax along the last dimension
```

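To make the softmax and log-softmax behavior concrete, here is a minimal sketch of the underlying math on a 1-D slice. This is illustrative only: it uses plain `f32` slices rather than aprender's `Tensor` type, and the function names `softmax_1d` / `log_softmax_1d` are made up for this example.

```rust
// Illustrative sketch of the math behind softmax/log_softmax on one row.
// Not the aprender API; operates on plain f32 slices.
fn softmax_1d(x: &[f32]) -> Vec<f32> {
    // Subtract the max for numerical stability: softmax(x) == softmax(x - c).
    let max = x.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = x.iter().map(|&v| (v - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn log_softmax_1d(x: &[f32]) -> Vec<f32> {
    // log_softmax(x) = x - max - ln(sum(exp(x - max)))
    let max = x.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let log_sum: f32 = x.iter().map(|&v| (v - max).exp()).sum::<f32>().ln();
    x.iter().map(|&v| v - max - log_sum).collect()
}

fn main() {
    let probs = softmax_1d(&[1.0, 2.0, 3.0]);
    let sum: f32 = probs.iter().sum();
    println!("probs = {probs:?}, sum = {sum}"); // probabilities sum to 1
    println!("log_probs = {:?}", log_softmax_1d(&[1.0, 2.0, 3.0]));
}
```

The max-subtraction step is the standard trick to avoid overflow in `exp` for large inputs; a tensor-level implementation would apply the same reduction along the chosen dimension.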
Functions§

dropout
Dropout (takes an explicit training flag)
gelu
GELU activation (Gaussian Error Linear Unit)
leaky_relu
Leaky ReLU activation: max(negative_slope * x, x)
linear
Linear transformation: y = x @ weight^T + bias
log_softmax
Log softmax along a dimension
relu
ReLU activation: max(0, x)
sigmoid
Sigmoid activation: 1 / (1 + exp(-x))
softmax
Softmax along a dimension
tanh
Tanh activation
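The formulas listed above can be sketched as scalar functions in plain Rust. This is illustrative only, independent of aprender's `Tensor` type; the GELU shown uses the common tanh approximation of x * Φ(x), since exact GELU needs `erf`, which is not in Rust's standard library, and the exact variant aprender uses is an assumption here.

```rust
// Illustrative scalar versions of the activation formulas above.
fn relu(x: f32) -> f32 {
    // max(0, x)
    x.max(0.0)
}

fn leaky_relu(x: f32, negative_slope: f32) -> f32 {
    // max(negative_slope * x, x), for 0 < negative_slope < 1
    (negative_slope * x).max(x)
}

fn sigmoid(x: f32) -> f32 {
    // 1 / (1 + exp(-x))
    1.0 / (1.0 + (-x).exp())
}

fn gelu(x: f32) -> f32 {
    // Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    let c = (2.0 / std::f32::consts::PI).sqrt();
    0.5 * x * (1.0 + (c * (x + 0.044715 * x * x * x)).tanh())
}

fn main() {
    println!("{}", relu(-2.0));             // 0
    println!("{}", leaky_relu(-2.0, 0.01)); // -0.02
    println!("{}", sigmoid(0.0));           // 0.5
    println!("{}", gelu(0.0));              // 0
}
```

A tensor-level implementation would apply these elementwise; `linear` similarly reduces to the listed formula `y = x @ weight^T + bias` applied row by row.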