Trait Loss 

pub trait Loss<F: Float + Debug> {
    // Required methods
    fn forward(
        &self,
        predictions: &Array<F, IxDyn>,
        targets: &Array<F, IxDyn>,
    ) -> Result<F>;
    fn backward(
        &self,
        predictions: &Array<F, IxDyn>,
        targets: &Array<F, IxDyn>,
    ) -> Result<Array<F, IxDyn>>;
}

Trait for loss functions used in neural networks

Loss functions compute the error between model predictions and target values, providing both the scalar loss value and gradients needed for training.

§Examples

use scirs2_neural::losses::{Loss, MeanSquaredError};
use scirs2_core::ndarray::Array;
let loss_fn = MeanSquaredError::new();
let predictions = Array::from_vec(vec![1.0, 2.0, 3.0]).into_dyn();
let targets = Array::from_vec(vec![1.1, 1.9, 3.2]).into_dyn();
// Forward pass: compute loss value
let loss_value = loss_fn.forward(&predictions, &targets)?;
assert!(loss_value >= 0.0); // Loss is non-negative
// Backward pass: compute gradients
let gradients = loss_fn.backward(&predictions, &targets)?;
assert_eq!(gradients.shape(), predictions.shape());
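
The gradient returned by backward is what an optimizer consumes during training. As a minimal illustrative sketch (not part of the crate's API), the snippet below applies plain gradient descent directly to the predictions so the effect of the gradient is visible; in a real model the gradient flows back through the network's parameters instead. Error handling via ? is elided, as in the example above.

use scirs2_neural::losses::{Loss, MeanSquaredError};
use scirs2_core::ndarray::Array;
let loss_fn = MeanSquaredError::new();
let mut predictions = Array::from_vec(vec![1.0, 2.0, 3.0]).into_dyn();
let targets = Array::from_vec(vec![1.1, 1.9, 3.2]).into_dyn();
let initial_loss = loss_fn.forward(&predictions, &targets)?;
let learning_rate = 0.1;
for _ in 0..10 {
    // ∂Loss/∂predictions for the current predictions
    let grad = loss_fn.backward(&predictions, &targets)?;
    // Plain gradient-descent step (illustrative only)
    predictions = predictions - grad * learning_rate;
}
// Each step moves the predictions toward the targets, so the loss shrinks
let final_loss = loss_fn.forward(&predictions, &targets)?;
assert!(final_loss < initial_loss);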

Required Methods§

fn forward(&self, predictions: &Array<F, IxDyn>, targets: &Array<F, IxDyn>) -> Result<F>

Calculate the loss between predictions and targets

§Arguments
  • predictions - Model predictions (can be logits or probabilities depending on loss)
  • targets - Ground truth target values
§Returns

A scalar loss value (typically averaged over the batch)

§Examples
use scirs2_neural::losses::{Loss, MeanSquaredError};
use scirs2_core::ndarray::Array;
let mse = MeanSquaredError::new();
let predictions = Array::from_vec(vec![1.0, 2.0]).into_dyn();
let targets = Array::from_vec(vec![1.5, 1.8]).into_dyn();
let loss = mse.forward(&predictions, &targets)?;
// MSE = mean((predictions - targets)^2) = mean([0.25, 0.04]) = 0.145
assert!((loss - 0.145f64).abs() < 1e-6);

fn backward(&self, predictions: &Array<F, IxDyn>, targets: &Array<F, IxDyn>) -> Result<Array<F, IxDyn>>

Calculate the gradient of the loss with respect to the predictions. This method computes ∂Loss/∂predictions, which is used in backpropagation to update the model parameters.

§Arguments
  • predictions - Model predictions (same as used in forward pass)
  • targets - Ground truth target values (same as used in forward pass)
§Returns

Gradient tensor with the same shape as predictions

§Examples
use scirs2_neural::losses::{Loss, MeanSquaredError};
use scirs2_core::ndarray::Array;
let mse = MeanSquaredError::new();
let predictions = Array::from_vec(vec![1.0, 2.0]).into_dyn();
let targets = Array::from_vec(vec![0.0, 1.0]).into_dyn();
let gradients = mse.backward(&predictions, &targets)?;
// MSE gradient = 2 * (predictions - targets) / n
// For our example: [2*(1-0)/2, 2*(2-1)/2] = [1.0, 1.0]
assert_eq!(gradients.as_slice().unwrap(), &[1.0, 1.0]);

Implementors§
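
In-crate types such as MeanSquaredError (used in the examples above) implement this trait. A custom loss only needs to provide the two methods. The sketch below is illustrative rather than canonical: the MeanAbsoluteError type is hypothetical, Float is assumed to be num_traits::Float (or the crate's re-export), and the import path of the crate's Result alias is an assumption.

use std::fmt::Debug;
use num_traits::Float;
use scirs2_core::ndarray::{Array, IxDyn};
use scirs2_neural::error::Result; // assumed path of the crate's Result alias
use scirs2_neural::losses::Loss;

/// Hypothetical mean-absolute-error loss: mean(|predictions - targets|)
struct MeanAbsoluteError;

impl<F: Float + Debug> Loss<F> for MeanAbsoluteError {
    fn forward(&self, predictions: &Array<F, IxDyn>, targets: &Array<F, IxDyn>) -> Result<F> {
        let n = F::from(predictions.len()).unwrap();
        // Mean of the element-wise absolute differences
        Ok((predictions - targets).mapv(|d| d.abs()).sum() / n)
    }

    fn backward(
        &self,
        predictions: &Array<F, IxDyn>,
        targets: &Array<F, IxDyn>,
    ) -> Result<Array<F, IxDyn>> {
        let n = F::from(predictions.len()).unwrap();
        // d/dp mean(|p - t|) = sign(p - t) / n, element-wise
        Ok((predictions - targets).mapv(|d| d.signum() / n))
    }
}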