
Function neuronika::nn::loss::bce_loss

pub fn bce_loss<T, U, V>(
    input: VarDiff<T, U>,
    target: Var<V>,
    reduction: Reduction
) -> VarDiff<BCELoss<T, V>, BCELossBackward<U, T, V>> where
    T: Data,
    U: Gradient<Dim = T::Dim> + Overwrite,
    V: Data<Dim = T::Dim>, 

Computes the binary cross entropy between the target y and input x.

       1   n
Loss = ―   ∑ -[yᵢ * ln(xᵢ) + (1 - yᵢ) * ln(1 - xᵢ)]
       n  i=1
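For instance, with a single prediction x₁ = 0.9 and target y₁ = 1 (so n = 1), the sum reduces to -[1 * ln(0.9) + 0 * ln(0.1)] = -ln(0.9) ≈ 0.105.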

Note that the target y should consist of values between 0 and 1. If a component of the input x is exactly 0 or 1, one of the log terms in the loss equation above is mathematically undefined. Rust evaluates ln(0) as -inf, but an infinite term in the loss is not desirable. For this reason, BCELoss clamps its log outputs to be greater than or equal to -100, so the loss value is always finite.
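As a usage sketch, the snippet below builds a differentiable input and a plain target, then evaluates the loss and backpropagates through it. It assumes the crate-level constructors rand and full, the requires_grad conversion, and the forward/backward methods documented elsewhere in this crate; the shape and target values are illustrative only.

use neuronika::nn::loss::{bce_loss, Reduction};

// Illustrative data: 4 probabilities in (0, 1) and matching all-ones
// targets. `rand` draws from U(0, 1), so both log terms stay finite
// for every input component.
let input = neuronika::rand(4).requires_grad(); // VarDiff: tracked by autograd
let target = neuronika::full(4, 1.0);           // Var: plain target data

let loss = bce_loss(input, target, Reduction::Mean);
loss.forward();     // computes the loss value
loss.backward(1.0); // seeds the output gradient and backpropagates

Passing Reduction::Mean averages the per-component losses as in the equation above; Reduction::Sum would return their sum instead.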