
Function neuronika::nn::loss::nll_loss

pub fn nll_loss<T, U, V>(
    input: VarDiff<T, U>,
    target: Var<V>,
    reduction: Reduction
) -> VarDiff<NLLLoss<T, V>, NLLLossBackward<U, V>> where
    T: Data<Dim = <V::Dim as Dimension>::Larger>,
    U: Gradient<Dim = T::Dim> + Overwrite,
    V: Data,
    T::Dim: Copy

Computes the negative log likelihood between the target y and the input x.

        1   N
Loss =  ―   ∑  −xₙ,ᵧₙ
        N  n=1
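For concreteness, here is the formula computed by hand for a toy batch in plain Rust (no neuronika types involved); the numbers are illustrative only:

```rust
// A hand-computed instance of the formula above with mean reduction,
// to make the indexing xₙ,ᵧₙ concrete.
fn main() {
    // Log-probabilities for N = 2 samples over C = 3 classes.
    let x = [[-0.2f32, -1.8, -2.5], [-1.1, -0.4, -2.0]];
    // One class index per sample, each in [0, 3).
    let y = [0usize, 1];

    // Loss = (1/N) Σₙ −x[n][y[n]]
    let n = x.len() as f32;
    let loss: f32 = x.iter().zip(&y).map(|(row, &c)| -row[c]).sum::<f32>() / n;
    println!("{loss}"); // (0.2 + 0.4) / 2 = 0.3
}
```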

The input x is expected to contain log-probabilities for each class; this is typically achieved by applying .log_softmax(). The input has to be of size either (minibatch, C) or (minibatch, C, d1, d2, …, dk) with k >= 1 for the K-dimensional case. The target that this loss expects should be a class index in the range [0, C), where C is the number of classes. When the given reduction is Reduction::Mean, the total loss is divided by the batch size.
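A minimal end-to-end sketch of the (minibatch, C) case described above. Only nll_loss, Reduction::Mean, and .log_softmax() appear on this page; the constructors neuronika::rand and neuronika::zeros, the axis argument of log_softmax, and the forward/backward calls are assumptions about the surrounding API:

```rust
use neuronika::nn::loss::{nll_loss, Reduction};

fn main() {
    // Assumed constructors and methods, for illustration only.
    let scores = neuronika::rand((16, 10)).requires_grad(); // (minibatch, C) = (16, 10)
    let log_probs = scores.log_softmax(1); // log-probabilities along the class axis
    let target = neuronika::zeros(16); // one class index per sample, in [0, 10)

    // Reduction::Mean divides the total loss by the batch size (16 here).
    let loss = nll_loss(log_probs, target, Reduction::Mean);
    loss.forward();
    loss.backward(1.0); // gradient seed; exact signature assumed
}
```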

As mentioned above, this loss can also be used for higher-dimensional inputs, such as 2D images, by providing an input of size (minibatch, C, d1, d2, …, dk) with k >= 1, where k is the number of additional dimensions. In the case of images, it computes the NLL loss per pixel.

In the K-dimensional case this loss expects a target of shape (minibatch, d1, d2, …, dk).
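A shape-only sketch of the K-dimensional (k = 2, image) case, under the same assumptions about the constructors as in the previous example:

```rust
use neuronika::nn::loss::{nll_loss, Reduction};

fn main() {
    // (minibatch, C, d1, d2) = (8, 5, 32, 32): five classes per pixel of a 32×32 map.
    let input = neuronika::zeros((8, 5, 32, 32)).requires_grad();
    let log_probs = input.log_softmax(1); // the class axis is axis 1

    // The target carries one class index per pixel: (minibatch, d1, d2) = (8, 32, 32).
    let target = neuronika::zeros((8, 32, 32));

    let loss = nll_loss(log_probs, target, Reduction::Mean);
    loss.forward();
}
```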