
Function neuronika::nn::loss::kldiv_loss

pub fn kldiv_loss<T, U, V>(
    input: VarDiff<T, U>,
    target: Var<V>,
    reduction: Reduction
) -> VarDiff<KLDivLoss<T, V>, KLDivLossBackward<U, V>> where
    T: Data,
    U: Gradient<Dim = T::Dim> + Overwrite,
    V: Data<Dim = T::Dim>, 

Computes the Kullback-Leibler divergence between the target and the input.

        N
Lᴏss =  ∑  ʏₙ · (ln(ʏₙ) - xₙ)
       n=1

The Kullback-Leibler divergence is a useful distance measure for continuous distributions and is often useful when performing direct regression over the space of (discretely sampled) continuous output distributions.

The input is expected to contain log-probabilities and is not restricted to a 2D tensor, while the target is interpreted as probabilities. When the given reduction is Reduction::Mean, the total loss is divided by the batch size.

This criterion expects a target variable of the same size as the input variable.
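As a minimal sketch of the formula above, the following standalone Rust function (not part of neuronika's API; the name `kldiv` and its parameters are illustrative) computes the divergence over plain slices, with the input holding log-probabilities and the target holding probabilities, and divides by the batch size when mean reduction is requested:

```rust
/// Computes Σₙ yₙ · (ln(yₙ) - xₙ), where `input` holds
/// log-probabilities (xₙ) and `target` holds probabilities (yₙ).
/// When `mean_over_batch` is true the total is divided by
/// `batch_size`, mirroring the Reduction::Mean behaviour.
fn kldiv(input: &[f64], target: &[f64], batch_size: usize, mean_over_batch: bool) -> f64 {
    let total: f64 = input
        .iter()
        .zip(target)
        // Terms with yₙ = 0 contribute 0 by the convention 0 · ln 0 = 0.
        .map(|(&x, &y)| if y > 0.0 { y * (y.ln() - x) } else { 0.0 })
        .sum();
    if mean_over_batch {
        total / batch_size as f64
    } else {
        total
    }
}

fn main() {
    // One sample: the target is uniform over 4 classes and the input
    // holds the log-probabilities of that same distribution, so the
    // divergence is exactly 0.
    let target = [0.25_f64; 4];
    let input: Vec<f64> = target.iter().map(|p| p.ln()).collect();
    let loss = kldiv(&input, &target, 1, true);
    println!("{:.6}", loss);
}
```

Because the input carries log-probabilities rather than probabilities, no logarithm is taken of the input inside the sum; this matches the formula, where only ln(ʏₙ) appears.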