pub fn log_softmax<T: Reduce<Axes>, Axes>(t: T) -> T

Computes log(softmax(t)) in a numerically stable way across Axes. Equivalent to t - logsumexp(t) under the hood.
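Concretely, logsumexp is stable because it subtracts the maximum entry before exponentiating, so exp() never overflows, and log_softmax inherits that stability. A minimal standalone sketch of the same computation over a plain f32 slice (the helper name is illustrative; this is not dfdx's actual implementation):

fn log_softmax_1d(t: &[f32]) -> Vec<f32> {
    // Stable logsumexp: shift by the max so exp() cannot overflow.
    let max = t.iter().copied().fold(f32::NEG_INFINITY, f32::max);
    let lse = max + t.iter().map(|&x| (x - max).exp()).sum::<f32>().ln();
    // log_softmax(t)[i] = t[i] - logsumexp(t)
    t.iter().map(|&x| x - lse).collect()
}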

PyTorch equivalent: t.log_softmax(Axes)

Related functions: logsumexp(), softmax()

Example:

use dfdx::prelude::*;

let t: Tensor3D<2, 3, 5> = TensorCreator::zeros();
let _ = t.log_softmax::<Axis<2>>();
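The output has the same shape as t; exponentiating it yields values that sum to 1 along Axis<2>.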

Using multi-axis log_softmax:

let _ = t.log_softmax::<Axes2<0, 2>>();