pub trait LogSoftmax<F>: NN<F> {
    // Required methods
    fn log_softmax(
        &self,
        x: &SharedTensor<F>,
        result: &mut SharedTensor<F>,
    ) -> Result<(), Error>;
    fn log_softmax_grad(
        &self,
        x: &SharedTensor<F>,
        x_diff: &SharedTensor<F>,
        result_diff: &mut SharedTensor<F>,
    ) -> Result<(), Error>;
}
Provides the functionality for a Backend to support LogSoftmax operations.
Required Methods
fn log_softmax(
    &self,
    x: &SharedTensor<F>,
    result: &mut SharedTensor<F>,
) -> Result<(), Error>
Computes a logarithmic softmax over the input Tensor x.
Saves the result to result.
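To illustrate the math a backend implements here, the following is a minimal, numerically stable log-softmax over a plain slice. The helper `log_softmax_slice` is hypothetical and not part of this trait; it stands in for the per-element computation that `log_softmax` performs on a `SharedTensor`.

```rust
// Hypothetical helper: numerically stable log-softmax on a slice.
// log_softmax(x)_i = x_i - log(sum_j exp(x_j)); subtracting the max
// before exponentiating avoids overflow for large inputs.
fn log_softmax_slice(x: &[f32]) -> Vec<f32> {
    let max = x.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let log_sum: f32 = x.iter().map(|&v| (v - max).exp()).sum::<f32>().ln();
    x.iter().map(|&v| v - max - log_sum).collect()
}

fn main() {
    let y = log_softmax_slice(&[1.0, 2.0, 3.0]);
    // exp of the result recovers softmax probabilities, which sum to 1.
    let total: f32 = y.iter().map(|&v| v.exp()).sum();
    println!("{:?} (exp-sum = {})", y, total);
}
```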
fn log_softmax_grad(
    &self,
    x: &SharedTensor<F>,
    x_diff: &SharedTensor<F>,
    result_diff: &mut SharedTensor<F>,
) -> Result<(), Error>
Computes the gradient of a logarithmic softmax over the input Tensor x.
Saves the result to result_diff.
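The standard backward formula for log-softmax can be sketched on plain slices. Assuming `y` holds the forward log-softmax output and `dy` the incoming gradient (an assumption about how `x` and `x_diff` map onto the math; the helper below is hypothetical, not part of this trait), the gradient is dx_i = dy_i - exp(y_i) * sum_j dy_j.

```rust
// Hypothetical helper: log-softmax backward pass on slices.
// y:  forward log-softmax output
// dy: gradient flowing in from the layer above
// dx_i = dy_i - exp(y_i) * sum_j dy_j
fn log_softmax_grad_slice(y: &[f32], dy: &[f32]) -> Vec<f32> {
    let dy_sum: f32 = dy.iter().sum();
    y.iter()
        .zip(dy)
        .map(|(&yi, &dyi)| dyi - yi.exp() * dy_sum)
        .collect()
}

fn main() {
    // Forward output for input [1, 2, 3] (values from the stable formula).
    let y = [-2.407_606, -1.407_606, -0.407_606_f32];
    // One-hot upstream gradient, as in a cross-entropy loss.
    let dx = log_softmax_grad_slice(&y, &[1.0, 0.0, 0.0]);
    println!("{:?}", dx);
}
```

Because exp(y) sums to 1, the components of dx always sum to zero, which is a quick sanity check for an implementation.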
Dyn Compatibility
This trait is not dyn compatible.
In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.