pub trait Relu<F>: NN<F> {
    // Required methods
    fn relu(
        &self,
        x: &SharedTensor<F>,
        result: &mut SharedTensor<F>,
    ) -> Result<(), Error>;
    fn relu_grad(
        &self,
        x: &SharedTensor<F>,
        x_diff: &SharedTensor<F>,
        result: &SharedTensor<F>,
        result_diff: &mut SharedTensor<F>,
    ) -> Result<(), Error>;
}
Provides the functionality for a Backend to support ReLU operations.
Required Methods
fn relu(
    &self,
    x: &SharedTensor<F>,
    result: &mut SharedTensor<F>,
) -> Result<(), Error>
Computes the rectified linear units (ReLU, https://en.wikipedia.org/wiki/Rectifier_(neural_networks)) over the input Tensor x.

Saves the result to result.
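The elementwise operation a backend implements here is result[i] = max(0, x[i]). A minimal standalone sketch of that math on plain slices (hypothetical helper name; the real trait operates on SharedTensor and may run on GPU):

```rust
// Hypothetical CPU sketch of the math `relu` computes:
// result[i] = max(0, x[i]) for every element.
fn relu_cpu(x: &[f32], result: &mut [f32]) {
    for (r, &v) in result.iter_mut().zip(x) {
        *r = v.max(0.0);
    }
}

fn main() {
    let x = [-1.0_f32, 0.0, 2.5];
    let mut result = [0.0_f32; 3];
    relu_cpu(&x, &mut result);
    println!("{:?}", result); // [0.0, 0.0, 2.5]
}
```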
fn relu_grad(
    &self,
    x: &SharedTensor<F>,
    x_diff: &SharedTensor<F>,
    result: &SharedTensor<F>,
    result_diff: &mut SharedTensor<F>,
) -> Result<(), Error>
Computes the gradient of ReLU (https://en.wikipedia.org/wiki/Rectifier_(neural_networks)) over the input Tensor x.

Saves the result to result_diff.
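Since ReLU's derivative is 1 where the input is positive and 0 elsewhere, the backward pass passes the incoming gradient x_diff through only at positive inputs. A standalone sketch of that math (hypothetical helper name; note the trait signature also receives the forward output result, which some backends use for the same positivity test instead of x):

```rust
// Hypothetical CPU sketch of the math `relu_grad` computes:
// result_diff[i] = x_diff[i] if x[i] > 0, else 0.
fn relu_grad_cpu(x: &[f32], x_diff: &[f32], result_diff: &mut [f32]) {
    for i in 0..x.len() {
        result_diff[i] = if x[i] > 0.0 { x_diff[i] } else { 0.0 };
    }
}

fn main() {
    let x = [-1.0_f32, 3.0];
    let x_diff = [0.5_f32, 0.5];
    let mut result_diff = [0.0_f32; 2];
    relu_grad_cpu(&x, &x_diff, &mut result_diff);
    println!("{:?}", result_diff); // [0.0, 0.5]
}
```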
Dyn Compatibility
This trait is not dyn compatible.
In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.