Trait burn_tensor::ops::ActivationOps
pub trait ActivationOps<B: Backend> {
    // Provided methods
    fn relu<const D: usize>(
        tensor: B::TensorPrimitive<D>
    ) -> B::TensorPrimitive<D> { ... }
    fn relu_backward<const D: usize>(
        output: B::TensorPrimitive<D>,
        grad: B::TensorPrimitive<D>
    ) -> B::TensorPrimitive<D> { ... }
    fn gelu<const D: usize>(
        tensor: B::TensorPrimitive<D>
    ) -> B::TensorPrimitive<D> { ... }
    fn gelu_backward<const D: usize>(
        x: B::TensorPrimitive<D>,
        grad: B::TensorPrimitive<D>
    ) -> B::TensorPrimitive<D> { ... }
}
Activation function operations.
This trait lets backend implementations override activation functions for better performance. All methods are provided, so a backend only needs to override the activations it can accelerate, for example with a fused kernel.
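Below is a minimal, self-contained sketch of this override pattern, not Burn's actual API: tensors are simplified to Vec<f32>, the Backend trait is reduced to a marker, and MyBackend is a hypothetical backend that overrides relu while inheriting the provided relu_backward.

// Marker standing in for Burn's `Backend` trait; the real trait exposes an
// associated `TensorPrimitive<const D: usize>` type, elided here.
pub trait Backend {}

pub trait ActivationOps<B: Backend> {
    // Provided method: a generic fallback every backend inherits.
    fn relu(tensor: Vec<f32>) -> Vec<f32> {
        tensor.into_iter().map(|x| x.max(0.0)).collect()
    }

    // ReLU's derivative is 1 where the forward output is positive and 0
    // elsewhere, so the incoming gradient is masked by the saved output.
    fn relu_backward(output: Vec<f32>, grad: Vec<f32>) -> Vec<f32> {
        output
            .into_iter()
            .zip(grad)
            .map(|(o, g)| if o > 0.0 { g } else { 0.0 })
            .collect()
    }
}

// Hypothetical backend (not part of Burn).
pub struct MyBackend;

impl Backend for MyBackend {}

impl ActivationOps<MyBackend> for MyBackend {
    // Override only `relu`, e.g. to dispatch to a backend-specific kernel;
    // `relu_backward` keeps the provided default.
    fn relu(tensor: Vec<f32>) -> Vec<f32> {
        tensor
            .into_iter()
            .map(|x| if x > 0.0 { x } else { 0.0 })
            .collect()
    }
}

fn main() {
    let y = <MyBackend as ActivationOps<MyBackend>>::relu(vec![-1.0, 0.5]);
    assert_eq!(y, vec![0.0, 0.5]);

    let g = <MyBackend as ActivationOps<MyBackend>>::relu_backward(y, vec![1.0, 1.0]);
    assert_eq!(g, vec![0.0, 1.0]);
}

The same shape applies to gelu and gelu_backward: a backend keeps the provided defaults unless it has a faster implementation to plug in.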