Trait ActivationOps

pub trait ActivationOps<B>
where
    B: Backend,
{
    // Provided methods
    fn leaky_relu(
        tensor: <B as Backend>::FloatTensorPrimitive,
        negative_slope: <B as Backend>::FloatElem,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn relu(
        tensor: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn relu_backward(
        output: <B as Backend>::FloatTensorPrimitive,
        grad: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn gelu(
        tensor: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn prelu(
        tensor: <B as Backend>::FloatTensorPrimitive,
        alpha: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn gelu_backward(
        x: <B as Backend>::FloatTensorPrimitive,
        grad: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn sigmoid(
        tensor: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn sigmoid_backward(
        output: <B as Backend>::FloatTensorPrimitive,
        grad: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn hard_sigmoid(
        tensor: <B as Backend>::FloatTensorPrimitive,
        alpha: <B as Backend>::FloatElem,
        beta: <B as Backend>::FloatElem,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn log_sigmoid(
        tensor: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
    fn log_sigmoid_backward(
        x: <B as Backend>::FloatTensorPrimitive,
        grad: <B as Backend>::FloatTensorPrimitive,
    ) -> <B as Backend>::FloatTensorPrimitive { ... }
}

Activation function operations.

This trait lets backend implementations override activation functions for better performance.
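
For illustration, a backend might override one of the provided methods to dispatch to a specialized kernel. The sketch below is hypothetical: MyBackend and my_fused_relu_kernel are placeholder names, not real Burn items.

// Hypothetical sketch: overriding `relu` with a backend-specific fused kernel.
// `MyBackend` and `my_fused_relu_kernel` are illustrative placeholders.
impl ActivationOps<MyBackend> for MyBackend {
    fn relu(
        tensor: <MyBackend as Backend>::FloatTensorPrimitive,
    ) -> <MyBackend as Backend>::FloatTensorPrimitive {
        // Dispatch to a fused kernel instead of the default implementation
        // built from primitive tensor operations.
        my_fused_relu_kernel(tensor)
    }
}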

Provided Methods§


fn leaky_relu( tensor: <B as Backend>::FloatTensorPrimitive, negative_slope: <B as Backend>::FloatElem, ) -> <B as Backend>::FloatTensorPrimitive

Applies the LeakyReLU activation function.

§Arguments
  • tensor - The tensor.
  • negative_slope - The slope by which values smaller than 0 are multiplied.
§Returns

The output tensor.
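
As a point of reference, the expected elementwise behavior can be sketched on scalars (illustrative only, not Burn's tensor implementation):

fn leaky_relu_scalar(x: f32, negative_slope: f32) -> f32 {
    // x for positive inputs, negative_slope * x otherwise.
    if x > 0.0 { x } else { negative_slope * x }
}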


fn relu( tensor: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the ReLU activation function.

§Arguments
  • tensor - The tensor.
§Returns

The output tensor.
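
Scalar reference of the expected elementwise behavior (illustrative only, not Burn's tensor implementation):

fn relu_scalar(x: f32) -> f32 {
    // max(x, 0)
    x.max(0.0)
}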


fn relu_backward( output: <B as Backend>::FloatTensorPrimitive, grad: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the ReLU activation function backward.

§Arguments
  • output - The output tensor.
  • grad - The gradient.
§Returns

The gradient.
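
A scalar sketch of the gradient rule, assuming output is the forward ReLU output and grad the incoming gradient (illustrative only):

fn relu_backward_scalar(output: f32, grad: f32) -> f32 {
    // The gradient passes through only where the output is positive.
    if output > 0.0 { grad } else { 0.0 }
}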


fn gelu( tensor: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the Gelu activation function.

§Arguments
  • tensor - The tensor.
§Returns

The output tensor.
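
For reference, a scalar sketch using the common tanh approximation of GELU; the exact definition uses the Gaussian error function erf (illustrative only):

fn gelu_scalar(x: f32) -> f32 {
    // 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    let c = (2.0 / std::f32::consts::PI).sqrt();
    0.5 * x * (1.0 + (c * (x + 0.044715 * x.powi(3))).tanh())
}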


fn prelu( tensor: <B as Backend>::FloatTensorPrimitive, alpha: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the PReLU activation function.

§Arguments
  • tensor - The input tensor.
  • alpha - The weight (alpha) tensor.
§Returns

The output tensor.
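
A scalar sketch of the elementwise rule, where alpha stands for the weight element broadcast against the input (illustrative only):

fn prelu_scalar(x: f32, alpha: f32) -> f32 {
    // x for positive inputs, alpha * x otherwise.
    if x > 0.0 { x } else { alpha * x }
}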

fn gelu_backward( x: <B as Backend>::FloatTensorPrimitive, grad: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the Gelu activation function backward.

§Arguments
  • x - The tensor.
  • grad - The gradient.
§Returns

The gradient.
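
A scalar sketch of the gradient rule, using d/dx gelu(x) = cdf(x) + x * pdf(x), where cdf and pdf are the standard normal CDF and PDF; the CDF is computed with the tanh approximation (illustrative only):

fn gelu_backward_scalar(x: f32, grad: f32) -> f32 {
    let pi = std::f32::consts::PI;
    // Approximate standard normal CDF via the tanh-based GELU approximation.
    let cdf = 0.5 * (1.0 + ((2.0 / pi).sqrt() * (x + 0.044715 * x.powi(3))).tanh());
    // Standard normal PDF.
    let pdf = (-0.5 * x * x).exp() / (2.0 * pi).sqrt();
    grad * (cdf + x * pdf)
}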


fn sigmoid( tensor: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the Sigmoid activation function.

§Arguments
  • tensor - The tensor.
§Returns

The output tensor.
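
Scalar reference of the expected elementwise behavior (illustrative only, not Burn's tensor implementation):

fn sigmoid_scalar(x: f32) -> f32 {
    // 1 / (1 + e^(-x))
    1.0 / (1.0 + (-x).exp())
}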


fn sigmoid_backward( output: <B as Backend>::FloatTensorPrimitive, grad: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the Sigmoid activation function backward.

§Arguments
  • output - The output tensor of the sigmoid function.
  • grad - The gradient.
§Returns

The gradient.
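
A scalar sketch of the gradient rule, assuming output is the forward sigmoid output s, so that ds/dx = s * (1 - s) (illustrative only):

fn sigmoid_backward_scalar(output: f32, grad: f32) -> f32 {
    grad * output * (1.0 - output)
}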


fn hard_sigmoid( tensor: <B as Backend>::FloatTensorPrimitive, alpha: <B as Backend>::FloatElem, beta: <B as Backend>::FloatElem, ) -> <B as Backend>::FloatTensorPrimitive

Applies the hard Sigmoid activation function.

§Arguments
  • tensor - The tensor.
  • alpha - The alpha value by which the tensor is multiplied.
  • beta - The beta value that is added to the tensor.
§Returns

The output tensor.
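
Scalar reference of the commonly used definition, clamping a linear transform to [0, 1] (illustrative only):

fn hard_sigmoid_scalar(x: f32, alpha: f32, beta: f32) -> f32 {
    // clamp(alpha * x + beta, 0, 1)
    (alpha * x + beta).clamp(0.0, 1.0)
}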


fn log_sigmoid( tensor: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the LogSigmoid activation function.

§Arguments
  • tensor - The tensor.
§Returns

The output tensor.
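
Scalar reference using a numerically stable form of ln(sigmoid(x)) (illustrative only):

fn log_sigmoid_scalar(x: f32) -> f32 {
    // min(x, 0) - ln(1 + e^(-|x|)) avoids overflow for large |x|.
    x.min(0.0) - (-x.abs()).exp().ln_1p()
}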


fn log_sigmoid_backward( x: <B as Backend>::FloatTensorPrimitive, grad: <B as Backend>::FloatTensorPrimitive, ) -> <B as Backend>::FloatTensorPrimitive

Applies the LogSigmoid activation function backward.

§Arguments
  • x - The input tensor.
  • grad - The gradient.
§Returns

The output gradient.
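
A scalar sketch of the gradient rule, using d/dx ln(sigmoid(x)) = 1 - sigmoid(x) = sigmoid(-x) (illustrative only):

fn log_sigmoid_backward_scalar(x: f32, grad: f32) -> f32 {
    // sigmoid(-x) = 1 / (1 + e^x)
    grad * (1.0 / (1.0 + x.exp()))
}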

Dyn Compatibility§

This trait is not dyn compatible.

In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.

Implementors§