pub struct ReluKernel;
Rectified Linear Unit (ReLU) activation kernel.
Computes output[i] = max(0, input[i]) elementwise.
NaN handling: if an input element is NaN, the corresponding output element is set to NaN.
§Errors
Returns KernelError::InvalidArguments if inputs.len() != 1.
Returns KernelError::ShapeMismatch if input.shape() != output.shape().
§Examples
let shape = vec![1, 5];
let x = Tensor::from_vec(shape.clone(), vec![-2.0, -0.0, 0.0, 1.5, 3.0]).unwrap();
let mut out = Tensor::zeros(shape).unwrap();
ReluKernel.compute(&[&x], &mut out).unwrap();
assert_eq!(out.data(), &[0.0, 0.0, 0.0, 1.5, 3.0]);
Trait Implementations§
Auto Trait Implementations§
impl Freeze for ReluKernel
impl RefUnwindSafe for ReluKernel
impl Send for ReluKernel
impl Sync for ReluKernel
impl Unpin for ReluKernel
impl UnsafeUnpin for ReluKernel
impl UnwindSafe for ReluKernel
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.