pub fn gelu(input: &Tensor) -> Tensor
Applies the GELU activation elementwise, using the fast sigmoid approximation: `x * sigmoid(1.702 * x)`.
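The `Tensor` type is not shown here, so the sketch below models the elementwise math on a plain `f32` slice (an assumption) to illustrate what the approximation computes:

```rust
// Hedged sketch: `Tensor` is not defined in this snippet, so we apply the
// same elementwise formula to a slice of f32 instead.
fn gelu_scalar(x: f32) -> f32 {
    // Fast GELU approximation: x * sigmoid(1.702 * x)
    x * (1.0 / (1.0 + (-1.702 * x).exp()))
}

fn gelu(input: &[f32]) -> Vec<f32> {
    input.iter().map(|&x| gelu_scalar(x)).collect()
}

fn main() {
    // gelu(0) is exactly 0; gelu(1) is roughly 0.846 under this approximation.
    let out = gelu(&[-1.0, 0.0, 1.0]);
    println!("{:?}", out);
}
```

The `1.702` constant makes `sigmoid(1.702 * x)` a close fit to the Gaussian CDF `Phi(x)` used by exact GELU, trading a small accuracy loss for a cheaper elementwise kernel.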