Trait ActivationOps 

pub trait ActivationOps: Send + Sync {
    // Required methods
    fn silu_mul(&self, gate: &TensorRef, up: &TensorRef) -> Result<TensorRef>;
    fn gelu(&self, input: &TensorRef) -> Result<TensorRef>;
}

Activation function operations (including fused variants).
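
A minimal usage sketch, assuming TensorRef and Result are this crate's tensor handle and error alias (as in the trait signatures above); swiglu_activation is a hypothetical helper shown only for illustration, not part of the trait:

fn swiglu_activation(
    ops: &dyn ActivationOps,
    gate: &TensorRef,
    up: &TensorRef,
) -> Result<TensorRef> {
    // SwiGLU gating as used in LLaMA/Qwen MLP blocks: silu(gate) * up.
    ops.silu_mul(gate, up)
}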

Required Methods§

fn silu_mul(&self, gate: &TensorRef, up: &TensorRef) -> Result<TensorRef>

Fused SiLU-multiply: silu(gate) * up.

This is the SwiGLU building block used in LLaMA/Qwen MLPs.
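
As a reference for the semantics only (not this crate's implementation), the element-wise computation on plain slices looks like the following, where silu(x) = x * sigmoid(x):

fn silu_mul_reference(gate: &[f32], up: &[f32]) -> Vec<f32> {
    gate.iter()
        .zip(up.iter())
        .map(|(&g, &u)| {
            // silu(g) = g * sigmoid(g), then gate the `up` projection.
            let silu = g * (1.0 / (1.0 + (-g).exp()));
            silu * u
        })
        .collect()
}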

fn gelu(&self, input: &TensorRef) -> Result<TensorRef>

GELU (Gaussian Error Linear Unit) activation.
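
For reference only, a sketch of the commonly used tanh approximation of GELU on plain slices; an implementor may instead use the exact erf-based form:

fn gelu_reference(input: &[f32]) -> Vec<f32> {
    // sqrt(2 / pi), used by the tanh approximation.
    const SQRT_2_OVER_PI: f32 = 0.797_884_6;
    input
        .iter()
        .map(|&x| {
            // gelu(x) ~= 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
            let inner = SQRT_2_OVER_PI * (x + 0.044_715 * x * x * x);
            0.5 * x * (1.0 + inner.tanh())
        })
        .collect()
}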

Implementors§