//! Fusion optimization passes: MatMul+Add, Conv+BatchNorm, Conv+Relu,
//! Conv+ReLU6, SiLU (Mul+Sigmoid), Div+Sqrt→Rsqrt, standalone BatchNorm
//! folding, LayerNorm pattern, consecutive Transpose/Reshape cancellation,
//! MatMul+Transpose fusion, Add+MatMul→Gemm fusion, Conv+Add+Relu (ResNet),
//! Gather+Gather composition, Dropout elimination, Transpose+Reshape simplification.
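One of the simpler rewrites listed above, Add+MatMul→Gemm, can be sketched as a peephole pass over a toy IR. Everything below (the `Node`/`Op` types, the tensor-id scheme, the name `fuse_matmul_add`) is illustrative, not this crate's actual API; a real pass would also have to verify that the MatMul output has no consumer other than the Add before fusing.

```rust
#[derive(Debug, Clone, PartialEq)]
enum Op {
    MatMul,
    Add,
    Gemm,
}

#[derive(Debug, Clone)]
struct Node {
    op: Op,
    inputs: Vec<usize>, // tensor ids
    output: usize,
}

/// Replace each MatMul immediately followed by an Add that consumes its
/// output with a single Gemm(A, B, C) = A*B + C node.
fn fuse_matmul_add(nodes: &[Node]) -> Vec<Node> {
    let mut out = Vec::new();
    let mut i = 0;
    while i < nodes.len() {
        let fusable = i + 1 < nodes.len()
            && nodes[i].op == Op::MatMul
            && nodes[i + 1].op == Op::Add
            && nodes[i + 1].inputs.contains(&nodes[i].output);
        if fusable {
            // The Add's other operand becomes Gemm's bias input C.
            let bias = *nodes[i + 1]
                .inputs
                .iter()
                .find(|&&t| t != nodes[i].output)
                .expect("Add needs a second, non-MatMul input");
            out.push(Node {
                op: Op::Gemm,
                inputs: vec![nodes[i].inputs[0], nodes[i].inputs[1], bias],
                output: nodes[i + 1].output,
            });
            i += 2; // skip both fused nodes
        } else {
            out.push(nodes[i].clone());
            i += 1;
        }
    }
    out
}

fn main() {
    // Graph: t2 = MatMul(t0, t1); t4 = Add(t2, t3)
    let g = vec![
        Node { op: Op::MatMul, inputs: vec![0, 1], output: 2 },
        Node { op: Op::Add, inputs: vec![2, 3], output: 4 },
    ];
    let fused = fuse_matmul_add(&g);
    assert_eq!(fused.len(), 1);
    assert_eq!(fused[0].op, Op::Gemm);
    assert_eq!(fused[0].inputs, vec![0, 1, 3]);
    assert_eq!(fused[0].output, 4);
}
```

The same scan-and-skip shape covers the other two-node patterns (Conv+Relu, Div+Sqrt→Rsqrt, Gather+Gather); only the match condition and the replacement node change.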
pub use ;
pub use ;
pub use ;
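The Conv+BatchNorm and standalone BatchNorm folds rest on the same arithmetic: per output channel c, BN(y) = γ_c·(y − μ_c)/√(σ²_c + ε) + β_c is affine in y, so it folds into the preceding Conv as W'_c = W_c·γ_c/√(σ²_c + ε) and b'_c = (b_c − μ_c)·γ_c/√(σ²_c + ε) + β_c. A minimal sketch of that fold, where the function name `fold_batchnorm` and the flat per-channel weight layout are assumptions standing in for a real tensor type:

```rust
/// Fold BatchNorm(gamma, beta, mean, var, eps) into the preceding Conv's
/// weights and bias, one scale factor per output channel.
/// Hypothetical sketch: `Vec<f64>` per channel stands in for a real tensor.
fn fold_batchnorm(
    weight: &mut [Vec<f64>], // one flat kernel per output channel
    bias: &mut [f64],
    gamma: &[f64],
    beta: &[f64],
    mean: &[f64],
    var: &[f64],
    eps: f64,
) {
    for c in 0..bias.len() {
        let scale = gamma[c] / (var[c] + eps).sqrt();
        for w in weight[c].iter_mut() {
            *w *= scale; // W' = W * gamma / sqrt(var + eps)
        }
        // b' = (b - mean) * scale + beta
        bias[c] = (bias[c] - mean[c]) * scale + beta[c];
    }
}

fn main() {
    let mut weight = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    let mut bias = vec![0.5, -0.5];
    // Parameters chosen so the per-channel scales are exactly 2.0 and 0.5.
    fold_batchnorm(
        &mut weight,
        &mut bias,
        &[4.0, 1.0], // gamma
        &[0.0, 1.0], // beta
        &[0.0, 0.0], // running mean
        &[3.0, 3.0], // running variance
        1.0,         // eps
    );
    assert_eq!(weight, vec![vec![2.0, 4.0], vec![1.5, 2.0]]);
    assert_eq!(bias, vec![1.0, 0.75]);
}
```

Because the fold is exact (not an approximation), the rewritten Conv produces bit-for-bit the affine-transformed outputs, which is why the pass can run unconditionally at inference time.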