Mixture of Experts (MoE) ensemble learning (GH-101)
Structs§
- MixtureOfExperts - Mixture of Experts ensemble
- MoeConfig - MoE routing configuration
- SoftmaxGating - Softmax gating with learnable weights
Traits§
- GatingNetwork - Trait for gating networks that route inputs to experts
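To illustrate what these items model, here is a minimal conceptual sketch of Mixture-of-Experts routing with softmax gating. The names (`Gating`, `SoftmaxGate`, `moe_predict`) and shapes are illustrative assumptions, not this crate's actual API.

```rust
/// Numerically stable softmax over expert logits.
fn softmax(logits: &[f64]) -> Vec<f64> {
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

/// A gating network scores each expert for a given input
/// (hypothetical stand-in for the crate's GatingNetwork trait).
trait Gating {
    fn weights(&self, input: &[f64]) -> Vec<f64>;
}

/// Linear gating: one learnable weight vector per expert,
/// with scores normalized through softmax.
struct SoftmaxGate {
    expert_weights: Vec<Vec<f64>>, // one row per expert
}

impl Gating for SoftmaxGate {
    fn weights(&self, input: &[f64]) -> Vec<f64> {
        let logits: Vec<f64> = self
            .expert_weights
            .iter()
            .map(|w| w.iter().zip(input).map(|(a, b)| a * b).sum())
            .collect();
        softmax(&logits)
    }
}

/// Combine expert predictions as a gate-weighted average.
fn moe_predict(
    gate: &dyn Gating,
    experts: &[Box<dyn Fn(&[f64]) -> f64>],
    input: &[f64],
) -> f64 {
    let w = gate.weights(input);
    experts.iter().zip(&w).map(|(e, &wi)| wi * e(input)).sum()
}

fn main() {
    let gate = SoftmaxGate {
        expert_weights: vec![vec![1.0, 0.0], vec![0.0, 1.0]],
    };
    let experts: Vec<Box<dyn Fn(&[f64]) -> f64>> = vec![
        Box::new(|x: &[f64]| x[0] * 2.0),
        Box::new(|x: &[f64]| x[1] + 1.0),
    ];
    // Gate weights sum to 1, so the output is a convex
    // combination of the two expert predictions.
    let y = moe_predict(&gate, &experts, &[1.0, 0.0]);
    println!("{:.4}", y);
}
```

Because the gating weights are a softmax, they always sum to 1, so the ensemble output lies between the smallest and largest expert prediction for any input.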