Neural network modules for MLX.
Provides common building blocks: Linear, Embedding, LayerNorm, RmsNorm,
Dropout, and MultiHeadAttention. Each module stores its parameters as
Tensor values and exposes a forward() method.
Structs§
- Dropout: randomly zeros elements with probability p during training.
- Embedding: maps integer indices to dense vectors.
- LayerNorm: layer normalization over the last dimension.
- Linear: a linear (fully-connected) layer: y = x @ W^T + b.
- MultiHeadAttention: multi-head attention.
- RmsNorm: RMS normalization over the last dimension.
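As an illustration of the math behind Linear and RmsNorm, here is a plain-Rust sketch operating on slices of f32 rather than the crate's Tensor type; the function names and signatures are hypothetical, not the crate's API:

```rust
// Sketch of the math these modules implement (illustrative names, not the crate API).
// `linear_forward` computes y = x @ W^T + b for one input row;
// `rms_norm` normalizes a vector by its root mean square, then scales by a learned weight.

fn linear_forward(x: &[f32], w: &[Vec<f32>], b: &[f32]) -> Vec<f32> {
    // Output unit j is the dot product of x with row j of W, plus bias b[j].
    w.iter()
        .zip(b)
        .map(|(row, bj)| row.iter().zip(x).map(|(wi, xi)| wi * xi).sum::<f32>() + bj)
        .collect()
}

fn rms_norm(x: &[f32], weight: &[f32], eps: f32) -> Vec<f32> {
    // RMS over the last dimension; eps guards against division by zero.
    let rms = (x.iter().map(|v| v * v).sum::<f32>() / x.len() as f32 + eps).sqrt();
    x.iter().zip(weight).map(|(xi, wi)| xi / rms * wi).collect()
}

fn main() {
    // Identity weight matrix and zero bias: linear_forward returns x unchanged.
    let x = vec![1.0_f32, 2.0];
    let w = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    assert_eq!(linear_forward(&x, &w, &[0.0, 0.0]), vec![1.0, 2.0]);

    // A constant vector normalizes to (approximately) all ones under unit weights.
    let n = rms_norm(&[3.0, 3.0, 3.0], &[1.0, 1.0, 1.0], 1e-6);
    assert!(n.iter().all(|v| (v - 1.0).abs() < 1e-3));
}
```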
Traits§
- Module: Trait implemented by all NN modules.
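A trait shared by all modules can be pictured as a minimal forward interface. The sketch below is a hypothetical stand-in (the crate's actual trait, tensor type, and error handling may differ):

```rust
// Hypothetical minimal Module trait: every NN building block exposes a
// forward pass from input to output. Illustrative only; the real trait
// in the crate may take a Tensor and return a Result.
trait Module {
    fn forward(&self, x: &[f32]) -> Vec<f32>;
}

// A toy module implementing the trait: multiplies its input by a constant.
struct Scale {
    factor: f32,
}

impl Module for Scale {
    fn forward(&self, x: &[f32]) -> Vec<f32> {
        x.iter().map(|v| v * self.factor).collect()
    }
}

fn main() {
    let m = Scale { factor: 2.0 };
    assert_eq!(m.forward(&[1.0, 2.0]), vec![2.0, 4.0]);
}
```

Defining forward() on a shared trait lets layers be stored behind `Box<dyn Module>` and composed generically, which is the usual reason such a trait exists.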