Module llms_from_scratch_rs::examples::ch03


Examples from Chapter 3

Modules

  • Auxiliary module for examples::ch03

Structs

  • Computing attention scores as a dot product
  • Manual computation of multiple context vectors simultaneously
  • Implementing the self-attention mechanism with trainable weights
  • Example usage of SelfAttentionV1 to compute context vectors
  • Example usage of SelfAttentionV2 to compute context vectors
  • Compute causal attention weights
  • Compute causal attention weights more efficiently with f32::NEG_INFINITY
  • Dropout on attention weights
  • Example usage of CausalAttention
  • Example usage of MultiHeadAttentionWrapper
  • Example usage of MultiHeadAttention
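The examples above build from dot-product attention scores to causal masking with `f32::NEG_INFINITY`. A minimal plain-Rust sketch of that pipeline is shown below; it uses `Vec<f32>` instead of the crate's tensor types, and the helper names (`dot`, `softmax`, `causal_attention`) are illustrative, not part of the crate's API.

```rust
// Minimal sketch of causal dot-product attention over a toy sequence.
// Shapes and helper names are illustrative, not the crate's API.

fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

/// Numerically stable softmax over one row of scores;
/// entries masked with f32::NEG_INFINITY receive weight 0.
fn softmax(scores: &[f32]) -> Vec<f32> {
    let max = scores.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = scores.iter().map(|s| (s - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

/// Causal attention: token i may only attend to tokens 0..=i.
/// Future positions are masked with f32::NEG_INFINITY before the softmax,
/// so a single softmax pass both normalizes and zeroes them out.
fn causal_attention(inputs: &[Vec<f32>]) -> Vec<Vec<f32>> {
    let n = inputs.len();
    (0..n)
        .map(|i| {
            let scores: Vec<f32> = (0..n)
                .map(|j| {
                    if j <= i {
                        dot(&inputs[i], &inputs[j]) // attention score as a dot product
                    } else {
                        f32::NEG_INFINITY // mask future tokens
                    }
                })
                .collect();
            let weights = softmax(&scores);
            // Context vector: weighted sum of the input vectors.
            let dim = inputs[0].len();
            (0..dim)
                .map(|d| weights.iter().zip(inputs).map(|(w, v)| w * v[d]).sum())
                .collect()
        })
        .collect()
}

fn main() {
    let inputs = vec![
        vec![0.43, 0.15, 0.89],
        vec![0.55, 0.87, 0.66],
        vec![0.57, 0.85, 0.64],
    ];
    let ctx = causal_attention(&inputs);
    // The first context vector equals the first input,
    // since token 0 can only attend to itself.
    println!("{:?}", ctx[0]);
}
```

The crate's structs add what this sketch omits: trainable query/key/value weight matrices (`SelfAttentionV1`/`V2`), dropout on the attention weights, and the multi-head variants that run several such attention operations in parallel.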