Module regularization

Regularization techniques for neural network training.

§Techniques

  • Mixup: Interpolate between training samples
  • Label Smoothing: Soft targets instead of hard labels
  • CutMix: Cut and paste patches between images

Structs§

CutMix
CutMix data augmentation (Yun et al., 2019).
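This page does not show the struct's fields or methods, so the following is only a minimal, dependency-free sketch of the CutMix idea on a flat grayscale buffer: paste a rectangular patch from image B into image A and weight the labels by the surviving area. The function name, signature, and layout are illustrative, not this crate's API.

```rust
/// Sketch of CutMix (Yun et al., 2019) on a row-major h*w image buffer.
/// Illustrative only; not the crate's actual API.
fn cutmix(
    a: &[f32],                 // image A
    b: &[f32],                 // image B, same shape
    h: usize,
    w: usize,
    (y0, x0): (usize, usize),  // top-left corner of the patch
    (ph, pw): (usize, usize),  // patch height and width
) -> (Vec<f32>, f32) {
    assert_eq!(a.len(), h * w);
    assert_eq!(b.len(), h * w);
    let mut mixed = a.to_vec();
    for y in y0..(y0 + ph).min(h) {
        for x in x0..(x0 + pw).min(w) {
            mixed[y * w + x] = b[y * w + x]; // paste the patch from B
        }
    }
    // Label weight for A is the fraction of pixels still coming from A.
    let patch = ((y0 + ph).min(h) - y0) * ((x0 + pw).min(w) - x0);
    let lambda = 1.0 - patch as f32 / (h * w) as f32;
    (mixed, lambda)
}

fn main() {
    let a = vec![0.0; 16]; // 4x4 image of zeros
    let b = vec![1.0; 16]; // 4x4 image of ones
    let (mixed, lambda) = cutmix(&a, &b, 4, 4, (1, 1), (2, 2));
    println!("lambda = {lambda}, mixed = {mixed:?}"); // lambda = 0.75
}
```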
CutMixParams
Parameters for a CutMix operation.
LabelSmoothing
Label smoothing for soft targets. Converts a hard label y to (1-ε)y + ε/K, where ε is the smoothing factor and K is the number of classes.
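A minimal sketch of the (1-ε)y + ε/K conversion for a single hard class index; the function name and signature are illustrative, not this crate's API.

```rust
/// Turn a hard class index into a soft target vector (1-eps)*y + eps/K.
/// Illustrative only; not the crate's actual API.
fn smooth_labels(class: usize, num_classes: usize, eps: f32) -> Vec<f32> {
    let uniform = eps / num_classes as f32;
    let mut target = vec![uniform; num_classes];
    target[class] += 1.0 - eps; // the true class keeps most of the mass
    target
}

fn main() {
    // eps = 0.1, K = 4: true class gets 0.925, the others 0.025 each.
    println!("{:?}", smooth_labels(2, 4, 0.1));
}
```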
Mixup
Mixup data augmentation (Zhang et al., 2018). Creates virtual training examples x′ = λx_i + (1-λ)x_j, mixing the labels by the same λ.
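A minimal sketch of that interpolation applied to both inputs and soft labels. In the paper λ is drawn from Beta(α, α); here it is passed in so the example stays dependency-free. Names and signature are illustrative, not this crate's API.

```rust
/// Sketch of mixup (Zhang et al., 2018): x' = λ·x_i + (1-λ)·x_j,
/// applied to both inputs and labels. Illustrative only.
fn mixup(xi: &[f32], xj: &[f32], yi: &[f32], yj: &[f32], lambda: f32) -> (Vec<f32>, Vec<f32>) {
    let mix = |a: &[f32], b: &[f32]| {
        a.iter()
            .zip(b)
            .map(|(&u, &v)| lambda * u + (1.0 - lambda) * v)
            .collect::<Vec<f32>>()
    };
    (mix(xi, xj), mix(yi, yj))
}

fn main() {
    let (x, y) = mixup(&[1.0, 0.0], &[0.0, 1.0], &[1.0, 0.0], &[0.0, 1.0], 0.7);
    println!("x = {x:?}, y = {y:?}"); // x = [0.7, 0.3], y = [0.7, 0.3]
}
```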
RDrop
R-Drop regularization (Liang et al., 2021).
RandAugment
RandAugment: Automated data augmentation policy (Cubuk et al., 2020).
SpecAugment
SpecAugment: Data augmentation for speech recognition (Park et al., 2019).
StochasticDepth
Stochastic Depth (Huang et al., 2016).
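The mechanism is not spelled out on this page; the published technique drops an entire residual branch at random during training and scales it by the survival probability at inference. A minimal, dependency-free sketch of that idea, with the Bernoulli sample passed in as a flag; the name and signature are illustrative, not this crate's API.

```rust
/// Sketch of stochastic depth (Huang et al., 2016): during training the
/// residual branch f(x) is skipped with probability 1 - p_survive; at
/// inference it is kept and scaled by p_survive. `drop_coin` stands in
/// for a Bernoulli sample. Illustrative only; not the crate's actual API.
fn stochastic_depth(
    x: &[f32],
    branch: impl Fn(&[f32]) -> Vec<f32>,
    p_survive: f32,
    training: bool,
    drop_coin: bool, // true => drop the branch this step (training only)
) -> Vec<f32> {
    if training && drop_coin {
        return x.to_vec(); // identity: the block is skipped
    }
    let scale = if training { 1.0 } else { p_survive };
    x.iter()
        .zip(branch(x))
        .map(|(&xi, fi)| xi + scale * fi)
        .collect()
}

fn main() {
    let double = |v: &[f32]| v.iter().map(|&a| 2.0 * a).collect::<Vec<f32>>();
    // Inference: x + 0.8 * f(x) = [2.6, 5.2]
    println!("{:?}", stochastic_depth(&[1.0, 2.0], double, 0.8, false, false));
}
```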

Enums§

AugmentationType
Types of image augmentations.
DropMode

Functions§

cross_entropy_with_smoothing
Cross-entropy loss with label smoothing.
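The crate's actual signature is not shown here; the following is a minimal sketch of what such a loss computes for a single sample, combining the (1-ε)y + ε/K smoothed target with a numerically stable log-softmax. The name and signature are illustrative only.

```rust
/// Sketch of cross-entropy with label smoothing for one sample:
/// loss = -Σ_k target_k · log p_k, with p = softmax(logits) and
/// target = (1-eps)*y + eps/K. Illustrative only; not the crate's API.
fn cross_entropy_with_smoothing(logits: &[f32], class: usize, eps: f32) -> f32 {
    let k = logits.len() as f32;
    // Numerically stable log-sum-exp.
    let max = logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let log_sum: f32 = logits.iter().map(|&l| (l - max).exp()).sum::<f32>().ln() + max;
    logits
        .iter()
        .enumerate()
        .map(|(i, &l)| {
            let target = eps / k + if i == class { 1.0 - eps } else { 0.0 };
            -target * (l - log_sum) // log p_i = l_i - logsumexp(logits)
        })
        .sum()
}

fn main() {
    // With eps = 0.0 this reduces to the usual cross-entropy.
    println!("{}", cross_entropy_with_smoothing(&[2.0, 0.5, 0.1], 0, 0.1));
}
```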