Regularization techniques for neural network training.
§Techniques
- Mixup: Interpolate between training samples
- Label Smoothing: Soft targets instead of hard labels
- CutMix: Cut and paste patches between images
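The Mixup interpolation listed above can be sketched in plain Rust. This illustrates only the math; `mixup` here is a hypothetical helper, not this crate's API:

```rust
/// Interpolate two samples: x' = λ·x_i + (1 - λ)·x_j.
/// Hypothetical helper for illustration; not this crate's API.
fn mixup(x_i: &[f32], x_j: &[f32], lambda: f32) -> Vec<f32> {
    x_i.iter()
        .zip(x_j.iter())
        .map(|(a, b)| lambda * a + (1.0 - lambda) * b)
        .collect()
}

fn main() {
    // λ is typically drawn from Beta(α, α); fixed here for illustration.
    let mixed = mixup(&[1.0, 0.0], &[0.0, 1.0], 0.7);
    println!("{:?}", mixed);
}
```

In practice the same λ is also used to interpolate the two samples' labels.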
Structs§
- CutMix — CutMix data augmentation (Yun et al., 2019).
- CutMixParams — Parameters for a CutMix operation.
- LabelSmoothing — Label smoothing for soft targets. Converts hard labels to: (1-ε)y + ε/K
- Mixup — Mixup data augmentation (Zhang et al., 2018). Creates virtual training examples: x' = λx_i + (1-λ)x_j
- RDrop — R-Drop regularization (Liang et al., 2021).
- RandAugment — RandAugment: Automated data augmentation policy (Cubuk et al., 2020).
- SpecAugment — SpecAugment: Data augmentation for speech recognition (Park et al., 2019).
- StochasticDepth — Stochastic Depth (Huang et al., 2016).
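The label-smoothing mapping (1-ε)y + ε/K from the struct list can be sketched as follows. `smooth_labels` is a hypothetical helper, not the crate's `LabelSmoothing` API:

```rust
/// Convert a hard class index into a smoothed target vector:
/// (1 - ε)·y + ε/K. Hypothetical helper, not this crate's API.
fn smooth_labels(class: usize, num_classes: usize, eps: f32) -> Vec<f32> {
    let uniform = eps / num_classes as f32;
    (0..num_classes)
        .map(|k| if k == class { (1.0 - eps) + uniform } else { uniform })
        .collect()
}

fn main() {
    // Class 0 of K = 4 with ε = 0.1: on-class weight is
    // (1-ε) + ε/K = 0.925, each off-class weight is ε/K = 0.025.
    println!("{:?}", smooth_labels(0, 4, 0.1));
}
```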
Enums§
- AugmentationType — Types of image augmentations.
- DropMode
Functions§
- cross_entropy_with_smoothing — Cross-entropy loss with label smoothing.
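A minimal sketch of cross-entropy against smoothed targets, computed from log-probabilities. The actual `cross_entropy_with_smoothing` signature in this crate may differ; `cross_entropy_smoothed` below is an illustrative stand-in:

```rust
/// Cross-entropy against smoothed targets (1 - ε)·y + ε/K,
/// given per-class log-probabilities. Illustrative sketch only;
/// the crate's `cross_entropy_with_smoothing` may differ.
fn cross_entropy_smoothed(log_probs: &[f32], class: usize, eps: f32) -> f32 {
    let k = log_probs.len() as f32;
    log_probs
        .iter()
        .enumerate()
        .map(|(i, lp)| {
            let target = if i == class { (1.0 - eps) + eps / k } else { eps / k };
            -target * lp
        })
        .sum()
}

fn main() {
    // Uniform log-probs over two classes: the loss is ln 2 for any ε,
    // since the smoothed targets still sum to 1.
    let lp = [(0.5f32).ln(), (0.5f32).ln()];
    println!("{}", cross_entropy_smoothed(&lp, 0, 0.1));
}
```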