Parallel optimization algorithms for multi-threaded training.
This module provides thread-safe optimizers that can leverage multiple CPU cores for parallel parameter updates, improving performance on multi-core systems.
§Key Features
- Thread-Safe State Management: Lock-free and fine-grained locking strategies
- Parallel Parameter Updates: Distribute parameter updates across threads
- Work Stealing: Dynamic load balancing for uneven parameter distributions
- NUMA Awareness: Optimize for Non-Uniform Memory Access architectures
- Scalability: Efficient scaling from 2 to 64+ cores
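To illustrate the core idea of distributing parameter updates across threads, here is a minimal sketch using scoped threads from the standard library. The function name and the plain SGD rule `p -= lr * g` are illustrative assumptions, not this crate's API; the crate's optimizers apply the same chunk-per-thread pattern to richer update rules.

```rust
use std::thread;

/// Split the parameter and gradient slices into one chunk per thread and
/// apply an SGD-style update `p -= lr * g` to each chunk concurrently.
/// Hypothetical sketch; not part of this crate's public API.
fn parallel_sgd_update(params: &mut [f32], grads: &[f32], lr: f32, n_threads: usize) {
    if params.is_empty() {
        return;
    }
    // Ceiling division so every element lands in some chunk.
    let chunk = (params.len() + n_threads - 1) / n_threads;
    thread::scope(|s| {
        for (p_chunk, g_chunk) in params.chunks_mut(chunk).zip(grads.chunks(chunk)) {
            s.spawn(move || {
                for (p, g) in p_chunk.iter_mut().zip(g_chunk) {
                    *p -= lr * g;
                }
            });
        }
    });
}

fn main() {
    let mut params = vec![1.0_f32; 8];
    let grads = vec![0.5_f32; 8];
    parallel_sgd_update(&mut params, &grads, 0.1, 4);
    // Each parameter becomes 1.0 - 0.1 * 0.5 = 0.95.
    assert!(params.iter().all(|&p| (p - 0.95).abs() < 1e-6));
}
```

Because the chunks are disjoint `&mut` borrows, no locking is needed for this style of update; locks only enter the picture when threads share optimizer state, as in the structs below.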
Structs§
- ParallelAdam - Parallel Adam optimizer with multi-threaded parameter updates.
- ParallelConfig - Configuration for parallel optimization.
- ParallelOptimizerState - Thread-safe optimizer state with fine-grained locking.
- ParallelStats - Performance statistics for parallel optimization.
- ParameterState - Individual parameter state with momentum and variance.
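The fine-grained locking mentioned for the optimizer state can be sketched as one lock per parameter's momentum entry, so threads touching different parameters never contend on a single global lock. The struct and method names here are assumptions for illustration, not this crate's actual types.

```rust
use std::sync::Mutex;
use std::thread;

/// Hypothetical sketch: each parameter's momentum lives behind its own
/// Mutex, giving fine-grained locking instead of one coarse state lock.
struct ShardedState {
    momentum: Vec<Mutex<f32>>,
}

impl ShardedState {
    fn new(n: usize) -> Self {
        Self { momentum: (0..n).map(|_| Mutex::new(0.0)).collect() }
    }

    /// Momentum update `m = beta * m + grad` for one parameter; only that
    /// parameter's lock is held, so other parameters update concurrently.
    fn update(&self, idx: usize, grad: f32, beta: f32) -> f32 {
        let mut m = self.momentum[idx].lock().unwrap();
        *m = beta * *m + grad;
        *m
    }
}

fn main() {
    let state = ShardedState::new(4);
    thread::scope(|s| {
        for i in 0..4 {
            let state = &state;
            s.spawn(move || {
                state.update(i, 1.0, 0.9);
                state.update(i, 1.0, 0.9);
            });
        }
    });
    // Two updates from m = 0: m = 1.0, then m = 0.9 * 1.0 + 1.0 = 1.9.
    assert!((state.update(0, 0.0, 1.0) - 1.9).abs() < 1e-6);
}
```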
Traits§
- BatchUpdate - Batch parameter update interface for better parallelization.
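A batch-update interface parallelizes better than per-parameter calls because the optimizer sees all (parameter, gradient) pairs at once and can fan them out to worker threads internally. The trait signature and the `PlainSgd` implementor below are illustrative assumptions; consult the crate's `BatchUpdate` docs for the real interface.

```rust
/// Hypothetical shape of a batch-update trait: the caller hands over every
/// parameter tensor and its gradient in one call.
trait BatchUpdate {
    fn batch_update(&mut self, params: &mut [Vec<f32>], grads: &[Vec<f32>]);
}

struct PlainSgd {
    lr: f32,
}

impl BatchUpdate for PlainSgd {
    fn batch_update(&mut self, params: &mut [Vec<f32>], grads: &[Vec<f32>]) {
        // A real implementation could distribute these tensors across a
        // thread pool; a sequential loop keeps the sketch short.
        for (p, g) in params.iter_mut().zip(grads) {
            for (pi, gi) in p.iter_mut().zip(g) {
                *pi -= self.lr * gi;
            }
        }
    }
}

fn main() {
    let mut params = vec![vec![1.0_f32; 3], vec![2.0_f32; 3]];
    let grads = vec![vec![1.0_f32; 3], vec![1.0_f32; 3]];
    PlainSgd { lr: 0.1 }.batch_update(&mut params, &grads);
    assert!((params[0][0] - 0.9).abs() < 1e-6);
    assert!((params[1][0] - 1.9).abs() < 1e-6);
}
```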