
Module hyperparameter_tuning


§Automated Hyperparameter Tuning Framework

This module provides automated hyperparameter optimization for all TrustformeRS optimizers, using modern techniques including Bayesian optimization, the Tree-structured Parzen Estimator (TPE), and multi-objective optimization.
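The TPE idea referenced above can be sketched in a few lines: past trials are split at a loss quantile into "good" and "bad" sets, each set is modeled with a kernel density estimate, and the next candidate maximizes the ratio of the two densities. The following is a self-contained 1-D illustration, not this module's actual API; all names, the fixed bandwidth, and the toy data are assumptions.

```rust
/// Gaussian kernel density estimate of `points` evaluated at `x`.
fn gaussian_kde(points: &[f64], x: f64, bandwidth: f64) -> f64 {
    points
        .iter()
        .map(|&p| {
            let z = (x - p) / bandwidth;
            (-0.5 * z * z).exp() / (bandwidth * (2.0 * std::f64::consts::PI).sqrt())
        })
        .sum::<f64>()
        / points.len() as f64
}

/// Suggest the next hyperparameter value from a `(value, loss)` history
/// (needs at least two trials). `gamma` is the "good" quantile.
fn tpe_suggest(history: &[(f64, f64)], candidates: &[f64], gamma: f64) -> f64 {
    let mut sorted: Vec<(f64, f64)> = history.to_vec();
    sorted.sort_by(|a, b| a.1.partial_cmp(&b.1).unwrap());
    // Split into "good" (low-loss) and "bad" trials at the gamma quantile.
    let split = ((gamma * sorted.len() as f64).ceil() as usize)
        .max(1)
        .min(sorted.len() - 1);
    let good: Vec<f64> = sorted[..split].iter().map(|t| t.0).collect();
    let bad: Vec<f64> = sorted[split..].iter().map(|t| t.0).collect();
    // Pick the candidate maximizing l(x) / g(x).
    *candidates
        .iter()
        .max_by(|&&a, &&b| {
            let score =
                |x: f64| gaussian_kde(&good, x, 0.1) / (gaussian_kde(&bad, x, 0.1) + 1e-12);
            score(a).partial_cmp(&score(b)).unwrap()
        })
        .unwrap()
}

fn main() {
    // Toy history: learning rates whose loss is lowest near 0.3.
    let history = [(0.1, 0.9), (0.2, 0.5), (0.3, 0.1), (0.4, 0.4), (0.8, 0.95)];
    let candidates: Vec<f64> = (0..50).map(|i| i as f64 / 50.0).collect();
    let next = tpe_suggest(&history, &candidates, 0.3);
    // The suggestion lands near the low-loss region and away from bad trials.
    assert!(next > 0.2 && next < 0.7);
}
```

Note that TPE tends to propose points where the good-density is high relative to the bad-density, which may lie slightly outside the best observed values; that is the intended exploration behavior.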

§Key Features

  • Bayesian Optimization: Uses Gaussian processes for efficient hyperparameter search
  • Multi-Objective Optimization: Simultaneously optimizes convergence speed and stability
  • Adaptive Sampling: Intelligent exploration vs exploitation balance
  • Transfer Learning: Leverages previous optimization results across tasks
  • Ensemble Methods: Combines multiple tuning strategies for robustness
  • Real-time Adaptation: Adjusts hyperparameters during training based on performance
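One way to read the multi-objective feature above: trials reporting both convergence speed and stability can be compared by Pareto dominance rather than collapsed into a single score. The sketch below is illustrative only; the struct and function names are assumptions, not this crate's API.

```rust
/// Two objectives per trial, both minimized (illustrative names).
#[derive(Clone, Copy, Debug, PartialEq)]
struct Objectives {
    convergence_time: f64,
    instability: f64,
}

/// `a` dominates `b` if it is no worse on both objectives and strictly
/// better on at least one.
fn dominates(a: Objectives, b: Objectives) -> bool {
    a.convergence_time <= b.convergence_time
        && a.instability <= b.instability
        && (a.convergence_time < b.convergence_time || a.instability < b.instability)
}

/// Keep only the non-dominated (Pareto-optimal) trials.
fn pareto_front(trials: &[Objectives]) -> Vec<Objectives> {
    trials
        .iter()
        .copied()
        .filter(|&t| !trials.iter().any(|&o| dominates(o, t)))
        .collect()
}

fn main() {
    let trials = [
        Objectives { convergence_time: 100.0, instability: 0.1 },
        Objectives { convergence_time: 60.0, instability: 0.3 },
        // Dominated by the second trial: slower and less stable.
        Objectives { convergence_time: 80.0, instability: 0.4 },
    ];
    let front = pareto_front(&trials);
    assert_eq!(front.len(), 2);
}
```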

§Supported Optimizers

Works with all TrustformeRS optimizers, including aMacP, NovoGrad, Adam, AdamW, LAMB, Lion, Sophia, and 40+ other variants.
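For illustration, per-optimizer search configuration might key off an enum like the OptimizerType listed below. The variant names and learning-rate ranges here are assumptions for the sketch, not this crate's actual definitions.

```rust
// Hypothetical variant names covering a few of the optimizers named above.
#[allow(dead_code)]
#[derive(Debug, Clone, Copy, PartialEq)]
enum OptimizerType {
    AMacP,
    NovoGrad,
    Adam,
    AdamW,
    Lamb,
    Lion,
    Sophia,
}

/// Illustrative learning-rate search ranges `(low, high)` per optimizer.
fn lr_search_range(opt: OptimizerType) -> (f64, f64) {
    match opt {
        OptimizerType::Adam | OptimizerType::AdamW | OptimizerType::Lamb => (1e-5, 1e-2),
        // Sign-based / second-order-style optimizers are usually tuned at
        // smaller learning rates than Adam.
        OptimizerType::Lion | OptimizerType::Sophia => (1e-6, 1e-3),
        OptimizerType::NovoGrad => (1e-4, 1e-1),
        OptimizerType::AMacP => (1e-5, 1e-2),
    }
}

fn main() {
    let (low, high) = lr_search_range(OptimizerType::Lion);
    assert!(low < high);
}
```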

Structs§

BayesianOptimizer
Bayesian optimization state using a Tree-structured Parzen Estimator (TPE)
HyperparameterSample
Individual hyperparameter configuration sample
HyperparameterSpace
Hyperparameter search space definition
HyperparameterTuner
Complete hyperparameter tuning framework
MultiObjectiveOptimizer
Multi-objective hyperparameter optimizer
OptimizationTask
Training task definition for hyperparameter optimization
PerformanceMetrics
Performance metrics for hyperparameter evaluation
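As a rough picture of how a search-space definition and a drawn sample could relate (in the spirit of HyperparameterSpace and HyperparameterSample above), the following self-contained sketch defines named log-uniform ranges and samples from them with a tiny deterministic generator. None of these names are this crate's actual API.

```rust
/// One log-uniform dimension of the search space (illustrative).
struct LogUniform {
    name: &'static str,
    low: f64,
    high: f64,
}

/// A named collection of dimensions (illustrative).
struct Space {
    dims: Vec<LogUniform>,
}

impl Space {
    /// Draw one sample per dimension: u ~ U(0,1) mapped through
    /// exp(ln(low) + u * (ln(high) - ln(low))).
    /// Uses a small linear congruential generator so the example needs
    /// no external crates.
    fn sample(&self, seed: &mut u64) -> Vec<(&'static str, f64)> {
        self.dims
            .iter()
            .map(|d| {
                *seed = seed
                    .wrapping_mul(6364136223846793005)
                    .wrapping_add(1442695040888963407);
                let u = (*seed >> 11) as f64 / (1u64 << 53) as f64;
                let v = (d.low.ln() + u * (d.high.ln() - d.low.ln())).exp();
                (d.name, v)
            })
            .collect()
    }
}

fn main() {
    let space = Space {
        dims: vec![
            LogUniform { name: "learning_rate", low: 1e-5, high: 1e-1 },
            LogUniform { name: "weight_decay", low: 1e-6, high: 1e-2 },
        ],
    };
    let mut seed = 42;
    // Every drawn value stays inside its declared range.
    for (name, v) in space.sample(&mut seed) {
        let d = space.dims.iter().find(|d| d.name == name).unwrap();
        assert!(v >= d.low && v <= d.high);
    }
}
```

Log-uniform sampling is the usual choice for scale-sensitive hyperparameters such as learning rates, since it spreads trials evenly across orders of magnitude.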

Enums§

OptimizerType
Selector for the supported TrustformeRS optimizer variants
TaskType
Category of training task used in hyperparameter optimization