End-to-end knowledge distillation CLI.
This crate provides a complete pipeline for knowledge distillation:
- Fetch teacher models from HuggingFace
- Configure distillation parameters via YAML
- Train student models with progressive/attention distillation
- Export to SafeTensors, GGUF, or APR formats
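Since the pipeline is driven by a YAML file, a minimal config might look like the fragment below. The field names here are illustrative assumptions, not the crate's actual schema; see the `config` module for the real `DistillConfig` fields.

```yaml
# Hypothetical distillation config (field names are assumptions)
teacher: org/teacher-model-1.3b     # Hugging Face model id
student: org/student-model-125m
distillation:
  temperature: 2.0                  # softmax temperature for soft targets
  alpha: 0.5                        # weight between soft and hard loss
  epochs: 3
export:
  format: safetensors               # safetensors | gguf | apr
```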
§Toyota Way Principles
- Jidoka: Pre-flight validation catches errors before expensive training
- Heijunka: Memory estimation enables level scheduling of GPU resources
- Kaizen: Configurable hyperparameters enable continuous improvement
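The Jidoka principle above means rejecting bad hyperparameters before any GPU time is spent. A minimal sketch of such a pre-flight check, with illustrative parameter names (not the crate's actual `ConfigValidator` API):

```rust
/// Hypothetical subset of distillation hyperparameters (names are
/// assumptions for illustration, not the crate's real config struct).
struct DistillParams {
    temperature: f32, // softmax temperature, must be positive
    alpha: f32,       // soft/hard loss mix, must lie in [0, 1]
    epochs: u32,      // must be at least 1
}

/// Pre-flight validation: fail fast with a descriptive error
/// instead of discovering the problem mid-training.
fn validate(p: &DistillParams) -> Result<(), String> {
    if p.temperature <= 0.0 {
        return Err("temperature must be positive".into());
    }
    if !(0.0..=1.0).contains(&p.alpha) {
        return Err("alpha must be in [0, 1]".into());
    }
    if p.epochs == 0 {
        return Err("epochs must be at least 1".into());
    }
    Ok(())
}

fn main() {
    let good = DistillParams { temperature: 2.0, alpha: 0.5, epochs: 3 };
    assert!(validate(&good).is_ok());

    let bad = DistillParams { temperature: 0.0, alpha: 0.5, epochs: 3 };
    assert!(validate(&bad).is_err());
    println!("pre-flight validation ok");
}
```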
Re-exports§
pub use config::DistillConfig;
pub use pipeline::Pipeline;
pub use pipeline::PipelineResult;
pub use validation::ConfigValidator;
Modules§
- config
- Distillation configuration parsing and management.
- pipeline
- Distillation pipeline execution (Heijunka - level scheduling).
- validation
- Configuration validation (Jidoka - built-in quality).
Structs§
- MemoryEstimate
- Memory estimation result.
Functions§
- estimate_memory
- Estimate memory requirements without running training.
- run
- Run the distillation pipeline with the given configuration.
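The kind of back-of-envelope arithmetic behind `estimate_memory` can be sketched as follows. This is an illustrative model only, not the crate's actual formula: fp32 Adam training needs roughly weights + gradients + two optimizer moments (four copies of the weights), while the teacher only needs its weights held for inference.

```rust
/// Illustrative stand-in for the crate's MemoryEstimate result.
struct MemoryEstimate {
    student_train_bytes: u64,
    teacher_infer_bytes: u64,
}

/// Rough fp32 estimate (assumption, not the crate's real formula):
/// student = weights + grads + Adam m and v = 4 copies of the weights;
/// teacher = one inference-only copy of its weights.
fn estimate_memory(student_params: u64, teacher_params: u64) -> MemoryEstimate {
    const FP32_BYTES: u64 = 4; // bytes per fp32 parameter
    MemoryEstimate {
        student_train_bytes: student_params * FP32_BYTES * 4,
        teacher_infer_bytes: teacher_params * FP32_BYTES,
    }
}

fn main() {
    // Example: a 125M-parameter student distilled from a 1.3B teacher.
    let est = estimate_memory(125_000_000, 1_300_000_000);
    assert_eq!(est.student_train_bytes, 2_000_000_000); // ~2 GB for training
    assert_eq!(est.teacher_infer_bytes, 5_200_000_000); // ~5.2 GB for the teacher
    println!("estimate ok");
}
```

Running this kind of estimate up front is what lets the pipeline level-schedule GPU work (Heijunka) instead of failing with an out-of-memory error mid-run.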