oxicuda-quant 0.1.3

GPU-accelerated quantization and model compression engine for OxiCUDA
//! # Knowledge Distillation
//!
//! Compress models by transferring knowledge from a large teacher to a small student.
//!
//! | Module     | Contents                                                |
//! |------------|---------------------------------------------------------|
//! | `loss`     | [`DistilLoss`] — KL, MSE, cosine, combined losses       |
//! | `response` | [`ResponseDistiller`] — soft + hard label training      |
//! | `feature`  | [`FeatureDistiller`] — intermediate activation matching |

pub mod feature;
pub mod loss;
pub mod response;

pub use feature::FeatureDistiller;
pub use loss::{DistilLoss, DistilLossType};
pub use response::ResponseDistiller;
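
The combined loss named in the table above is conventionally a temperature-softened KL term (scaled by T²) blended with hard-label cross-entropy. A minimal CPU sketch of that formula follows; the names `distil_loss`, `alpha`, and `temperature` are illustrative only and not the `oxicuda-quant` API:

```rust
// Sketch of a response-distillation loss, assuming the standard recipe:
//   alpha * T^2 * KL(teacher || student)  +  (1 - alpha) * CE(hard label).
// Function and parameter names here are hypothetical, not oxicuda-quant's.

/// Temperature-scaled softmax, stabilized by subtracting the max logit.
fn softmax(logits: &[f32], temperature: f32) -> Vec<f32> {
    let max = logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = logits
        .iter()
        .map(|&z| ((z - max) / temperature).exp())
        .collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

/// KL(p || q) over two probability distributions; zero-probability
/// teacher entries contribute nothing.
fn kl_div(p: &[f32], q: &[f32]) -> f32 {
    p.iter()
        .zip(q)
        .filter(|(&pi, _)| pi > 0.0)
        .map(|(&pi, &qi)| pi * (pi / qi).ln())
        .sum()
}

/// Blended distillation loss: soft KL term (scaled by T^2 so gradient
/// magnitudes stay comparable across temperatures) plus hard-label
/// cross-entropy at temperature 1.
fn distil_loss(
    teacher_logits: &[f32],
    student_logits: &[f32],
    hard_label: usize,
    temperature: f32,
    alpha: f32,
) -> f32 {
    let p_teacher = softmax(teacher_logits, temperature);
    let p_student = softmax(student_logits, temperature);
    let soft = kl_div(&p_teacher, &p_student) * temperature * temperature;
    let ce = -softmax(student_logits, 1.0)[hard_label].ln();
    alpha * soft + (1.0 - alpha) * ce
}

fn main() {
    let teacher = [4.0, 1.0, 0.5];
    let student = [3.0, 1.5, 0.5];
    let loss = distil_loss(&teacher, &student, 0, 2.0, 0.7);
    println!("distillation loss = {loss:.4}");
}
```

With `alpha = 1.0` only the soft targets drive training; as `alpha` decreases, the ground-truth label regains weight, which is the trade-off a `ResponseDistiller` would expose.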