//! Neural network pruning integration for Entrenar
//!
//! This module provides training-time pruning capabilities that integrate
//! with the Aprender pruning primitives. It implements:
//!
//! - **Pruning Schedules**: OneShot, Gradual, and Cubic sparsity schedules
//! - **Pruning Callback**: Integration with the training loop
//! - **Calibration Pipeline**: Activation collection for Wanda/SparseGPT
//! - **Prune-Finetune Pipeline**: End-to-end pruning workflow
//!
//! # Toyota Way Principles
//!
//! - **Kaizen** (Continuous Improvement): Gradual pruning schedules
//! - **Jidoka** (Quality at Source): Validates sparsity at each step
//! - **Genchi Genbutsu** (Go and See): Uses real activation data
//!
//! # Example
//!
//! ```ignore
//! use entrenar::prune::{PruningSchedule, PruningCallback, PruningConfig};
//! use aprender::pruning::{MagnitudeImportance, WandaPruner};
//!
//! let schedule = PruningSchedule::Gradual {
//!     start_step: 1000,
//!     end_step: 10000,
//!     initial_sparsity: 0.0,
//!     final_sparsity: 0.5,
//!     frequency: 100,
//! };
//!
//! let config = PruningConfig::default()
//!     .with_schedule(schedule)
//!     .with_target_sparsity(0.5);
//!
//! let callback = PruningCallback::new(config);
//! trainer.add_callback(callback);
//! ```
//!
//! # References
//!
//! - Han, S., et al. (2015). Learning both weights and connections. NeurIPS.
//! - Sun, M., et al. (2023). A simple and effective pruning approach. arXiv:2306.11695.
//! - Zhu, M., & Gupta, S. (2017). To prune, or not to prune. arXiv:1710.01878.
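The Wanda criterion from Sun et al. (2023), referenced above, scores each weight by its magnitude times the L2 norm of the corresponding input activation collected during calibration. The following is a minimal illustrative sketch of that scoring rule, not the actual `aprender::pruning::WandaPruner` API; the function name and types are assumptions for demonstration.

```rust
/// Illustrative Wanda importance: score[i][j] = |W[i][j]| * ||X[:, j]||_2.
/// `weights` is a row-major weight matrix; `activation_norms[j]` is the L2
/// norm of input feature `j` gathered over a calibration set. Hypothetical
/// signature, not the crate's real interface.
fn wanda_scores(weights: &[Vec<f64>], activation_norms: &[f64]) -> Vec<Vec<f64>> {
    weights
        .iter()
        .map(|row| {
            row.iter()
                .zip(activation_norms)
                // Weight magnitude scaled by the input activation norm.
                .map(|(w, n)| w.abs() * n)
                .collect()
        })
        .collect()
}
```

Weights with the lowest scores are the candidates removed to reach the target sparsity.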
pub use PruningCallback;
pub use PruningConfig;
pub use PruningSchedule;
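For reference, the gradual (cubic) sparsity ramp from Zhu & Gupta (2017), cited in the module docs, can be sketched as below. This is a standalone illustration of the schedule math, assuming parameter names matching the `Gradual` variant above; it is not the crate's implementation.

```rust
/// Cubic sparsity schedule (Zhu & Gupta, 2017): sparsity ramps from
/// `initial_sparsity` at `start_step` to `final_sparsity` at `end_step`,
/// following s(t) = s_f + (s_i - s_f) * (1 - progress)^3.
/// Illustrative only; not the entrenar API.
fn cubic_sparsity(
    step: u64,
    start_step: u64,
    end_step: u64,
    initial_sparsity: f64,
    final_sparsity: f64,
) -> f64 {
    if step <= start_step {
        return initial_sparsity;
    }
    if step >= end_step {
        return final_sparsity;
    }
    // Fraction of the pruning window elapsed, in [0, 1].
    let progress = (step - start_step) as f64 / (end_step - start_step) as f64;
    final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress).powi(3)
}
```

The cubic shape prunes aggressively early, while the network still has capacity to recover, and tapers off as it approaches the final sparsity.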