Model pruning for sparse neural networks
Pruning removes unnecessary weights and connections from neural networks, reducing model size and improving inference speed.
§Unstructured Pruning
This module provides comprehensive unstructured (element-wise) pruning with:
- Multiple importance methods: L1 norm, L2 norm, gradient-based, Taylor expansion, random
- Flexible mask creation: By threshold or percentage
- Iterative pruning: Gradual sparsity increase with fine-tuning
- Lottery Ticket Hypothesis: Weight rewinding for finding winning tickets
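To make the magnitude-based option concrete, here is a standalone sketch (plain Rust, not the `oxigdal_ml` API) of what L1-norm importance with percentage-based mask creation amounts to: rank weights by absolute value and zero out the smallest fraction.

```rust
/// Sketch only: build a keep/prune mask by marking the `sparsity`
/// fraction of weights with the smallest L1 magnitude as pruned.
fn magnitude_mask(weights: &[f32], sparsity: f32) -> Vec<bool> {
    let n_prune = (weights.len() as f32 * sparsity).round() as usize;
    // Sort indices by |w| ascending; the smallest magnitudes are pruned first.
    let mut indices: Vec<usize> = (0..weights.len()).collect();
    indices.sort_by(|&a, &b| {
        weights[a].abs().partial_cmp(&weights[b].abs()).unwrap()
    });
    let mut mask = vec![true; weights.len()];
    for &i in &indices[..n_prune] {
        mask[i] = false; // pruned
    }
    mask
}

/// Apply a mask: kept weights pass through, pruned weights become zero.
fn apply_mask(weights: &[f32], mask: &[bool]) -> Vec<f32> {
    weights
        .iter()
        .zip(mask)
        .map(|(w, &keep)| if keep { *w } else { 0.0 })
        .collect()
}
```

The crate's `UnstructuredPruner` wraps this idea behind `PruningConfig` and `ImportanceMethod`; the sketch above is just the underlying arithmetic.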
§Example: Basic Unstructured Pruning
```rust
use oxigdal_ml::optimization::pruning::{
    UnstructuredPruner, WeightTensor, ImportanceMethod, PruningConfig,
};
use oxigdal_ml::error::Result;

fn main() -> Result<()> {
    // Create a 2x4 weight tensor
    let weights = WeightTensor::new(
        vec![0.1, -0.5, 0.3, -0.8, 0.2, -0.1, 0.7, -0.4],
        vec![2, 4],
        "layer1.weight".to_string(),
    );

    // Create a pruner with magnitude-based (L1 norm) importance
    let config = PruningConfig::builder()
        .sparsity_target(0.5)
        .build();
    let mut pruner = UnstructuredPruner::new(config, ImportanceMethod::L1Norm);

    // Prune the weights, yielding the pruned tensor and its binary mask
    let (pruned_weights, mask) = pruner.prune_tensor(&weights)?;
    Ok(())
}
```

§Example: Lottery Ticket Hypothesis
```rust
use oxigdal_ml::optimization::pruning::{
    UnstructuredPruner, WeightTensor, ImportanceMethod, PruningConfig, PruningSchedule,
};

fn main() {
    // Initial weights captured before training
    let initial_weights = vec![
        WeightTensor::new(vec![0.1, 0.2, 0.3, 0.4], vec![2, 2], "layer1".to_string()),
    ];

    // Create a pruner with lottery ticket support
    let config = PruningConfig::builder()
        .sparsity_target(0.5)
        .schedule(PruningSchedule::Iterative { iterations: 3 })
        .build();
    let mut pruner = UnstructuredPruner::new(config, ImportanceMethod::L1Norm);

    // Enable lottery ticket rewinding
    pruner.enable_lottery_ticket(initial_weights);

    // After training, rewind to the initial weights with the learned mask
    if let Some(rewound) = pruner.rewind_to_initial() {
        // Use the rewound weights to retrain from scratch
    }
}
```

Structs§
- GradientInfo - Gradient information for gradient-based pruning methods
- LotteryTicketState - State for Lottery Ticket Hypothesis rewinding
- NoOpFineTune - No-op fine-tuning callback (skips fine-tuning)
- PruningConfig - Pruning configuration
- PruningConfigBuilder - Builder for pruning configuration
- PruningMask - Pruning mask indicating which weights are kept or pruned
- PruningStats - Pruning statistics
- UnstructuredPruner - Unstructured pruner for element-wise weight removal
- WeightStatistics - Statistics about weight distribution
- WeightTensor - Weight tensor for pruning operations
Enums§
- ImportanceMethod - Importance score computation method for unstructured pruning
- MaskCreationMode - Mask creation mode for pruning
- PruningGranularity - Pruning granularity
- PruningSchedule - Pruning schedule
- PruningStrategy - Pruning strategy
Traits§
- FineTuneCallback - Fine-tuning callback for iterative pruning
Functions§
- compute_channel_importance - Computes channel importance for structured pruning
- compute_gradient_importance - Computes importance scores using gradient information
- compute_magnitude_importance - Computes importance scores for weights using magnitude
- compute_taylor_importance - Computes importance scores using Taylor expansion
- iterative_pruning - Applies iterative pruning with gradual sparsity increase
- prune_model - Prunes a model according to the configuration
- prune_weights_direct - Prunes weight tensors directly (in-memory operation)
- prune_weights_with_gradients - Prunes weight tensors with gradient information
- select_weights_to_prune - Selects weights to prune based on importance scores
- structured_pruning - Performs structured pruning (removes entire filters/channels)
- unstructured_pruning - Performs unstructured pruning (removes individual weights)
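For intuition on the gradual sparsity increase used by iterative pruning, here is a standalone sketch of the cubic ramp commonly used for gradual pruning schedules (Zhu & Gupta, 2017). This illustrates the general technique, not necessarily the exact formula `iterative_pruning` uses.

```rust
/// Sketch only: cubic sparsity schedule. Sparsity ramps from
/// `s_initial` at step 0 to `s_final` at `total_steps`, pruning
/// aggressively early and tapering off as the target is approached.
fn sparsity_at_step(s_initial: f32, s_final: f32, step: usize, total_steps: usize) -> f32 {
    let progress = step as f32 / total_steps as f32;
    s_final + (s_initial - s_final) * (1.0 - progress).powi(3)
}
```

With `PruningSchedule::Iterative { iterations: 3 }` and a 0.5 target, each iteration would prune to an intermediate sparsity from a ramp like this, with fine-tuning (via `FineTuneCallback`) between steps.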