Module pruning


Model pruning for sparse neural networks

Pruning removes unnecessary weights and connections from neural networks, reducing model size and improving inference speed.

§Unstructured Pruning

This module provides comprehensive unstructured (element-wise) pruning with:

  • Multiple importance methods: L1 norm, L2 norm, gradient-based, Taylor expansion, random
  • Flexible mask creation: By threshold or percentage
  • Iterative pruning: Gradual sparsity increase with fine-tuning
  • Lottery Ticket Hypothesis: Weight rewinding for finding winning tickets
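To make the magnitude-based importance and percentage-based mask creation concrete, here is a minimal, crate-independent sketch in plain Rust: rank weights by L1 magnitude `|w|` and zero out the least important fraction. This illustrates the concept only; it is not `oxigdal_ml`'s implementation.

```rust
// Zero out the `sparsity` fraction of weights with the smallest |w|,
// returning the pruned weights and the keep-mask (true = weight survives).
// Conceptual sketch only; not oxigdal_ml's internals.
fn l1_prune(weights: &[f32], sparsity: f64) -> (Vec<f32>, Vec<bool>) {
    let n_prune = (weights.len() as f64 * sparsity).round() as usize;

    // Rank indices by importance = |w|, ascending (least important first)
    let mut order: Vec<usize> = (0..weights.len()).collect();
    order.sort_by(|&a, &b| weights[a].abs().partial_cmp(&weights[b].abs()).unwrap());

    let mut mask = vec![true; weights.len()];
    for &i in &order[..n_prune] {
        mask[i] = false;
    }
    let pruned = weights
        .iter()
        .zip(&mask)
        .map(|(&w, &keep)| if keep { w } else { 0.0 })
        .collect();
    (pruned, mask)
}

fn main() {
    let weights = [0.1, -0.5, 0.3, -0.8, 0.2, -0.1, 0.7, -0.4];
    let (pruned, mask) = l1_prune(&weights, 0.5);
    // The four smallest-magnitude weights (0.1, -0.1, 0.2, 0.3) are zeroed
    println!("{pruned:?}");
    println!("{mask:?}");
}
```

The same ranking step is where the other importance methods differ: gradient-based and Taylor methods replace `|w|` with a score that also uses gradient information.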

§Example: Basic Unstructured Pruning

use oxigdal_ml::optimization::pruning::{
    UnstructuredPruner, WeightTensor, ImportanceMethod, PruningConfig
};
use oxigdal_ml::error::Result;

fn main() -> Result<()> {
    // Create a 2x4 weight tensor
    let weights = WeightTensor::new(
        vec![0.1, -0.5, 0.3, -0.8, 0.2, -0.1, 0.7, -0.4],
        vec![2, 4],
        "layer1.weight".to_string(),
    );

    // Create a pruner targeting 50% sparsity with L1-magnitude importance
    let config = PruningConfig::builder()
        .sparsity_target(0.5)
        .build();
    let mut pruner = UnstructuredPruner::new(config, ImportanceMethod::L1Norm);

    // Prune: returns the sparsified weights and the binary keep-mask
    let (pruned_weights, mask) = pruner.prune_tensor(&weights)?;
    Ok(())
}
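The example above selects weights by percentage (`sparsity_target`). The module also supports threshold-based mask creation; the sketch below shows that mode conceptually in plain Rust, independent of `MaskCreationMode` or any other `oxigdal_ml` API.

```rust
// Keep-mask by absolute-value cutoff: a weight survives iff |w| >= threshold.
// Unlike a percentage target, the resulting sparsity depends on the weight
// distribution. Conceptual sketch only; not oxigdal_ml's implementation.
fn threshold_mask(weights: &[f32], threshold: f32) -> Vec<bool> {
    weights.iter().map(|w| w.abs() >= threshold).collect()
}

fn main() {
    let weights = [0.1, -0.5, 0.3, -0.8, 0.2, -0.1, 0.7, -0.4];
    let mask = threshold_mask(&weights, 0.35);
    let sparsity = mask.iter().filter(|&&keep| !keep).count() as f64 / mask.len() as f64;
    println!("mask = {mask:?}, sparsity = {sparsity}");
}
```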

§Example: Lottery Ticket Hypothesis

use oxigdal_ml::optimization::pruning::{
    UnstructuredPruner, WeightTensor, ImportanceMethod, PruningConfig, PruningSchedule,
};

// Initial weights before training
let initial_weights = vec![
    WeightTensor::new(vec![0.1, 0.2, 0.3, 0.4], vec![2, 2], "layer1".to_string()),
];

// Create pruner with lottery ticket support
let config = PruningConfig::builder()
    .sparsity_target(0.5)
    .schedule(PruningSchedule::Iterative { iterations: 3 })
    .build();
let mut pruner = UnstructuredPruner::new(config, ImportanceMethod::L1Norm);

// Enable lottery ticket rewinding
pruner.enable_lottery_ticket(initial_weights);

// After training, you can rewind to initial weights with learned mask
if let Some(rewound) = pruner.rewind_to_initial() {
    // Use rewound weights for training from scratch
}
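The `PruningSchedule::Iterative { iterations: 3 }` setting above spreads the sparsity target across several prune/fine-tune rounds. How this crate interpolates the target is not documented here; the sketch below shows a simple linear ramp (the cubic schedule of Zhu & Gupta, 2017, is another common choice) purely as an illustration of the idea.

```rust
// Per-iteration sparsity targets for gradual pruning, linearly ramped up to
// the final target. Illustrative only; the actual interpolation used by
// oxigdal_ml's PruningSchedule::Iterative is an assumption here.
fn iterative_targets(final_sparsity: f64, iterations: usize) -> Vec<f64> {
    (1..=iterations)
        .map(|t| final_sparsity * t as f64 / iterations as f64)
        .collect()
}

fn main() {
    // Matching the example above: 50% sparsity over 3 iterations
    for (i, s) in iterative_targets(0.5, 3).iter().enumerate() {
        println!("iteration {}: prune to {s:.3} sparsity, then fine-tune", i + 1);
        // ...prune with the current target, fine-tune, repeat...
    }
}
```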

Structs§

GradientInfo
Gradient information for gradient-based pruning methods
LotteryTicketState
State for Lottery Ticket Hypothesis rewinding
NoOpFineTune
No-op fine-tuning callback (skips fine-tuning)
PruningConfig
Pruning configuration
PruningConfigBuilder
Builder for pruning configuration
PruningMask
Pruning mask indicating which weights are kept or pruned
PruningStats
Pruning statistics
UnstructuredPruner
Unstructured pruner for element-wise weight removal
WeightStatistics
Statistics about weight distribution
WeightTensor
Weight tensor for pruning operations

Enums§

ImportanceMethod
Importance score computation method for unstructured pruning
MaskCreationMode
Mask creation mode for pruning
PruningGranularity
Pruning granularity
PruningSchedule
Pruning schedule
PruningStrategy
Pruning strategy

Traits§

FineTuneCallback
Fine-tuning callback for iterative pruning

Functions§

compute_channel_importance
Computes channel importance for structured pruning
compute_gradient_importance
Computes importance scores using gradient information
compute_magnitude_importance
Computes importance scores for weights using magnitude
compute_taylor_importance
Computes importance scores using first-order Taylor expansion
iterative_pruning
Applies iterative pruning with gradual sparsity increase
prune_model
Prunes a model according to the configuration
prune_weights_direct
Prunes weight tensors directly (in-memory operation)
prune_weights_with_gradients
Prunes weight tensors with gradient information
select_weights_to_prune
Selects weights to prune based on importance scores
structured_pruning
Performs structured pruning (removes entire filters/channels)
unstructured_pruning
Performs unstructured pruning (removes individual weights)
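Among the importance functions listed above, the Taylor variant scores weights by their estimated effect on the loss rather than by magnitude alone. A common first-order form is `score_i = |w_i * g_i|`, where `g_i` is the loss gradient for `w_i`; whether `compute_taylor_importance` uses exactly this form is an assumption, so the following is a crate-independent sketch of the idea.

```rust
// First-order Taylor importance: |w_i * g_i| approximates the change in
// loss from zeroing w_i. Conceptual sketch only; not necessarily the
// formula used by oxigdal_ml's compute_taylor_importance.
fn taylor_importance(weights: &[f32], grads: &[f32]) -> Vec<f32> {
    weights.iter().zip(grads).map(|(w, g)| (w * g).abs()).collect()
}

fn main() {
    let weights = [0.1, -0.5, 0.3, -0.8];
    let grads = [2.0, 0.1, -0.5, 0.0];
    // Note index 3: a large weight with zero gradient scores 0, so Taylor
    // importance would prune it even though L1 magnitude would keep it.
    println!("{:?}", taylor_importance(&weights, &grads));
}
```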