Trait ModuleSnapshot 

pub trait ModuleSnapshot<B: Backend>: Module<B> {
    // Provided methods
    fn collect(
        &self,
        filter: Option<PathFilter>,
        adapter: Option<Box<dyn ModuleAdapter>>,
        skip_enum_variants: bool,
    ) -> Vec<TensorSnapshot> { ... }
    fn apply(
        &mut self,
        snapshots: Vec<TensorSnapshot>,
        filter: Option<PathFilter>,
        adapter: Option<Box<dyn ModuleAdapter>>,
        skip_enum_variants: bool,
    ) -> ApplyResult
       where Self: Sized { ... }
    fn save_into<P>(&self, store: &mut P) -> Result<(), P::Error>
       where P: ModuleStore { ... }
    fn load_from<P>(&mut self, store: &mut P) -> Result<ApplyResult, P::Error>
       where P: ModuleStore { ... }
}

Extension trait for modules that provides tensor storage functionality.

This trait provides convenient methods to collect and apply tensor snapshots for any Burn module. Collection operations create lightweight tensor snapshots without immediately copying data. Apply operations write tensor data from snapshots into the corresponding tensors of the module.
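§Examples

A minimal sketch of the collect/apply round trip, assuming `source` and a mutable `target` are two instances of the same module type (only the trait methods documented on this page are taken as given; everything else is illustrative):

use burn_store::ModuleSnapshot;

// Snapshot every tensor in `source`; this is lazy and copies no data yet.
let snapshots = source.collect(None, None, false);

// Write the snapshot data into the matching tensors of `target`.
let result = target.apply(snapshots, None, None, false);
// `result` reports which tensors were applied, skipped, missing, or unused.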

Provided Methods§


fn collect( &self, filter: Option<PathFilter>, adapter: Option<Box<dyn ModuleAdapter>>, skip_enum_variants: bool, ) -> Vec<TensorSnapshot>

Collects tensor snapshots for inspection without copying data.

Returns a vector of TensorSnapshot objects that can lazily materialize the tensor data. Each TensorSnapshot contains the full path accessible via snapshot.full_path().

§Arguments
  • filter - An optional PathFilter to determine which tensors to collect. When None, all tensors are collected.
  • adapter - Optional adapter to transform tensors based on container types. Applied to all collected tensors before returning.
  • skip_enum_variants - Skip enum variant names when building paths. When true, paths will not include enum variant names (e.g., “feature.weight” instead of “feature.BaseConv.weight”). Useful when exporting to formats like PyTorch/SafeTensors that don’t use enum variants.
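§Examples

A minimal sketch, assuming `model` is any Burn module; `full_path()` is the accessor described above:

use burn_store::ModuleSnapshot;

// Collect every tensor snapshot; no tensor data is copied at this point.
let snapshots = model.collect(None, None, false);

// Inspect the dotted path of each tensor in the module tree.
for snapshot in &snapshots {
    println!("{}", snapshot.full_path());
}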

fn apply( &mut self, snapshots: Vec<TensorSnapshot>, filter: Option<PathFilter>, adapter: Option<Box<dyn ModuleAdapter>>, skip_enum_variants: bool, ) -> ApplyResult
where Self: Sized,

Applies tensor snapshots to the module.

This is the primary apply method: it writes tensor data from TensorSnapshots into the corresponding tensors of the module. The snapshots are typically obtained from collect() or loaded from storage.

§Arguments
  • snapshots - A vector of TensorSnapshot objects
  • filter - An optional PathFilter to determine which tensors to apply. When None, all available tensors are applied.
  • adapter - Optional adapter to transform tensors based on container types
  • skip_enum_variants - Skip enum variant names when matching tensor paths
§Returns

An ApplyResult containing information about applied, skipped, missing, and unused tensors, as well as any errors encountered.

§Examples
use burn_store::{ModuleSnapshot, PathFilter};

// `model` is any Burn module; `snapshots` come from `collect()` or from storage.
// Apply all tensors
let result = model.apply(snapshots, None, None, false);

// Apply only encoder tensors
let filter = PathFilter::new().with_regex(r"^encoder\..*");
let result = model.apply(snapshots, Some(filter), None, false);

// Apply with complex filter
let filter = PathFilter::new()
    .with_regex(r"^encoder\..*")
    .with_regex(r"^decoder\..*")
    .with_full_path("head.weight");
let result = model.apply(snapshots, Some(filter), None, false);

// Apply with enum variant skipping (for PyTorch models)
let result = model.apply(snapshots, None, None, true);

fn save_into<P>(&self, store: &mut P) -> Result<(), P::Error>
where P: ModuleStore,

Saves tensor snapshots into a ModuleStore.

This method allows using a ModuleStore implementation to handle the collection and writing logic in a configurable way.

§Arguments
  • store - A mutable reference to a ModuleStore that will collect and save the tensors
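§Examples

A minimal sketch, assuming a file-backed SafetensorsStore with a from_file constructor (the store type and constructor name are assumptions; any ModuleStore implementation is used the same way, and the `?` requires a caller that can propagate the store's error type):

use burn_store::{ModuleSnapshot, SafetensorsStore};

// Assumed store type and constructor; substitute your ModuleStore of choice.
let mut store = SafetensorsStore::from_file("model.safetensors");

// Collects the module's tensors and writes them through the store.
model.save_into(&mut store)?;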

fn load_from<P>(&mut self, store: &mut P) -> Result<ApplyResult, P::Error>
where P: ModuleStore,

Loads tensor data from a ModuleStore.

This method allows using a ModuleStore implementation to handle the loading and application logic in a configurable way.

§Arguments
  • store - A mutable reference to a ModuleStore that will load and apply tensors
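§Examples

A minimal sketch, mirroring save_into above (SafetensorsStore and from_file are assumptions; any ModuleStore works the same way):

use burn_store::{ModuleSnapshot, SafetensorsStore};

let mut store = SafetensorsStore::from_file("model.safetensors");

// Reads tensors through the store and applies them to `model`.
// The returned ApplyResult reports applied, skipped, missing, and unused tensors.
let result = model.load_from(&mut store)?;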

Dyn Compatibility§

This trait is not dyn compatible.

In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.

Implementors§


impl<B: Backend, M: Module<B>> ModuleSnapshot<B> for M