Struct DecodingState

pub struct DecodingState<'a, T: Topology, const STRIDE_Y: usize> {
    pub graph: &'a StaticGraph,
    pub width: usize,
    pub height: usize,
    pub stride_y: usize,
    pub row_start_mask: u64,
    pub row_end_mask: u64,
    pub blocks_state: &'a mut [BlockStateHot],
    pub parents: &'a mut [u32],
    pub defect_mask: &'a mut [u64],
    pub path_mark: &'a mut [u64],
    pub block_dirty_mask: &'a mut [u64],
    pub active_mask: &'a mut [u64],
    pub queued_mask: &'a mut [u64],
    pub active_block_mask: u64,
    pub ingestion_list: &'a mut [u32],
    pub ingestion_count: usize,
    pub edge_bitmap: &'a mut [u64],
    pub edge_dirty_list: &'a mut [u32],
    pub edge_dirty_count: usize,
    pub edge_dirty_mask: &'a mut [u64],
    pub boundary_bitmap: &'a mut [u64],
    pub boundary_dirty_list: &'a mut [u32],
    pub boundary_dirty_count: usize,
    pub boundary_dirty_mask: &'a mut [u64],
    pub bfs_pred: &'a mut [u16],
    pub bfs_queue: &'a mut [u16],
    pub needs_scalar_fallback: bool,
    pub scalar_fallback_mask: u64,
    pub boundary_config: BoundaryConfig,
    pub parent_offset: usize,
    pub _marker: PhantomData<T>,
}

Main decoder state structure for Union Find QEC decoding.

This structure contains all the state needed for a single decoding instance. It is parameterized by:

  • 'a - Lifetime of the backing arena memory
  • T: Topology - The lattice topology (e.g., SquareGrid)
  • STRIDE_Y - Compile-time Y stride for performance optimization

§Memory Layout

All slices are allocated from an Arena and are contiguous. The decoder uses Morton (Z-order) encoding to organize nodes into 64-node blocks for cache efficiency.
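As an illustration of the Z-order idea, a 64-node block can be addressed by interleaving the low 3 bits of the local x and y coordinates. The `morton_8x8` helper below is a sketch of the scheme, not the crate's actual encoding:

```rust
/// Interleave the low 3 bits of local x and y into a Z-order index
/// within one 8x8 (64-node) block. Sketch of the scheme only; the
/// crate's actual encoding may differ in bit order.
fn morton_8x8(x: u32, y: u32) -> u32 {
    let mut idx = 0;
    for bit in 0..3 {
        idx |= ((x >> bit) & 1) << (2 * bit);     // x bits -> even positions
        idx |= ((y >> bit) & 1) << (2 * bit + 1); // y bits -> odd positions
    }
    idx
}

fn main() {
    // Nearby coordinates map to nearby indices, which is what makes
    // 64-node blocks cache-friendly.
    assert_eq!(morton_8x8(0, 0), 0);
    assert_eq!(morton_8x8(1, 1), 3);
    assert_eq!(morton_8x8(7, 7), 63);
}
```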

§Usage

use prav_core::{Arena, DecodingState, SquareGrid, EdgeCorrection};

let mut buffer = [0u8; 1024 * 1024];
let mut arena = Arena::new(&mut buffer);

// Create decoder with STRIDE_Y matching the grid (32x32 -> stride 32)
let mut state: DecodingState<SquareGrid, 32> = DecodingState::new(&mut arena, 32, 32, 1);

// Load syndromes and decode (one u64 per 64-node block: 1024 nodes -> 16 words)
let syndromes = [0u64; 16];
state.load_dense_syndromes(&syndromes);
state.grow_clusters();

let mut corrections = [EdgeCorrection::default(); 1024];
let count = state.peel_forest(&mut corrections);

§Thread Safety

DecodingState is not thread-safe. For parallel decoding, create one instance per thread with separate arenas.

Fields§

§graph: &'a StaticGraph

Reference to static graph metadata (dimensions, strides).

§width: usize

Grid width in nodes.

§height: usize

Grid height in nodes.

§stride_y: usize

Y stride for coordinate calculations (power of 2 for fast division).

§row_start_mask: u64

Bitmask identifying nodes at the start of their row within a block.

§row_end_mask: u64

Bitmask identifying nodes at the end of their row within a block.

§blocks_state: &'a mut [BlockStateHot]

Block state for each 64-node block (boundary, occupied, masks, cached root).

§parents: &'a mut [u32]

Union Find parent pointers. parents[i] == i means node i is a root.
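A minimal sketch of the root convention (the crate's find may additionally apply path compression or use the cached roots in BlockStateHot):

```rust
/// Follow parent pointers until parents[i] == i. Sketch of the
/// convention only; the crate's `find` may compress paths.
fn find_root(parents: &[u32], mut i: u32) -> u32 {
    while parents[i as usize] != i {
        i = parents[i as usize];
    }
    i
}

fn main() {
    // parents[2] == 2, so 2 is a root; 3 -> 1 -> 2 chains up to it.
    let parents = [0u32, 2, 2, 1, 4];
    assert_eq!(find_root(&parents, 3), 2);
    assert_eq!(find_root(&parents, 4), 4); // singleton cluster
}
```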

§defect_mask: &'a mut [u64]

Bitmask of defect (syndrome) nodes per block.

§path_mark: &'a mut [u64]

Path marking bitmask used during peeling.

§block_dirty_mask: &'a mut [u64]

Bitmask tracking which blocks have been modified (for sparse reset).

§active_mask: &'a mut [u64]

Bitmask of currently active blocks for this growth iteration.

§queued_mask: &'a mut [u64]

Bitmask of blocks queued for the next growth iteration.

§active_block_mask: u64

Fast-path active mask for small grids (<=64 blocks).

§ingestion_list: &'a mut [u32]

List of block indices with syndromes to process.

§ingestion_count: usize

Number of valid entries in ingestion_list.

§edge_bitmap: &'a mut [u64]

Bitmask of edges to include in corrections (XOR-accumulated).

§edge_dirty_list: &'a mut [u32]

List of edge bitmap word indices that have been modified.

§edge_dirty_count: usize

Number of valid entries in edge_dirty_list.

§edge_dirty_mask: &'a mut [u64]

Bitmask tracking which edge_bitmap words are dirty.

§boundary_bitmap: &'a mut [u64]

Bitmask of boundary corrections per block.

§boundary_dirty_list: &'a mut [u32]

List of block indices with boundary corrections.

§boundary_dirty_count: usize

Number of valid entries in boundary_dirty_list.

§boundary_dirty_mask: &'a mut [u64]

Bitmask tracking which boundary_bitmap entries are dirty.

§bfs_pred: &'a mut [u16]

BFS predecessor array for path tracing.

§bfs_queue: &'a mut [u16]

BFS queue for path tracing.

§needs_scalar_fallback: bool

Flag indicating scalar fallback is needed for some blocks.

§scalar_fallback_mask: u64

Bitmask of blocks requiring scalar processing.

§boundary_config: BoundaryConfig

Configuration for boundary matching behavior.

§parent_offset: usize

Offset for parent array (used in some optimizations).

§_marker: PhantomData<T>

Phantom marker for the topology type parameter.

Implementations§

impl<'a, T: Topology, const STRIDE_Y: usize> DecodingState<'a, T, STRIDE_Y>

pub unsafe fn process_block_small_stride<const SILENT: bool>(&mut self, blk_idx: usize) -> bool

§Safety

Caller must ensure:

  • blk_idx is within bounds of self.blocks_state
  • All internal state arrays are properly initialized
impl<'a, T: Topology, const STRIDE_Y: usize> DecodingState<'a, T, STRIDE_Y>

pub unsafe fn process_block_small_stride_32<const SILENT: bool>(blk_idx: usize, parents: &mut [u32], blocks_state: &mut [BlockStateHot], defect_mask: &mut [u64], block_dirty_mask: &mut [u64], queued_mask: &mut [u64], is_small_grid: bool, block_offset: usize) -> bool

§Safety

Caller must ensure:

  • blk_idx is within bounds of blocks_state
  • parents, blocks_state, defect_mask, block_dirty_mask, queued_mask are valid
  • block_offset + blk_idx corresponds to valid parent array indices
impl<'a, T: Topology, const STRIDE_Y: usize> DecodingState<'a, T, STRIDE_Y>

pub unsafe fn process_all_blocks_stride_32_unrolled_16(&mut self) -> (bool, u64)

Unrolled processing for exactly 16 blocks (1024 nodes) with STRIDE_Y=32. Returns (any_expanded, next_active_mask).

§Safety

Caller must ensure the decoder state is properly initialized with exactly 16 blocks.

pub unsafe fn process_all_blocks_stride_64(&mut self) -> (bool, u64)

Optimized processing for stride-64 grids (64x64). Returns (any_expanded, next_active_mask).

§Safety

Caller must ensure the decoder state is properly initialized for stride-64 processing.

pub unsafe fn process_block_optimized_32<const SILENT: bool>(&mut self, _blk_idx: usize) -> bool

§Safety

This function is deprecated and always returns false.

impl<'a, T: Topology, const STRIDE_Y: usize> DecodingState<'a, T, STRIDE_Y>

pub fn new(arena: &mut Arena<'a>, width: usize, height: usize, depth: usize) -> Self

Creates a new decoder state for the given grid dimensions.

Allocates all necessary data structures from the provided arena. The decoder is initialized and ready for use after construction.

§Arguments
  • arena - Arena allocator to use for all internal allocations.
  • width - Grid width in nodes.
  • height - Grid height in nodes.
  • depth - Grid depth (1 for 2D codes, >1 for 3D codes).
§Panics

Panics if STRIDE_Y doesn’t match the calculated stride for the given dimensions. The stride is max(width, height, depth).next_power_of_two().
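A caller can check the dimensions against this rule before construction. The `expected_stride` helper below is illustrative, not a crate API:

```rust
/// Stride the constructor will expect for the given dimensions,
/// following the documented rule. Illustrative helper, not a crate API.
fn expected_stride(width: usize, height: usize, depth: usize) -> usize {
    width.max(height).max(depth).next_power_of_two()
}

fn main() {
    // A 32x32x1 grid matches DecodingState<_, 32>.
    assert_eq!(expected_stride(32, 32, 1), 32);
    // Widening to 33 would require STRIDE_Y = 64 instead.
    assert_eq!(expected_stride(33, 32, 1), 64);
}
```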

§Memory Requirements

The arena must have sufficient space for:

  • num_nodes * 4 bytes for parent array
  • num_blocks * 64 bytes for block states
  • Additional space for masks, bitmaps, and queues

A safe estimate is num_nodes * 20 bytes. Use required_buffer_size for exact calculation.
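Under the 20-bytes-per-node estimate above, sizing a buffer for the 32x32 example can be sketched as follows (sketch only; the crate's required_buffer_size gives the exact figure):

```rust
/// Conservative arena size from the documented ~20 bytes/node estimate.
/// Sketch only; required_buffer_size gives the exact value.
fn estimated_buffer_bytes(width: usize, height: usize, depth: usize) -> usize {
    width * height * depth * 20
}

fn main() {
    let bytes = estimated_buffer_bytes(32, 32, 1);
    assert_eq!(bytes, 20_480); // 1024 nodes * 20 bytes
    let _buffer = vec![0u8; bytes]; // would back Arena::new(&mut _buffer)
}
```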

pub fn initialize_internal(&mut self)

Resets all internal state for a new decoding cycle.

This is a full reset that clears all dynamic state while preserving the grid topology (valid_mask). Call this before loading new syndromes when you want to completely reset the decoder.

§Performance Note

For repeated decoding with sparse syndromes, prefer sparse_reset which only resets modified blocks (O(modified) vs O(n)).

pub fn load_erasures(&mut self, erasures: &[u64])

Loads erasure information indicating which qubits were lost.

Erasures represent qubits that could not be measured (e.g., photon loss). The decoder excludes erased qubits from cluster growth.

§Arguments
  • erasures - Dense bitarray where bit i in erasures[blk] indicates node (blk * 64 + i) is erased.
§Effect

Updates effective_mask = valid_mask & !erasure_mask for each block, which controls which nodes participate in cluster growth.
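Per block this is a single bit operation; a sketch (names follow the description above, not the crate's internals):

```rust
/// effective_mask = valid_mask & !erasure_mask for one 64-node block.
fn effective_mask(valid_mask: u64, erasure_mask: u64) -> u64 {
    valid_mask & !erasure_mask
}

fn main() {
    let valid: u64 = !0;                    // all 64 nodes valid
    let erased: u64 = (1 << 3) | (1 << 10); // nodes 3 and 10 lost
    let eff = effective_mask(valid, erased);
    assert_eq!(eff & (1 << 3), 0); // erased node excluded from growth
    assert_ne!(eff & (1 << 4), 0); // other nodes still participate
}
```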

pub fn mark_block_dirty(&mut self, blk_idx: usize)

Marks a block as modified for sparse reset tracking.

Called internally when a block’s state changes. Marked blocks will be reset during the next sparse_reset call.

pub fn is_small_grid(&self) -> bool

Checks if this grid qualifies for small-grid optimizations.

Small grids (<=64 data blocks, i.e., <=4096 nodes) use a single u64 bitmask for active block tracking, enabling faster iteration.

§Returns

true if the grid has at most 65 blocks (64 data + 1 boundary sentinel).

pub fn push_next(&mut self, blk_idx: usize)

Queues a block for processing in the next growth iteration.

Sets the corresponding bit in queued_mask. The block will be processed when active_mask and queued_mask are swapped at the end of the current iteration.
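The queue-then-swap pattern can be sketched with single u64 masks, as in the small-grid fast path (illustrative only; larger grids use the multi-word active_mask/queued_mask slices):

```rust
/// Queue a block for the next growth iteration by setting its bit.
fn push_next(queued_mask: &mut u64, blk_idx: usize) {
    *queued_mask |= 1 << blk_idx;
}

fn main() {
    let mut active_mask: u64 = 0;
    let mut queued_mask: u64 = 0;

    push_next(&mut queued_mask, 5);

    // End of iteration: queued blocks become the next active set.
    std::mem::swap(&mut active_mask, &mut queued_mask);

    assert_ne!(active_mask & (1 << 5), 0);
    assert_eq!(queued_mask, 0);
}
```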

pub fn sparse_reset(&mut self)

Resets only the blocks that were modified, enabling efficient reuse.

At typical error rates (p=0.001-0.06), only a small fraction of blocks are modified during decoding. This method resets only those blocks, achieving O(modified) complexity instead of O(total).

§When to Use

Call this between decoding cycles when:

  • Error rate is low (most blocks unmodified)
  • You want to minimize reset overhead

For high error rates or when topology changes, use initialize_internal instead.

§What Gets Reset

For each dirty block:

  • boundary, occupied, root, root_rank in BlockStateHot
  • defect_mask entry
  • Parent pointers for all 64 nodes in the block

pub fn reset_for_next_cycle(&mut self)

Resets state for the next decoding cycle (sparse reset).

This efficiently resets only the blocks that were modified during the previous decoding cycle. At typical error rates (p < 0.1), this is significantly faster than full_reset.

Use this method between consecutive decoding cycles when the grid topology remains unchanged.

§Complexity

O(modified_blocks) where modified_blocks << total_blocks for typical error rates.

§Example
for cycle in 0..1000 {
    decoder.reset_for_next_cycle();
    decoder.load_dense_syndromes(&syndromes[cycle]);
    let count = decoder.decode(&mut corrections);
}

pub fn full_reset(&mut self)

Fully resets all decoder state.

This performs a complete reset of all internal data structures, suitable for when:

  • Starting fresh with a completely new problem
  • The grid topology has changed
  • You want guaranteed clean state

For repeated decoding with similar syndrome patterns, prefer reset_for_next_cycle which is faster.

§Complexity

O(n) where n is the total number of nodes.

impl<'a, T: Topology, const STRIDE_Y: usize> DecodingState<'a, T, STRIDE_Y>

pub fn find(&mut self, i: u32) -> u32

Finds the cluster root for node i. See UnionFind::find.

pub unsafe fn union(&mut self, u: u32, v: u32) -> bool

Merges clusters containing nodes u and v.

See UnionFind::union for details.

§Safety

Caller must ensure u and v are valid node indices within the parents array.

pub fn load_dense_syndromes(&mut self, syndromes: &[u64])

Loads syndrome measurements. See ClusterGrowth::load_dense_syndromes.
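Given the dense layout (bit i of syndromes[blk] marks node blk * 64 + i, as with erasures above), a caller might fill the input like this (`set_syndrome` is an illustrative helper, not a crate API):

```rust
/// Set the syndrome bit for a global node index in a dense bitarray,
/// under the layout: bit i of syndromes[blk] <=> node blk * 64 + i.
/// Illustrative helper, not a crate API.
fn set_syndrome(syndromes: &mut [u64], node: usize) {
    syndromes[node / 64] |= 1 << (node % 64);
}

fn main() {
    let mut syndromes = [0u64; 16]; // 1024-node grid -> 16 blocks
    set_syndrome(&mut syndromes, 70); // block 1, bit 6
    assert_eq!(syndromes[1], 1 << 6);
    assert_eq!(syndromes[0], 0);
}
```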

pub fn grow_clusters(&mut self)

Expands clusters until convergence. See ClusterGrowth::grow_clusters.

pub unsafe fn process_block(&mut self, blk_idx: usize) -> bool

Processes a single block during growth.

See ClusterGrowth::process_block for details.

§Safety

Caller must ensure blk_idx is within bounds of the internal blocks state.

pub fn decode(&mut self, corrections: &mut [EdgeCorrection]) -> usize

Full decode: growth + peeling. See Peeling::decode.

pub fn peel_forest(&mut self, corrections: &mut [EdgeCorrection]) -> usize

Extracts corrections from grown clusters. See Peeling::peel_forest.

Trait Implementations§

impl<'a, T: Topology, const STRIDE_Y: usize> ClusterGrowth for DecodingState<'a, T, STRIDE_Y>

fn load_dense_syndromes(&mut self, syndromes: &[u64])

Loads syndrome measurements from a dense bitarray.

fn grow_clusters(&mut self)

Expands cluster boundaries until convergence.

fn grow_iteration(&mut self) -> bool

Performs a single iteration of cluster growth.

unsafe fn process_block(&mut self, blk_idx: usize) -> bool

Processes a single block during cluster growth.

impl<'a, T: Topology, const STRIDE_Y: usize> Peeling for DecodingState<'a, T, STRIDE_Y>

fn decode(&mut self, corrections: &mut [EdgeCorrection]) -> usize

Performs full decoding: cluster growth followed by peeling.

fn peel_forest(&mut self, corrections: &mut [EdgeCorrection]) -> usize

Extracts corrections from the grown cluster forest.

fn reconstruct_corrections(&mut self, corrections: &mut [EdgeCorrection]) -> usize

Converts marked edges to EdgeCorrection structures.

fn trace_path(&mut self, u: u32, _boundary_node: u32)

Traces a path from node u toward the boundary through parent pointers.

fn trace_bfs(&mut self, u: u32, v: u32, mask: u64)

Traces a path using BFS within a single block.

fn trace_bitmask_bfs(&mut self, start_node: u32)

Traces paths using bitmask-based BFS for small grids.

fn trace_manhattan(&mut self, u: u32, v: u32)

Traces a Manhattan (axis-aligned) path between two nodes.

fn emit_linear(&mut self, u: u32, v: u32)

Emits a single edge correction between two nodes.

fn get_coord(&self, u: u32) -> (usize, usize, usize)

Converts a linear node index to (x, y, z) coordinates.

impl<'a, T: Topology, const STRIDE_Y: usize> UnionFind for DecodingState<'a, T, STRIDE_Y>

fn find(&mut self, i: u32) -> u32

Finds the root (cluster representative) of the node i.

unsafe fn union_roots(&mut self, root_u: u32, root_v: u32) -> bool

Merges two clusters given their root nodes.

unsafe fn union(&mut self, u: u32, v: u32) -> bool

Merges the clusters containing nodes u and v.
Auto Trait Implementations§

impl<'a, T, const STRIDE_Y: usize> Freeze for DecodingState<'a, T, STRIDE_Y>

impl<'a, T, const STRIDE_Y: usize> RefUnwindSafe for DecodingState<'a, T, STRIDE_Y>
where T: RefUnwindSafe,

impl<'a, T, const STRIDE_Y: usize> Send for DecodingState<'a, T, STRIDE_Y>
where T: Send,

impl<'a, T, const STRIDE_Y: usize> Sync for DecodingState<'a, T, STRIDE_Y>
where T: Sync,

impl<'a, T, const STRIDE_Y: usize> Unpin for DecodingState<'a, T, STRIDE_Y>
where T: Unpin,

impl<'a, T, const STRIDE_Y: usize> !UnwindSafe for DecodingState<'a, T, STRIDE_Y>

Blanket Implementations§

impl<T> Any for T
where T: 'static + ?Sized,

fn type_id(&self) -> TypeId

Gets the TypeId of self.

impl<T> Borrow<T> for T
where T: ?Sized,

fn borrow(&self) -> &T

Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T
where T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.

impl<T> From<T> for T

fn from(t: T) -> T

Returns the argument unchanged.

impl<T, U> Into<U> for T
where U: From<T>,

fn into(self) -> U

Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.

impl<T, U> TryFrom<U> for T
where U: Into<T>,

type Error = Infallible

The type returned in the event of a conversion error.

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.