Module tensor_networks

Tensor Networks

Efficient representations of high-dimensional tensors using network decompositions.

§Background

High-dimensional tensors suffer from the “curse of dimensionality”: a tensor of order d with each mode of size n has O(n^d) elements. Tensor networks provide compressed representations with controllable approximation error.
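
For a rough sense of the savings, here is a quick parameter count in plain Rust (illustrative only, not tied to this crate’s API): a dense order-d tensor with every mode of size n stores n^d values, while a TT representation with all ranks bounded by r stores on the order of d·n·r² values.

```rust
fn main() {
    // Order-10 tensor, mode size 10, hypothetical TT rank 8.
    let (d, n, r): (u32, u64, u64) = (10, 10, 8);
    let dense = n.pow(d);              // n^d = 10^10 entries for dense storage
    let tt = u64::from(d) * n * r * r; // ~ d * n * r^2 entries in TT format
    println!("dense: {dense} entries, TT (rank {r}): ~{tt} entries");
    // dense: 10000000000 entries, TT (rank 8): ~6400 entries
}
```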

§Decompositions

  • Tensor Train (TT): A[i1,…,id] = G1[i1] × G2[i2] × … × Gd[id], where each Gk[ik] is a matrix slice of the k-th core (see the sketch after this list)
  • Tucker: A small core tensor contracted along each mode with a factor matrix
  • CP (CANDECOMP/PARAFAC): Sum of rank-1 tensors
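
To make the TT product above concrete, below is a minimal, self-contained sketch of evaluating one entry from a list of cores. It uses plain Vec-based matrices and hypothetical names; it is not this crate’s TensorTrain API.

```rust
// Each core is indexed by its mode index; each entry is an
// r_left x r_right matrix slice stored as rows of f64.
type Matrix = Vec<Vec<f64>>;
type Core = Vec<Matrix>;

/// Evaluate A[i1, ..., id] = G1[i1] * G2[i2] * ... * Gd[id],
/// assuming the first core has r_left = 1 and the last has r_right = 1.
fn tt_entry(cores: &[Core], index: &[usize]) -> f64 {
    // Start with the 1 x r1 row from the first core's slice.
    let mut row: Vec<f64> = cores[0][index[0]][0].clone();
    for (core, &i) in cores.iter().zip(index).skip(1) {
        let m = &core[i]; // r_left x r_right slice for this mode index
        let r_right = m[0].len();
        let mut next = vec![0.0; r_right];
        // Row-vector times matrix: next = row * m.
        for (a, m_row) in row.iter().zip(m) {
            for (n, v) in next.iter_mut().zip(m_row) {
                *n += a * v;
            }
        }
        row = next;
    }
    // After the last core the row has length 1 (trailing rank is 1).
    row[0]
}

fn main() {
    // Rank-1 example: A[i, j] = x[i] * y[j] with x = [1, 2], y = [3, 4, 5].
    let g1: Core = vec![vec![vec![1.0]], vec![vec![2.0]]];                  // shape (1, 2, 1)
    let g2: Core = vec![vec![vec![3.0]], vec![vec![4.0]], vec![vec![5.0]]]; // shape (1, 3, 1)
    let a_12 = tt_entry(&[g1, g2], &[1, 2]);
    assert_eq!(a_12, 2.0 * 5.0);
    println!("A[1,2] = {a_12}");
}
```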

§Applications

  • Quantum-inspired algorithms
  • High-dimensional integration
  • Attention mechanism compression
  • Scientific computing

Structs§

CPConfig
CP decomposition configuration
CPDecomposition
CP decomposition result
DenseTensor
Dense tensor for input/output
NetworkContraction
Optimal contraction order finder
TTCore
A single TT-core: 3D tensor of shape (rank_left, mode_size, rank_right); see the layout sketch after this list
TensorNetwork
Tensor network for contraction operations
TensorNode
A node in a tensor network
TensorTrain
Tensor Train representation
TensorTrainConfig
Tensor Train configuration
TuckerConfig
Tucker decomposition configuration
TuckerDecomposition
Tucker decomposition of a tensor
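
As an illustration of the (rank_left, mode_size, rank_right) shape described for TTCore above, the sketch below shows one plausible row-major layout and how the matrix slice for a fixed mode index would be read out of it. The struct name, fields, and method here are assumptions for illustration; the actual TTCore in this crate may store and expose its data differently.

```rust
/// Illustrative core layout only, not this crate's TTCore definition.
struct CoreSketch {
    rank_left: usize,
    mode_size: usize,
    rank_right: usize,
    // Row-major buffer: data[(a * mode_size + i) * rank_right + b]
    data: Vec<f64>,
}

impl CoreSketch {
    /// The rank_left x rank_right matrix slice G[i] for a fixed mode index i.
    fn slice(&self, i: usize) -> Vec<Vec<f64>> {
        (0..self.rank_left)
            .map(|a| {
                (0..self.rank_right)
                    .map(|b| self.data[(a * self.mode_size + i) * self.rank_right + b])
                    .collect()
            })
            .collect()
    }
}

fn main() {
    // A 2 x 3 x 2 core filled with 0..12, flattened row-major.
    let core = CoreSketch {
        rank_left: 2,
        mode_size: 3,
        rank_right: 2,
        data: (0..12).map(f64::from).collect(),
    };
    // Matrix slice for mode index i = 1 picks out rows [[2, 3], [8, 9]].
    assert_eq!(core.slice(1), vec![vec![2.0, 3.0], vec![8.0, 9.0]]);
}
```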