Tensor Networks
Efficient representations of high-dimensional tensors using network decompositions.
§Background
High-dimensional tensors suffer from the “curse of dimensionality”: a tensor of order d with mode size n has O(n^d) elements. Tensor networks provide compressed representations with controllable approximation error.
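As a back-of-the-envelope illustration (standalone sketch, not part of this crate's API), compare dense storage with a rank-r Tensor Train, whose cores of shape (rank_left, mode_size, rank_right) need roughly d·n·r² parameters instead of n^d:

```rust
// Illustrative storage-count helpers; the function names are hypothetical.
fn dense_elems(n: u64, d: u32) -> u64 {
    n.pow(d) // dense tensor of order d with mode size n: n^d elements
}

fn tt_elems(n: u64, d: u64, r: u64) -> u64 {
    // TT-cores have shape (r_left, n, r_right); the boundary ranks are 1,
    // so storage is n*r + (d-2)*n*r*r + n*r for d >= 2.
    if d < 2 {
        return n;
    }
    2 * n * r + (d - 2) * n * r * r
}

fn main() {
    // Order-10 tensor with mode size 10 and a uniform TT-rank of 5.
    let dense = dense_elems(10, 10);
    let tt = tt_elems(10, 10, 5);
    println!("dense: {dense}, TT: {tt}"); // dense: 10000000000, TT: 2100
}
```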
§Decompositions
- Tensor Train (TT): A[i1,…,id] = G1[i1] × G2[i2] × … × Gd[id], where each Gk[ik] is a matrix slice of the k-th core
- Tucker: Core tensor with factor matrices
- CP (CANDECOMP/PARAFAC): Sum of rank-1 tensors
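The TT format in the first item recovers a single entry A[i1,…,id] by multiplying one matrix slice per core; since the boundary ranks are 1, the product collapses to a scalar. A minimal sketch using plain `Vec`s (independent of this crate's types; `Core` here is a hypothetical stand-in for `TTCore`):

```rust
/// A TT-core stored as a flat buffer with shape (rank_left, mode_size, rank_right).
/// A standalone sketch, not this crate's `TTCore`.
struct Core {
    r_left: usize,
    n: usize,
    r_right: usize,
    data: Vec<f64>, // row-major: data[(a * n + i) * r_right + b]
}

impl Core {
    /// The r_left x r_right matrix slice G[i] for a fixed mode index i.
    fn slice(&self, i: usize) -> Vec<Vec<f64>> {
        (0..self.r_left)
            .map(|a| {
                (0..self.r_right)
                    .map(|b| self.data[(a * self.n + i) * self.r_right + b])
                    .collect()
            })
            .collect()
    }
}

/// Evaluate A[i1,…,id] = G1[i1] × G2[i2] × … × Gd[id] by carrying a
/// running row vector through the chain of matrix slices.
fn tt_entry(cores: &[Core], idx: &[usize]) -> f64 {
    // The first slice is 1 x r_right, i.e. a row vector.
    let mut row = cores[0].slice(idx[0])[0].clone();
    for (core, &i) in cores.iter().zip(idx.iter()).skip(1) {
        let m = core.slice(i);
        let mut next = vec![0.0; core.r_right];
        for (a, &v) in row.iter().enumerate() {
            for b in 0..core.r_right {
                next[b] += v * m[a][b];
            }
        }
        row = next;
    }
    row[0] // final rank is 1, so the row vector is a scalar
}

fn main() {
    // Rank-1 TT for the outer product A[i, j] = x[i] * y[j].
    let x = Core { r_left: 1, n: 2, r_right: 1, data: vec![1.0, 2.0] };
    let y = Core { r_left: 1, n: 2, r_right: 1, data: vec![3.0, 4.0] };
    println!("{}", tt_entry(&[x, y], &[1, 0])); // 2.0 * 3.0 = 6.0
}
```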
§Applications
- Quantum-inspired algorithms
- High-dimensional integration
- Attention mechanism compression
- Scientific computing
§Structs
- CPConfig - CP decomposition configuration
- CPDecomposition - CP decomposition result
- DenseTensor - Dense tensor for input/output
- NetworkContraction - Optimal contraction order finder
- TTCore - A single TT-core: 3D tensor of shape (rank_left, mode_size, rank_right)
- TensorNetwork - Tensor network for contraction operations
- TensorNode - A node in a tensor network
- TensorTrain - Tensor Train representation
- TensorTrainConfig - Tensor Train configuration
- TuckerConfig - Tucker decomposition configuration
- TuckerDecomposition - Tucker decomposition of a tensor