# tensorlogic-quantrs-hooks

Probabilistic Graphical Model Integration for TensorLogic

Bridge between logic-based reasoning and probabilistic inference through factor graphs, belief propagation, variational methods, quantum circuit integration, and tensor network optimization.

## Overview
tensorlogic-quantrs-hooks enables probabilistic reasoning over TensorLogic expressions by converting logical rules into factor graphs and applying state-of-the-art inference algorithms. This crate seamlessly integrates with the QuantRS2 ecosystem for probabilistic programming.
## Key Features

- TLExpr → Factor Graph Conversion: Automatic translation of logical expressions to PGM representations
- Exact Inference:
  - Sum-product and max-product belief propagation for tree-structured graphs
  - Parallel sum-product with rayon for large-scale graphs (near-linear scaling)
  - Junction tree algorithm for exact inference on arbitrary graphs
  - Variable elimination with 5 advanced ordering heuristics (MinDegree, MinFill, WeightedMinFill, MinWidth, MaxCardinalitySearch)
- Approximate Inference:
  - Loopy BP: Message passing for graphs with cycles, with damping and convergence detection
  - Variational Inference: Mean-field, Bethe approximation, and tree-reweighted BP
  - Expectation Propagation (EP): Moment matching with site approximations for discrete and continuous variables
  - MCMC Sampling: Gibbs sampling for approximate posterior computation
- Importance Sampling and Particle Filters:
  - ImportanceSampler with custom proposal distributions
  - Self-normalized importance sampling
  - Effective sample size (ESS) computation
  - LikelihoodWeighting for Bayesian networks
  - ParticleFilter (Sequential Monte Carlo) with systematic resampling
- Dynamic Bayesian Networks:
  - DynamicBayesianNetwork with state/observation variables
  - DBN unrolling to a static FactorGraph
  - Filtering and smoothing
  - Viterbi decoding (MAP sequence)
  - DBNBuilder for fluent construction
  - CoupledDBN for interacting processes
- Influence Diagrams (Decision Networks):
  - InfluenceDiagram with chance/decision/utility nodes
  - Expected utility computation
  - Optimal policy finding (exhaustive search)
  - Value of perfect information (VPI)
  - InfluenceDiagramBuilder for fluent construction
  - MultiAttributeUtility (MAUT) support
- Performance Optimizations:
  - Factor caching system with LRU eviction for memoization (FactorCache)
  - Thread-safe parallel message passing via rayon (ParallelSumProduct, ParallelMaxProduct)
  - Cache statistics tracking (hits, misses, hit rate)
  - Memory optimization (FactorPool, SparseFactor, LazyFactor, CompressedFactor, BlockSparseFactor)
  - Streaming factor graph for large graphs (StreamingFactorGraph)
- QuantRS2 Integration:
  - Distribution and model export to QuantRS format
  - JSON serialization for ecosystem interoperability
  - Information-theoretic utilities (mutual information, KL divergence)
  - MCMC sampling hooks and parameter learning interfaces
  - Quantum annealing (QuantumAnnealing, QuantumInference)
- Parameter Learning:
  - Maximum Likelihood Estimation (MLE) for discrete distributions
  - Bayesian estimation with Dirichlet priors
  - Baum-Welch algorithm (EM) for Hidden Markov Models
  - Forward-backward algorithm implementation
- Sequence Models:
  - Linear-chain CRFs for sequence labeling with Viterbi decoding
  - Feature functions (transition, emission, custom)
  - Forward-backward algorithm for marginal probabilities
- Quantum Circuit Integration (quantum_circuit module):
  - IsingModel for quadratic unconstrained binary optimization (QUBO)
  - QUBOProblem for constraint satisfaction as QUBO
  - QAOA (Quantum Approximate Optimization Algorithm) circuit builder
  - QAOAConfig and QAOAResult for QAOA parameterization
  - tlexpr_to_qaoa_circuit conversion
- Quantum Simulation (quantum_simulation module):
  - QuantumSimulationBackend for simulated quantum computation
  - SimulatedState for tracking quantum states
  - SimulationConfig for backend configuration
  - run_qaoa function for full QAOA simulation
- Tensor Network Bridge (tensor_network_bridge module):
  - TensorNetwork for tensor contraction networks
  - MatrixProductState (MPS) for efficient 1D chain representations
  - factor_graph_to_tensor_network conversion
  - linear_chain_to_mps conversion
  - TensorNetworkStats for network analysis
- Quality Assurance:
  - Property-based testing with proptest (14 property tests)
  - Comprehensive benchmark suite with criterion (50+ benchmarks across 3 suites)
  - 193+ tests (100% pass rate for non-precision-limited tests)
  - 4 tests ignored with documented precision investigation notes
- Full SciRS2 Integration: All tensor operations use SciRS2 for performance and consistency
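To make the importance-sampling features above concrete (self-normalized weights plus the ESS diagnostic), here is a minimal crate-independent sketch. Every name in it is hypothetical; it does not use the crate's API.

```rust
/// Self-normalized importance sampling of E_p[f(X)] from samples drawn
/// under a proposal q, given unnormalized weights w(x) = p̃(x)/q(x).
/// Returns (estimate, effective sample size).
fn snis(xs: &[f64], w_unnorm: impl Fn(f64) -> f64, f: impl Fn(f64) -> f64) -> (f64, f64) {
    let ws: Vec<f64> = xs.iter().map(|&x| w_unnorm(x)).collect();
    let wsum: f64 = ws.iter().sum();
    let est = xs.iter().zip(&ws).map(|(&x, &w)| w * f(x)).sum::<f64>() / wsum;
    // ESS = (Σw)² / Σw²: equals n when all weights are equal, 1 when one dominates.
    let ess = wsum * wsum / ws.iter().map(|w| w * w).sum::<f64>();
    (est, ess)
}

/// Deterministic xorshift64 uniforms so the demo is reproducible.
fn uniforms(n: usize) -> Vec<f64> {
    let mut s: u64 = 0x9E37_79B9_7F4A_7C15;
    (0..n)
        .map(|_| {
            s ^= s << 13;
            s ^= s >> 7;
            s ^= s << 17;
            (s >> 11) as f64 / (1u64 << 53) as f64
        })
        .collect()
}

fn main() {
    // Target p(x) ∝ 2x on [0, 1] (so E_p[X] = 2/3); proposal q = Uniform(0, 1).
    let xs = uniforms(20_000);
    let (est, ess) = snis(&xs, |x| 2.0 * x, |x| x);
    println!("estimate ≈ {est:.3}, ESS ≈ {ess:.0} of {}", xs.len());
}
```

The ESS ratio here is a standard diagnostic: a poorly matched proposal drives it toward 1, signaling that the estimate rests on a handful of samples.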
## Quick Start

### Basic Factor Graph Creation

```rust
// NOTE: import paths and exact signatures are illustrative; see the crate docs.
use tensorlogic_quantrs_hooks::{Factor, FactorGraph};
use ndarray::Array;

// Create factor graph
let mut graph = FactorGraph::new();

// Add binary variables
graph.add_variable_with_card("x", 2);
graph.add_variable_with_card("y", 2);

// Add factor P(x)
let px_values = Array::from_shape_vec(vec![2], vec![0.6, 0.4])
    .unwrap()
    .into_dyn();
let px = Factor::new("p_x", vec!["x"], px_values).unwrap();
graph.add_factor(px).unwrap();
```
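What a factor graph like this encodes can be checked by hand: the joint distribution is the product of the factor tables, and a marginal sums that product over the remaining variables. A crate-independent sketch for two binary variables (the P(y|x) numbers are made up for illustration):

```rust
// Marginalization by hand for a two-variable factor graph:
// joint(x, y) = P(x) · P(y|x);  P(y) = Σ_x joint(x, y).
fn marginal_y(px: [f64; 2], py_given_x: [[f64; 2]; 2]) -> [f64; 2] {
    let mut py = [0.0f64; 2];
    for x in 0..2 {
        for y in 0..2 {
            py[y] += px[x] * py_given_x[x][y];
        }
    }
    py
}

fn main() {
    // P(x) = [0.6, 0.4] as above; P(y|x) rows are illustrative.
    let py = marginal_y([0.6, 0.4], [[0.7, 0.3], [0.2, 0.8]]);
    println!("P(y) = {py:?}"); // sums to 1 by construction
}
```

Sum-product message passing computes exactly these sums, but factors the work so it stays tractable on larger graphs.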
### Converting TLExpr to Factor Graph

```rust
use tensorlogic_quantrs_hooks::expr_to_factor_graph;

// Build a conjunction of two predicates (constructor names illustrative)
let expr = TLExpr::and(TLExpr::pred("p", vec!["x"]), TLExpr::pred("q", vec!["x"]));
let graph = expr_to_factor_graph(&expr)?;
println!("{:?}", graph);
```
### Belief Propagation

```rust
use tensorlogic_quantrs_hooks::SumProductAlgorithm;

let algorithm = SumProductAlgorithm::new(&graph);
let marginals = algorithm.run()?;
for (variable, marginal) in &marginals {
    println!("P({variable}) = {marginal:?}");
}
```
### Junction Tree (Exact Inference)

```rust
use tensorlogic_quantrs_hooks::JunctionTree;

let jt = JunctionTree::from_factor_graph(&graph)?;
let marginals = jt.compute_marginals()?;
```
### Variable Elimination

```rust
use tensorlogic_quantrs_hooks::VariableElimination;

// Strategy enum name illustrative; see the ordering strategies table below.
let ve = VariableElimination::new(EliminationStrategy::MinFill);
let result = ve.marginal(&graph, "x")?;
```
### Bayesian Networks

```rust
use tensorlogic_quantrs_hooks::BayesianNetwork;

let mut bn = BayesianNetwork::new();
bn.add_node("rain", 2)?;
bn.add_node("sprinkler", 2)?;
bn.add_edge("rain", "sprinkler")?;
// Set CPDs and run inference
let fg = bn.to_factor_graph()?;
```
### Hidden Markov Models

```rust
use tensorlogic_quantrs_hooks::HiddenMarkovModel;

let mut hmm = HiddenMarkovModel::new(3, 2); // 3 states, 2 observations
// Set transition, emission, initial probabilities
// Run filtering, smoothing, Viterbi
let viterbi = hmm.viterbi(&observations)?;
```
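The Viterbi call above is standard max-product dynamic programming. A self-contained sketch, using the classic rainy/sunny toy parameters rather than anything from the crate (log-space is omitted for brevity; fine at this size):

```rust
/// Most probable hidden-state sequence for an HMM with initial
/// distribution `init`, transition matrix `trans[i][j] = P(j|i)`,
/// and emission matrix `emit[i][o] = P(o|i)`. Assumes obs is non-empty.
fn viterbi(init: &[f64], trans: &[Vec<f64>], emit: &[Vec<f64>], obs: &[usize]) -> Vec<usize> {
    let n = init.len();
    let t_len = obs.len();
    let mut delta = vec![vec![0.0f64; n]; t_len]; // best path probability
    let mut back = vec![vec![0usize; n]; t_len]; // backpointers
    for s in 0..n {
        delta[0][s] = init[s] * emit[s][obs[0]];
    }
    for t in 1..t_len {
        for s in 0..n {
            let (best_prev, best_val) = (0..n)
                .map(|p| (p, delta[t - 1][p] * trans[p][s]))
                .max_by(|a, b| a.1.partial_cmp(&b.1).unwrap())
                .unwrap();
            delta[t][s] = best_val * emit[s][obs[t]];
            back[t][s] = best_prev;
        }
    }
    // Trace back from the best final state.
    let mut path = vec![0usize; t_len];
    path[t_len - 1] = (0..n)
        .max_by(|&a, &b| delta[t_len - 1][a].partial_cmp(&delta[t_len - 1][b]).unwrap())
        .unwrap();
    for t in (1..t_len).rev() {
        path[t - 1] = back[t][path[t]];
    }
    path
}

fn main() {
    // States: 0 = rainy, 1 = sunny. Observations: 0 = walk, 1 = shop, 2 = clean.
    let path = viterbi(
        &[0.6, 0.4],
        &[vec![0.7, 0.3], vec![0.4, 0.6]],
        &[vec![0.1, 0.4, 0.5], vec![0.6, 0.3, 0.1]],
        &[0, 1, 2],
    );
    println!("most likely states: {path:?}"); // [1, 0, 0]
}
```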
### Particle Filter

```rust
use tensorlogic_quantrs_hooks::ParticleFilter;

// Proposal constructor illustrative
let proposal = Proposal::uniform(0.0, 1.0);
let mut pf = ParticleFilter::new(1_000, proposal);
for observation in &observations {
    pf.step(observation)?;
}
```
### Dynamic Bayesian Networks

```rust
use tensorlogic_quantrs_hooks::DBNBuilder;

let dbn = DBNBuilder::new()
    .add_state_variable("z", 2)
    .add_observation_variable("y", 2)
    .build()?;

// Unroll to static factor graph for T time steps
let unrolled = dbn.unroll(t_steps)?;

// Viterbi decoding
let best_sequence = dbn.viterbi(&observations)?;
```
### Influence Diagrams

```rust
use tensorlogic_quantrs_hooks::InfluenceDiagramBuilder;

let diagram = InfluenceDiagramBuilder::new()
    .add_node(weather)  // chance node
    .add_node(umbrella) // decision node
    .add_node(payoff)   // utility node
    .build()?;

let (policy, expected_utility) = diagram.find_optimal_policy()?;
let vpi = diagram.value_of_perfect_information("weather")?;
```
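Expected-utility maximization, which the optimal-policy search performs over all decision rules, reduces to a small computation in the one-chance, one-decision case. A crate-independent sketch (all numbers illustrative):

```rust
/// Pick the decision maximizing expected utility:
/// EU(d) = Σ_w P(w) · U(w, d), for 2 weather states and 2 decisions.
fn best_decision(p_weather: [f64; 2], utility: [[f64; 2]; 2]) -> (usize, f64) {
    (0..2)
        .map(|d| {
            let eu: f64 = (0..2).map(|w| p_weather[w] * utility[w][d]).sum();
            (d, eu)
        })
        .max_by(|a, b| a.1.partial_cmp(&b.1).unwrap())
        .unwrap()
}

fn main() {
    // P(sun) = 0.7, P(rain) = 0.3.
    // Utility rows: [no umbrella, umbrella] for sun, then rain.
    let (d, eu) = best_decision([0.7, 0.3], [[100.0, 80.0], [0.0, 70.0]]);
    println!("take umbrella: {}, expected utility {eu}", d == 1);
}
```

The value of perfect information is the same quantity computed twice: the expected utility of deciding after observing the chance node, minus the expected utility of the best uninformed decision.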
### Quantum Circuit (QAOA)

```rust
use tensorlogic_quantrs_hooks::quantum_circuit::{
    tlexpr_to_qaoa_circuit, IsingModel, QAOAConfig,
};

// Convert TLExpr satisfiability to QUBO
let circuit = tlexpr_to_qaoa_circuit(&expr)?;

// Configure QAOA
let config = QAOAConfig::default();
let ising = IsingModel::from_qubo(&qubo);
let builder = QAOACircuitBuilder::new(&ising, &config); // builder type per feature list
```
### Tensor Network Bridge

```rust
use tensorlogic_quantrs_hooks::tensor_network_bridge::{
    factor_graph_to_tensor_network, linear_chain_to_mps, TensorNetworkStats,
};

// Convert factor graph to tensor network for efficient contraction
let tn = factor_graph_to_tensor_network(&graph)?;
let stats = TensorNetworkStats::from_network(&tn);

// Represent a linear chain as a Matrix Product State
let mps = linear_chain_to_mps(&graph)?;
```
### Memory-Efficient Large Graphs

```rust
use tensorlogic_quantrs_hooks::{FactorPool, SparseFactor, StreamingFactorGraph};

// Pool-based memory allocation
let pool = FactorPool::new();

// Sparse factors for near-zero entries
let sparse = SparseFactor::from_factor(&factor)?;

// Streaming for graphs too large for memory
let mut streaming = StreamingFactorGraph::new(config);
streaming.process_batch(&batch)?;
```
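The idea behind a sparse factor is simply to keep only the entries above a threshold as (flat index, value) pairs. A hypothetical stand-in type, not the crate's `SparseFactor`:

```rust
/// Sparse factor table: dense shape plus (flat index, value) pairs,
/// with near-zero entries dropped at construction.
struct SparseTable {
    shape: Vec<usize>,
    entries: Vec<(usize, f64)>,
}

impl SparseTable {
    fn from_dense(shape: Vec<usize>, dense: &[f64], eps: f64) -> Self {
        let entries = dense
            .iter()
            .enumerate()
            .filter(|&(_, &v)| v.abs() > eps)
            .map(|(i, &v)| (i, v))
            .collect();
        SparseTable { shape, entries }
    }

    /// Dropped entries read back as exactly 0.0.
    fn get(&self, flat: usize) -> f64 {
        self.entries
            .iter()
            .find(|(i, _)| *i == flat)
            .map(|&(_, v)| v)
            .unwrap_or(0.0)
    }

    fn nnz(&self) -> usize {
        self.entries.len()
    }
}

fn main() {
    // A 2×2 table where half the entries are (near) zero.
    let s = SparseTable::from_dense(vec![2, 2], &[0.0, 0.9, 0.0, 0.1], 1e-12);
    println!("shape {:?}, {} of 4 entries stored", s.shape, s.nnz());
}
```

Real implementations index the entries (e.g. sorted or hashed) so lookups are sublinear; the linear scan here keeps the sketch short.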
## Supported Inference Algorithms
| Algorithm | Type | Best For |
|---|---|---|
| SumProductAlgorithm | Exact (trees) | Tree-structured graphs |
| MaxProductAlgorithm | MAP (trees) | MAP inference on trees |
| ParallelSumProduct | Exact (parallel) | Large tree graphs |
| VariableElimination | Exact | Small graphs, any structure |
| JunctionTree | Exact | Arbitrary graphs |
| MeanFieldInference | Approximate | Large, dense graphs |
| BetheApproximation | Approximate | Loopy graphs |
| TreeReweightedBP | Approximate | Graphs with cycles |
| ExpectationPropagation | Approximate | Continuous or complex factors |
| GibbsSampler | MCMC | Complex posteriors |
| ImportanceSampler | Monte Carlo | Custom proposals |
| ParticleFilter | Sequential MC | Time series / DBNs |
| LikelihoodWeighting | Monte Carlo | Bayesian network evidence |
## Elimination Ordering Strategies
| Strategy | Description |
|---|---|
| MinDegree | Minimize degree of each eliminated variable |
| MinFill | Minimize edges added during elimination |
| WeightedMinFill | MinFill weighted by factor sizes |
| MinWidth | Minimize maximum clique width |
| MaxCardinalitySearch | Greedy cardinality ordering |
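As a concrete reference for the MinFill row above, here is a crate-independent sketch of the heuristic: repeatedly eliminate the variable whose removal adds the fewest new edges between its neighbors (function and representation are illustrative, not the crate's API):

```rust
/// Min-fill elimination ordering over an undirected graph given as an
/// adjacency matrix. Ties go to the lowest-numbered variable.
fn min_fill_order(n: usize, mut adj: Vec<Vec<bool>>) -> Vec<usize> {
    // Fill-in edges introduced by eliminating v from the remaining graph.
    let fill = |v: usize, adj: &[Vec<bool>], rem: &[usize]| -> usize {
        let nbrs: Vec<usize> = rem.iter().copied().filter(|&u| u != v && adj[v][u]).collect();
        let mut f = 0;
        for i in 0..nbrs.len() {
            for j in (i + 1)..nbrs.len() {
                if !adj[nbrs[i]][nbrs[j]] {
                    f += 1;
                }
            }
        }
        f
    };

    let mut remaining: Vec<usize> = (0..n).collect();
    let mut order = Vec::with_capacity(n);
    while !remaining.is_empty() {
        let &best = remaining
            .iter()
            .min_by_key(|&&v| fill(v, &adj, &remaining))
            .unwrap();
        // Add the fill-in (connect best's neighbors), then eliminate best.
        let nbrs: Vec<usize> = remaining
            .iter()
            .copied()
            .filter(|&u| u != best && adj[best][u])
            .collect();
        for i in 0..nbrs.len() {
            for j in (i + 1)..nbrs.len() {
                adj[nbrs[i]][nbrs[j]] = true;
                adj[nbrs[j]][nbrs[i]] = true;
            }
        }
        remaining.retain(|&u| u != best);
        order.push(best);
    }
    order
}

fn main() {
    // Star graph: hub 0 connected to leaves 1, 2, 3 (leaves unconnected).
    let mut adj = vec![vec![false; 4]; 4];
    for &v in &[1usize, 2, 3] {
        adj[0][v] = true;
        adj[v][0] = true;
    }
    // Eliminating the hub first would add 3 fill edges, so min-fill defers it.
    println!("elimination order: {:?}", min_fill_order(4, adj));
}
```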
## Testing

```bash
cargo test   # 193+ tests, all applicable tests passing
```
## Benchmarking
Benchmark suites:
- Factor operations (6 benchmark groups)
- Message passing (7 benchmark groups)
- Inference algorithms comparison (9 benchmark groups)
- Total: 50+ benchmarks
## Examples

8 comprehensive examples:

- Bayesian Network inference (Student Performance Model)
- HMM temporal inference (Weather Prediction)
- Junction Tree exact inference
- QuantRS2 integration
- Parameter learning (Baum-Welch)
- Structured variational inference
- Expectation Propagation
- Linear-chain CRF
## Architecture

```text
TLExpr   →   FactorGraph   →   Inference   →   Marginals
  ↓              ↓                ↓                ↓
Predicates    Factors        Einsum Ops      Probabilities
                                  ↓
                        [Multiple Algorithms]
              SumProduct / JunctionTree / VariableElimination /
              MeanField / EP / Gibbs /
              ImportanceSampling / ParticleFilter

FactorGraph → TensorNetwork → MatrixProductState
                    ↓
               Contraction
```
## License
Apache-2.0
---

- Status: Production Ready (v0.1.0-rc.1)
- Last Updated: 2026-03-06
- Tests: 193+ passing (100% pass rate for non-precision-limited tests)
- Benchmarks: 3 suites, 50+ benchmarks
- Examples: 8 comprehensive examples
- Part of: TensorLogic Ecosystem