
Module lp_layer


Differentiable LP layer via entropic regularization and basis sensitivity.

Two complementary approaches:

  1. Perturbed LP (LpLayer): Solves the entropy-regularized LP min cᵀx + ε Σ x_i ln(x_i) s.t. Ax ≤ b, x ≥ 0 via Sinkhorn-style iterative updates. The entropic regularization makes the solution unique and smooth in c, enabling backpropagation.

  2. Basis sensitivity (LpSensitivity): For an LP in standard form with optimal basis B, computes the exact sensitivity of the optimal basic variables to changes in the cost vector c and rhs b.
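A minimal sketch of approach 1 (illustrative only, not the crate's `LpLayer` API): when the feasible set is just the probability simplex {x : Σ x_i = 1, x ≥ 0}, the entropy-regularized LP min cᵀx + ε Σ x_i ln(x_i) has the closed-form solution x = softmax(−c/ε), which is the fixed point a Sinkhorn-style iteration reaches in this special case. The function name below is an assumption.

```rust
// Hypothetical sketch: entropy-regularized LP over the probability simplex.
// The minimizer of c^T x + eps * sum_i x_i ln(x_i) subject to sum_i x_i = 1,
// x >= 0 is x_i = softmax(-c_i / eps). Small eps approaches the hard LP
// (all mass on the cheapest coordinate); large eps approaches uniform.
fn entropic_lp_simplex(c: &[f64], eps: f64) -> Vec<f64> {
    // Shift by the minimum cost so the largest exponent is 0 (numerical stability).
    let m = c.iter().cloned().fold(f64::INFINITY, f64::min);
    let w: Vec<f64> = c.iter().map(|&ci| (-(ci - m) / eps).exp()).collect();
    let z: f64 = w.iter().sum();
    w.iter().map(|&wi| wi / z).collect()
}

fn main() {
    let c = [1.0, 2.0, 3.0];
    // Small eps: solution concentrates on index 0, the cheapest coordinate.
    let x = entropic_lp_simplex(&c, 0.1);
    println!("eps = 0.1  -> {:?}", x);
    // Large eps: solution approaches the uniform distribution.
    let y = entropic_lp_simplex(&c, 100.0);
    println!("eps = 100  -> {:?}", y);
}
```

Because the solution map c ↦ x is smooth (a softmax here), gradients flow through it, which is the point of the entropic regularization.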
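A sketch of the idea behind approach 2 (again illustrative, not the crate's `LpSensitivity` API): for a standard-form LP min cᵀx s.t. Ax = b, x ≥ 0 with optimal basis matrix B, the optimal basic variables are x_B = B⁻¹b, so ∂x_B/∂b = B⁻¹ exactly, as long as the basis stays optimal. A 2×2 example:

```rust
// Hypothetical sketch: exact rhs-sensitivity of the basic solution.
// x_B = B^{-1} b, so the Jacobian d x_B / d b is the constant matrix B^{-1}
// (column j of B^{-1} is the response of x_B to a unit change in b_j),
// valid while the optimal basis is unchanged.
fn inv2(b: [[f64; 2]; 2]) -> [[f64; 2]; 2] {
    let det = b[0][0] * b[1][1] - b[0][1] * b[1][0];
    assert!(det.abs() > 1e-12, "basis matrix must be nonsingular");
    [
        [b[1][1] / det, -b[0][1] / det],
        [-b[1][0] / det, b[0][0] / det],
    ]
}

fn main() {
    let b_mat = [[2.0, 1.0], [1.0, 3.0]]; // basis matrix B
    let rhs = [5.0, 10.0]; // rhs b
    let binv = inv2(b_mat);
    // Basic solution x_B = B^{-1} b.
    let xb = [
        binv[0][0] * rhs[0] + binv[0][1] * rhs[1],
        binv[1][0] * rhs[0] + binv[1][1] * rhs[1],
    ];
    // Sensitivity to b_1 is the first column of B^{-1}.
    println!("x_B = {:?}, d x_B / d b_1 = {:?}", xb, [binv[0][0], binv[1][0]]);
}
```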

§References

  • Berthet et al. (2020). “Learning with Differentiable Perturbed Optimizers.” NeurIPS.
  • Murtagh & Saunders (1978). “Large-scale linearly constrained optimization.” Mathematical Programming.

Structs§

LpLayer
A differentiable LP layer using entropic regularization.
LpLayerConfig
Configuration for the LP layer.
LpSensitivity
Exact sensitivity analysis for LP in standard form.

Functions§

lp_gradient
Compute the gradient dL/dc for the entropy-regularized LP.
lp_perturbed
Solve the entropy-regularized LP:
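To illustrate the kind of gradient `lp_gradient` computes (a sketch under the simplex-feasible-set assumption above, not the crate's implementation): with x = softmax(−c/ε), the chain rule gives dL/dc_j = −(1/ε) x_j (dL/dx_j − Σ_k x_k dL/dx_k). The function name is hypothetical.

```rust
// Hypothetical sketch: dL/dc for the entropy-regularized LP on the simplex.
// With x = softmax(-c/eps), the softmax Jacobian yields
//   dL/dc_j = -(1/eps) * x_j * (dL/dx_j - sum_k x_k * dL/dx_k).
fn lp_gradient_simplex(c: &[f64], eps: f64, dl_dx: &[f64]) -> Vec<f64> {
    // Forward solve: x = softmax(-c/eps), shifted for numerical stability.
    let m = c.iter().cloned().fold(f64::INFINITY, f64::min);
    let w: Vec<f64> = c.iter().map(|&ci| (-(ci - m) / eps).exp()).collect();
    let z: f64 = w.iter().sum();
    let x: Vec<f64> = w.iter().map(|&wi| wi / z).collect();
    // Backward: project the incoming gradient through the softmax Jacobian.
    let dot: f64 = x.iter().zip(dl_dx).map(|(xi, gi)| xi * gi).sum();
    x.iter()
        .zip(dl_dx)
        .map(|(xi, gi)| -(xi * (gi - dot)) / eps)
        .collect()
}

fn main() {
    // Upstream gradient picks out the first coordinate of x.
    let g = lp_gradient_simplex(&[1.0, 2.0, 3.0], 0.5, &[1.0, 0.0, 0.0]);
    println!("dL/dc = {:?}", g);
}
```

Note the components of dL/dc sum to zero: shifting every cost equally leaves the simplex solution unchanged.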