Differentiable LP layer via entropic regularization and basis sensitivity.
Two complementary approaches:
- Perturbed LP (`LpLayer`): solves the entropy-regularized LP min cᵀx + ε Σᵢ xᵢ ln(xᵢ) s.t. Ax ≤ b, x ≥ 0 via Sinkhorn-style iterative updates. The entropic regularization makes the solution unique and smooth in c, enabling backpropagation.
- Basis sensitivity (`LpSensitivity`): for an LP in standard form with optimal basis B, computes the exact sensitivity of the optimal basic variables to changes in the cost vector c and the right-hand side b.
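For intuition, the entropy-regularized objective admits a closed form when the feasible region happens to be the probability simplex (1ᵀx = 1, x ≥ 0): the minimizer of cᵀx + ε Σᵢ xᵢ ln(xᵢ) is x* = softmax(−c/ε). A minimal sketch of that special case (the helper name `entropic_lp_simplex` is illustrative, not part of this crate's API; a general polytope needs the Sinkhorn-style iterations described above):

```rust
// Entropy-regularized LP, specialized to the probability simplex.
// In this special case the minimizer of cᵀx + ε Σᵢ xᵢ ln(xᵢ) subject to
// 1ᵀx = 1, x ≥ 0 is the closed form x* = softmax(−c/ε).
// NOTE: illustrative sketch only; not an API of this crate.
fn entropic_lp_simplex(c: &[f64], eps: f64) -> Vec<f64> {
    // Shift by the minimum cost so every exponent is ≤ 0 (numerical stability).
    let m = c.iter().cloned().fold(f64::INFINITY, f64::min);
    let w: Vec<f64> = c.iter().map(|&ci| (-(ci - m) / eps).exp()).collect();
    let z: f64 = w.iter().sum();
    w.iter().map(|&wi| wi / z).collect()
}

fn main() {
    let c = [1.0, 0.2, 0.7];
    // Small ε concentrates x* on the cheapest coordinate;
    // large ε spreads x* toward the uniform distribution.
    for eps in [0.05, 1.0] {
        let x = entropic_lp_simplex(&c, eps);
        println!("eps = {eps}: x = {x:?}, sum = {}", x.iter().sum::<f64>());
    }
}
```

As ε → 0 this recovers the (possibly non-unique, non-smooth) LP solution; a positive ε trades a small bias for uniqueness and differentiability.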
References§
- Berthet et al. (2020). “Learning with Differentiable Perturbed Optimizers.” NeurIPS.
- Murtagh & Saunders (1978). “Large-scale linearly constrained optimization.” Mathematical Programming.
Structs§
- LpLayer - A differentiable LP layer using entropic regularization.
- LpLayerConfig - Configuration for the LP layer.
- LpSensitivity - Exact sensitivity analysis for an LP in standard form.
Functions§
- lp_gradient - Compute the gradient dL/dc for the entropy-regularized LP.
- lp_perturbed - Solve the entropy-regularized LP min cᵀx + ε Σᵢ xᵢ ln(xᵢ) s.t. Ax ≤ b, x ≥ 0.
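To illustrate the kind of gradient `lp_gradient` is documented to compute, here is a hedged sketch for the simplex special case, where x*(c) = softmax(−c/ε) gives the Jacobian ∂xᵢ/∂cⱼ = −(1/ε)·xᵢ(δᵢⱼ − xⱼ), hence dL/dc = −(1/ε)(diag(x) − x xᵀ)·∂L/∂x. The helper names `softmax_neg` and `grad_wrt_cost` are hypothetical, not this crate's API; the analytic gradient is checked against central finite differences:

```rust
// Simplex special case of the entropy-regularized LP: x*(c) = softmax(-c/ε).
// Illustrative sketch only; not an API of this crate.
fn softmax_neg(c: &[f64], eps: f64) -> Vec<f64> {
    let m = c.iter().cloned().fold(f64::INFINITY, f64::min);
    let w: Vec<f64> = c.iter().map(|&ci| (-(ci - m) / eps).exp()).collect();
    let z: f64 = w.iter().sum();
    w.iter().map(|&wi| wi / z).collect()
}

// Chain rule through the softmax Jacobian:
//   (dL/dc)_j = -(1/ε) (x_j g_j - x_j · Σᵢ xᵢ gᵢ),  where g = ∂L/∂x.
fn grad_wrt_cost(c: &[f64], g: &[f64], eps: f64) -> Vec<f64> {
    let x = softmax_neg(c, eps);
    let xg: f64 = x.iter().zip(g).map(|(xi, gi)| xi * gi).sum();
    x.iter()
        .zip(g)
        .map(|(&xi, &gi)| -(xi * gi - xi * xg) / eps)
        .collect()
}

fn main() {
    let (c, eps) = (vec![0.3, 1.1, 0.6], 0.5);
    let g = vec![1.0, -2.0, 0.5]; // ∂L/∂x of some downstream loss L(x) = gᵀx
    let analytic = grad_wrt_cost(&c, &g, eps);

    // Verify against central finite differences of L(x*(c)).
    let h = 1e-6;
    for j in 0..c.len() {
        let (mut cp, mut cm) = (c.clone(), c.clone());
        cp[j] += h;
        cm[j] -= h;
        let lp: f64 = softmax_neg(&cp, eps).iter().zip(&g).map(|(x, gi)| x * gi).sum();
        let lm: f64 = softmax_neg(&cm, eps).iter().zip(&g).map(|(x, gi)| x * gi).sum();
        let fd = (lp - lm) / (2.0 * h);
        println!("j={j}: analytic={:.6}, finite-diff={:.6}", analytic[j], fd);
    }
}
```

This smoothness in c is exactly what the ε-term buys: the unregularized LP's argmin is piecewise constant in c, so its gradient would be zero almost everywhere.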