Engee documentation
Notebook

Applying chordal decomposition to improve optimization performance

Introduction

In mathematical optimization problems with positive semidefinite (PSD) constraints, the dimension of the matrix variables often becomes a critical factor limiting computational efficiency. However, many practical problems feature pronounced matrix sparsity, which opens up the possibility of using specialized decomposition methods.

Chordal decomposition is a method that decomposes one large PSD constraint into a set of smaller PSD constraints plus a number of linear equality constraints.

A PSD (positive semidefinite) constraint is a constraint in mathematical optimization that requires a matrix variable to be positive semidefinite; in other words, the matrix defines a quadratic form that never takes negative values. If the original PSD constraint is sparse, the decomposed problem can often be solved faster than the original one.

A PSD constraint guarantees that the quadratic form defined by the matrix is always non-negative. This is a fundamental requirement for the physical feasibility of many systems, from financial models to engineering structures.
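To make the definition concrete, here is a minimal sketch (separate from the model below; both matrices are purely illustrative) that checks positive semidefiniteness via the smallest eigenvalue:

```julia
using LinearAlgebra

# An illustrative 2×2 symmetric matrix: PSD, since its smallest
# eigenvalue is nonnegative (so x' * A * x ≥ 0 for every x).
A = [2.0 1.0; 1.0 2.0]
eigmin(A) >= 0  # true

# Increasing the off-diagonal entry past the diagonal breaks the property:
B = [2.0 3.0; 3.0 2.0]
eigmin(B) >= 0  # false — some x gives x' * B * x < 0
```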

This example shows how to use the libraries

JuMP, MathOptChordalDecomposition, SCS, SparseArrays

to improve the performance of models with PSD constraints.

Initial data

We install and load the necessary packages.

In [ ]:
import Pkg
Pkg.add(["JuMP", "MathOptChordalDecomposition", "SCS", "SparseArrays"])
using JuMP, MathOptChordalDecomposition, SCS, SparseArrays

To demonstrate the advantages of chordal decomposition, we use a pre-prepared model from the data.dat-s file.

In [ ]:
model = read_from_file(joinpath(@__DIR__, "data.dat-s"))
Out[0]:
A JuMP Model
├ solver: none
├ objective_sense: MIN_SENSE
│ └ objective_function_type: AffExpr
├ num_variables: 124
├ num_constraints: 1
│ └ Vector{AffExpr} in MOI.PositiveSemidefiniteConeTriangle: 1
└ Names registered in the model: none

This model has 124 decision variables and one PSD constraint. This PSD constraint is sparse, which means that many elements of the matrix are zero.

To inspect the matrix, use the all_constraints function to get the list of constraints, and then use constraint_object to obtain the constraint in function-in-set form:

In [ ]:
S = MOI.PositiveSemidefiniteConeTriangle
constraints = all_constraints(model, Vector{AffExpr}, S)
ogre = constraint_object(constraints[1]);
ogre.set
Out[0]:
MathOptInterface.PositiveSemidefiniteConeTriangle(124)

The constraint function is represented as a vector:

In [ ]:
ogre.func
Out[0]:
7750-element Vector{AffExpr}:
 _[1] - 0.25
 0
 _[2] - 0.5
 0
 0
 _[3] - 0.25
 0
 0
 0
 _[4]
 0
 0
 0
 ⋮
 0
 0
 0
 0
 0
 0
 0
 0
 0
 0
 0
 _[124] - 1

Using the reshape_vector function, we reshape the constraint function into matrix form.

In [ ]:
F = reshape_vector(ogre.func, SymmetricMatrixShape(ogre.set.side_dimension))
Out[0]:
124×124 LinearAlgebra.Symmetric{AffExpr, Matrix{AffExpr}}:
 _[1] - 0.25  0           0            0     …  0           0
 0            _[2] - 0.5  0            0        0           0
 0            0           _[3] - 0.25  0        0           0
 0            0           0            _[4]     0           0
 0            0           0            0        0           0
 0            0           0            0     …  0           0
 0            0           0            0        0           0
 0            0           0.25         0        0           0
 0            0           0            0        0           0
 0            0           0            0        0           0
 0            0           0            0     …  0           0
 0            0           0            0        0           0
 0            0           0            0        0           0
 ⋮                                           ⋱              
 0            0           0            0        0           0
 0            0           0            0        0           0
 0            0           0            0        0           0
 0            0           0            0     …  0           0
 0            0           0            0        0           0
 0            0           0            0        0           0
 0            0           0            0        0           0
 0            0           0            0        0           0
 0            0           0            0     …  0           0
 0            0           0            0        0           0
 0            0           0            0        _[123] - 1  0
 0            0           0            0        0           _[124] - 1

Using SparseArrays.sparse, we convert the matrix F into a sparse matrix A:

In [ ]:
A = SparseArrays.sparse(F)
Out[0]:
124×124 SparseMatrixCSC{AffExpr, Int64} with 422 stored entries:
⎡⠑⢄⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠂⠀⠀⡀⠀⠀⠀⠂⡀⢀⠀⠀⠀⠈⠀⠀⠀⠀⠀⠀⢀⠀⢀⠀⠀⠀⎤
⎢⠈⠀⡑⢌⠀⠠⠀⠀⠈⢀⠀⠠⠃⠀⡀⠀⠀⠀⠀⠀⠄⠀⠀⠀⠐⠠⠀⠄⠀⠀⠀⠀⠀⠠⠀⠀⠀⠀⠄⠀⎥
⎢⠀⠀⠀⡀⠑⢄⠀⠀⠀⠠⠀⠀⠀⠄⠀⠠⢈⠀⠀⠀⠠⠀⠁⠀⠄⠀⠀⠀⠀⠀⠀⠀⠀⠄⠀⠀⠀⠀⠠⠀⎥
⎢⠀⠀⠀⠀⠀⠀⠑⢄⠀⠀⠀⠀⠀⠐⡀⠁⠀⡂⠀⠀⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠂⣀⠀⠀⠠⠠⠀⠑⎥
⎢⠀⠀⠂⢀⠀⡀⠀⠀⠑⢄⠀⠀⠡⠀⠂⠀⢀⠀⠀⠀⠀⠀⠂⠐⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⠀⠀⢀⎥
⎢⠀⠀⠀⡀⠀⠀⠀⠀⠀⠀⠑⢄⠐⠔⠀⠀⠂⠁⠀⠀⠀⠂⠀⠀⠀⠀⣀⠀⠀⠀⠀⠆⠀⠂⠠⠤⠀⠆⠂⠄⎥
⎢⠀⠀⠉⠀⠀⠄⢀⠀⠁⠂⢐⠄⠑⢄⡈⠀⠄⠀⠀⠀⡄⠀⠀⠀⠈⠀⠀⠈⠀⡁⠀⠀⠀⠀⠀⠀⠙⠀⠀⠀⎥
⎢⠈⠀⠀⠈⠀⡀⠄⠈⠈⠀⠀⠀⠂⠈⠑⢄⠂⠀⠠⠀⠀⠀⠀⢄⠀⠀⠀⠀⠀⠀⢀⠀⠀⠀⠀⠈⠈⠀⠀⡀⎥
⎢⠀⠠⠀⠀⠂⠐⠠⠠⠀⠐⠌⠀⠀⠁⠈⠀⠑⢄⠀⠀⠀⠀⡀⠀⡀⠂⢀⠀⠀⠀⠀⠂⠄⠀⠀⠢⠀⠀⡀⠀⎥
⎢⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠂⠀⠀⠑⢄⠀⠀⠀⠀⠈⠀⠀⠀⠀⠀⠀⠀⠂⠀⢀⢀⠀⠀⠈⠈⎥
⎢⠠⠀⠀⠁⠀⠂⠁⠀⠀⠀⠠⠀⠀⠉⠀⠀⠀⠀⠀⠀⠑⢄⠀⠄⠀⠠⠀⠀⠀⠁⠀⡀⠀⠈⠀⠠⠀⢀⡀⠀⎥
⎢⠀⢈⠀⠀⠁⠀⠀⠀⢈⠀⠀⠀⠀⠀⠀⢄⠀⠈⠀⠀⠀⠄⠑⢄⢀⠀⡀⠀⠀⠀⠀⠀⠈⠀⠀⠐⠀⠀⠈⠀⎥
⎢⠀⠀⠐⡀⠀⠁⠀⠀⠀⠀⠀⠀⠂⠀⠀⠀⠠⠈⠂⠀⠀⡀⠀⠐⠑⢄⠁⠐⠀⠀⠀⠀⠀⠀⠀⠈⠀⠐⠀⠀⎥
⎢⡀⠀⠀⠄⠀⠀⠀⠀⠀⠀⠀⠘⡀⠀⠀⠀⠀⠐⠀⠀⠀⠀⠀⠈⢁⠀⠑⢄⠀⠠⠐⠀⣀⠀⢠⠁⠀⢀⠂⢀⎥
⎢⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠄⠠⠀⠀⠀⠀⠀⠀⠄⠀⠀⠀⠀⠀⠀⡀⢑⢔⠀⠀⠀⠀⠀⠠⠀⠀⠀⠀⎥
⎢⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠠⠄⠀⠀⠀⠐⠠⠀⠀⠀⠀⠠⠀⠀⠀⠀⠐⠀⠀⠀⠑⢄⠀⠀⠈⠁⠀⠀⠁⡀⎥
⎢⠀⠀⠀⡀⠀⠄⠈⢠⠀⠀⠠⠀⠀⠀⠀⠀⠀⠁⠈⠀⡀⠀⠂⠀⠀⠀⠀⠘⠀⠀⠀⠀⠑⢄⠀⠀⢀⠄⠆⠀⎥
⎢⠀⠐⠀⠀⠀⠀⠀⠀⠀⠀⠀⡆⠀⠀⡀⠀⠠⡀⠀⢐⠀⡀⢀⠀⡀⠀⠄⠒⠀⡀⠆⠀⠀⠀⢑⣴⠀⠀⠀⠄⎥
⎢⠀⠐⠀⠀⠀⠀⠀⡂⠀⠐⠠⠄⠓⠀⠂⠀⠀⠀⠀⠀⠀⢀⠀⠀⢀⠀⠀⢀⠀⠀⠀⠀⠀⠔⠀⠀⠑⢄⠀⠀⎥
⎣⠀⠀⠀⠁⠀⠂⢄⠀⠀⢀⠈⠄⠀⠀⠀⠠⠀⠈⡂⠀⠀⠈⠂⠀⠀⠀⠈⢀⠀⠀⠁⠠⠈⠁⠀⠄⠀⠀⠑⢄⎦

The sparse matrix contains 422 nonzero elements, which corresponds to a density of 2.7%:

In [ ]:
SparseArrays.nnz(A) / size(A, 1)^2
Out[0]:
0.027445369406867846

Computing speed

SCS is a first-order solver that does not exploit the sparsity of PSD constraints. Let's solve the problem and see how long it takes:

In [ ]:
set_optimizer(model, SCS.Optimizer)
@time optimize!(model)
------------------------------------------------------------------
	       SCS v3.2.10 - Splitting Conic Solver
	(c) Brendan O'Donoghue, Stanford University, 2012
------------------------------------------------------------------
problem:  variables n: 124, constraints m: 7750
cones: 	  s: psd vars: 7750, ssize: 1
settings: eps_abs: 1.0e-04, eps_rel: 1.0e-04, eps_infeas: 1.0e-07
	  alpha: 1.50, scale: 1.00e-01, adaptive_scale: 1
	  max_iters: 100000, normalize: 1, rho_x: 1.00e-06
	  acceleration_lookback: 10, acceleration_interval: 10
	  compiled with openmp parallelization enabled
lin-sys:  sparse-direct-amd-qdldl
	  nnz(A): 124, nnz(P): 0
------------------------------------------------------------------
 iter | pri res | dua res |   gap   |   obj   |  scale  | time (s)
------------------------------------------------------------------
     0| 9.71e-01  3.87e+00  3.12e+02  2.65e+02  1.00e-01  9.67e-02 
   250| 1.20e-03  3.27e-04  2.64e-01  1.42e+02  2.92e-02  1.16e+00 
   500| 2.97e-04  7.52e-05  4.58e-02  1.42e+02  2.92e-02  2.20e+00 
   750| 2.40e-04  6.14e-05  3.89e-02  1.42e+02  2.92e-02  3.37e+00 
  1000| 1.91e-04  5.10e-05  3.30e-02  1.42e+02  2.92e-02  4.36e+00 
  1250| 1.55e-04  4.34e-05  2.78e-02  1.42e+02  2.92e-02  5.43e+00 
  1500| 1.34e-04  3.65e-05  2.34e-02  1.42e+02  2.92e-02  6.46e+00 
  1750| 2.19e-01  2.84e-01  3.74e+00  1.44e+02  2.92e-02  7.56e+00 
  2000| 1.01e-04  2.57e-05  1.67e-02  1.42e+02  2.92e-02  8.60e+00 
  2250| 9.41e-05  2.15e-05  1.42e-02  1.42e+02  2.92e-02  9.67e+00 
------------------------------------------------------------------
status:  solved
timings: total: 9.67e+00s = setup: 1.89e-02s + solve: 9.65e+00s
	 lin-sys: 4.98e-01s, cones: 8.36e+00s, accel: 3.76e-02s
------------------------------------------------------------------
objective = 141.972713
------------------------------------------------------------------
 11.288593 seconds (1.50 M allocations: 75.277 MiB, 13.45% compilation time)

Now we pass SCS.Optimizer as an argument to MathOptChordalDecomposition.Optimizer. Let's solve again and measure the time.

In [ ]:
set_optimizer(model, () -> MathOptChordalDecomposition.Optimizer(SCS.Optimizer))
@time optimize!(model)
------------------------------------------------------------------
	       SCS v3.2.10 - Splitting Conic Solver
	(c) Brendan O'Donoghue, Stanford University, 2012
------------------------------------------------------------------
problem:  variables n: 1155, constraints m: 8781
cones: 	  z: primal zero / dual free vars: 7750
	  s: psd vars: 1031, ssize: 115
settings: eps_abs: 1.0e-04, eps_rel: 1.0e-04, eps_infeas: 1.0e-07
	  alpha: 1.50, scale: 1.00e-01, adaptive_scale: 1
	  max_iters: 100000, normalize: 1, rho_x: 1.00e-06
	  acceleration_lookback: 10, acceleration_interval: 10
	  compiled with openmp parallelization enabled
lin-sys:  sparse-direct-amd-qdldl
	  nnz(A): 2186, nnz(P): 0
------------------------------------------------------------------
 iter | pri res | dua res |   gap   |   obj   |  scale  | time (s)
------------------------------------------------------------------
     0| 2.75e+01  1.00e+00  9.93e+03 -4.81e+03  1.00e-01  1.24e-02 
   250| 6.74e-03  1.47e-03  6.73e-03  1.42e+02  1.73e+00  9.51e-01 
   350| 1.23e-04  7.46e-05  4.30e-05  1.42e+02  1.73e+00  1.34e+00 
------------------------------------------------------------------
status:  solved
timings: total: 1.34e+00s = setup: 7.11e-03s + solve: 1.33e+00s
	 lin-sys: 2.87e-01s, cones: 9.09e-01s, accel: 5.99e-03s
------------------------------------------------------------------
objective = 141.990174
------------------------------------------------------------------
  7.378757 seconds (8.81 M allocations: 443.399 MiB, 1.60% gc time, 81.60% compilation time)

Now the solver finishes in a little over a second (see the solve timing in the SCS log). This performance difference is due to the chordal decomposition. The decomposed problem introduced new variables (1155 instead of 124) and constraints (115 PSD constraints instead of one), but each PSD constraint is much smaller than the original.
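A back-of-the-envelope estimate suggests why this helps (illustrative only: the O(n³) cost of projecting onto the PSD cone via an eigendecomposition is a standard rule of thumb, and 10 is taken here as an upper bound on the decomposed block sizes):

```julia
# Each SCS iteration projects every PSD block, at roughly O(n^3) per block.
original_cost   = 124^3        # a single 124×124 block
decomposed_cost = 115 * 10^3   # 115 blocks, each at most 10×10
decomposed_cost / original_cost  # ≈ 0.06 — over 16× less projection work per iteration
```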

In [ ]:
decomposition = unsafe_backend(model)
Out[0]:
MathOptChordalDecomposition.Optimizer{CliqueTrees.MF}
├ ObjectiveSense: MIN_SENSE
├ ObjectiveFunctionType: MOI.ScalarAffineFunction{Float64}
├ NumberOfVariables: 1155
└ NumberOfConstraints: 116
  ├ MOI.VectorAffineFunction{Float64} in MOI.Zeros: 1
  └ MOI.VectorOfVariables in MOI.PositiveSemidefiniteConeTriangle: 115

Let's calculate the number of PSD constraints of each size:

In [ ]:
quantity = Dict{Int,Int}()
for ci in MOI.get(decomposition, MOI.ListOfConstraintIndices{MOI.VectorOfVariables,S}())
    set = MOI.get(decomposition, MOI.ConstraintSet(), ci)
    n = set.side_dimension
    quantity[n] = get(quantity, n, 0) + 1
end
quantity
Out[0]:
Dict{Int64, Int64} with 10 entries:
  5  => 7
  4  => 15
  6  => 3
  7  => 3
  2  => 33
  10 => 2
  9  => 2
  8  => 3
  3  => 35
  1  => 12

The largest PSD constraint is now 10×10, significantly smaller than the original 124×124 matrix.
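As a sanity check, we can reconstruct the size histogram from the printed output and verify it against the SCS log above (this sketch repeats data already shown, so it defines its own copy of quantity):

```julia
# Size => count, copied from the dictionary printed above.
quantity = Dict(1 => 12, 2 => 33, 3 => 35, 4 => 15, 5 => 7,
                6 => 3, 7 => 3, 8 => 3, 9 => 2, 10 => 2)

maximum(keys(quantity))                          # 10  — the largest block
sum(values(quantity))                            # 115 — matches "ssize" in the SCS log
sum(c * n * (n + 1) ÷ 2 for (n, c) in quantity)  # 1031 — matches "psd vars" in the log
```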

Conclusion

This example demonstrates the effectiveness of chordal decomposition as a method for accelerating the solution of semidefinite programming problems with sparse matrix constraints.

Directions for further work include adapting the method to problems with mixed structure (PSD constraints combined with other constraint types), optimizing the algorithms that construct chordal extensions for special classes of sparse matrices, and developing criteria for judging when decomposition is worthwhile based on an analysis of the problem structure.

The implementation of the method ensures reproducibility of the results and provides a basis for integrating chordal decomposition into the workflows of researchers and engineers working with large-scale optimization problems.