JuMP Mosek sparsity scaling

I’m trying to solve an SDP problem of the form

\begin{align} \min \hspace{0.5cm} & c^\intercal x \\ \text{s.t.} \hspace{0.5cm} & A_0 + \sum_i A_i x_i \succeq 0 \end{align}

with x of size ~370 and the A matrices sparse, using JuMP and Mosek. However, after about 20 minutes I get a Mosek.MosekError(1051, "Out of space."), and Windows Task Manager shows my RAM (16 GB) maxed out while it’s running. I tried the same approach on a simpler problem:

using JuMP
using MosekTools
using LinearAlgebra

n = 150
model = Model(Mosek.Optimizer)
    
@variables model begin
    P[1:n]
end

s = sum(P)
M = P |> Diagonal |> Matrix   # dense matrix with P on the diagonal and explicit zeros elsewhere

@objective model Min s
@SDconstraint model M >= Matrix(I,n,n)   # Diagonal(P) - I must be PSD

@time optimize!(model)
@show termination_status(model)

which is a very simple problem with the obvious solution that every decision variable equals 1. It solves until about n ≈ 255, at which point I again get the memory error, and the solve time also scales very quickly with n. I should note that using sparse from the SparseArrays package makes no difference. On the other hand, I also tried using Mosek directly to solve the dual of the simple problem.
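For reference (assuming I’ve derived it correctly), the dual of this simple problem is

\begin{align} \max \hspace{0.5cm} & \operatorname{tr}(Z) \\ \text{s.t.} \hspace{0.5cm} & Z_{kk} = 1, \quad k = 1, \dots, n \\ & Z \succeq 0 \end{align}

which is what the code below sets up: the objective matrix (barci, barcj, barcval) is -I, negated because Mosek minimizes, and each constraint matrix barA_k has a single 1 at entry (k, k), fixing Z_kk = 1: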

using Mosek
using Printf, SparseArrays, LinearAlgebra

printstream(msg::String) = print(msg)
n = 20
# Each of the n constraints is an equality <barA_k, Z> = 1, i.e. Z_kk = 1.
bkc = [ MSK_BK_FX for k in 1:n]

blc = [1.0 for k in 1:n]
buc = [1.0 for k in 1:n]

# Objective matrix barC = -I in triplet form: minimizing <barC, Z> maximizes tr(Z).
barci = [k for k in 1:n]
barcj = [k for k in 1:n]
barcval = [-1.0 for k in 1:n]

# Constraint matrices barA_k, each with a single nonzero at (k, k).
barai   = Any[[k] for k in 1:n]
baraj   = Any[[k] for k in 1:n]
baraval = Any[[1.0] for k in 1:n]

numcon = length(bkc)
barvardim = [n]                          # one n×n PSD matrix variable
barvardims = [n for k in 1:n] |> Vector  # (unused below)
nz = [1 for k in 1:n]                    # (unused below)

t = maketask() do task
    putstreamfunc(task,MSK_STREAM_LOG,printstream)

    appendcons(task,numcon)          # n scalar constraints
    appendbarvars(task,barvardim)    # one n×n PSD matrix variable Z
    putconboundslice(task,1,numcon+1, bkc,blc,buc)

    symc  = appendsparsesymmat(task,barvardim[1],
                               barci,
                               barcj,
                               barcval)
    idxs = []
    for i in 1:n
        idx = appendsparsesymmat(task,barvardim[1],
                                    barai[i],
                                    baraj[i],
                                    baraval[i])
        push!(idxs, idx)
    end
    putbarcj(task,1, [symc], [1.0])   # objective term <barC, Z>

    for k in 1:n
        putbaraij(task, k, 1, [idxs[k]], [1.0])   # constraint k: <barA_k, Z>
    end

    putobjsense(task,MSK_OBJECTIVE_SENSE_MINIMIZE)

    optimize(task)
    solutionsummary(task,MSK_STREAM_MSG)
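    # Added for illustration: fetch the matrix variable from the
    # interior-point solution (packed lower-triangular part of Z).
    barx = getbarxj(task, MSK_SOL_ITR, 1)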
end

which solves the problem very quickly even for large n. Given that Mosek’s output when going through JuMP lists n(n+1)/2 constraints (11,325 already for n = 150), most of which are 0 = 0, it seems as if JuMP isn’t exploiting sparsity (or that the information gets lost somewhere between JuMP and Mosek). Does anyone know what is going on here? Why are JuMP and Mosek taking so long and using so much memory on this simple problem? Can I do anything about it? I would prefer not to have to construct the dual of my problem by hand and solve it with Mosek directly.

It might be the use of Matrix, converting both sides of the constraint into dense arrays?

I don’t think that’s the case. Converting both M and the identity matrix in the constraint into sparse matrices with SparseArrays doesn’t help: it still lists n(n+1)/2 constraints instead of n.

Solving this problem with Mosek in Matlab/Yalmip, even without explicitly using sparse arrays, lists n constraints, so I’m fairly certain the issue is either in how I’m setting it up in JuMP or in how JuMP handles sparsity.

I don’t have a Mosek license so I can’t test. But the issue is likely that the following is not sparse:

julia> Matrix(Diagonal(P)) - Matrix(I, 3, 3)
3×3 Matrix{AffExpr}:
 P[1] - 1  0         0
 0         P[2] - 1  0
 0         0         P[3] - 1

and Mosek needs variables to be declared PSD when they are created, so we add a new PSD matrix variable and then add all n(n+1)/2 scalar constraints to the problem. (The 0 = 0 rows are added as linear constraints, not variable bounds.)

Perhaps try something like

using JuMP
using MosekTools
n = 150
model = Model(Mosek.Optimizer)
@variables(model, begin
    P[1:n]
    M[1:n, 1:n], PSD
end)
for i = 1:n, j = 1:n
    if i == j
        @constraint(model, M[i, i] == P[i] - 1)
    else
        fix(M[i, j], 0.0)
    end
end
@objective(model, Min, sum(P))
@time optimize!(model)
@show termination_status(model)

This throws the following

Cannot add MathOptInterface.EqualTo{Float64} constraint on a matrix variable

Stacktrace:
 [1] error(::String) at .\error.jl:33
 [2] add_constraint(::MosekModel, ::MathOptInterface.SingleVariable, ::MathOptInterface.EqualTo{Float64}) at C:\Users\Philip\.julia\packages\MosekTools\sppJY\src\constraint.jl:409
 [3] _broadcast_getindex_evalf at .\broadcast.jl:648 [inlined]
 [4] _broadcast_getindex at .\broadcast.jl:621 [inlined]
 [5] getindex at .\broadcast.jl:575 [inlined]
 [6] macro expansion at .\broadcast.jl:932 [inlined]
 [7] macro expansion at .\simdloop.jl:77 [inlined]
 [8] copyto! at .\broadcast.jl:931 [inlined]
 [9] copyto! at .\broadcast.jl:886 [inlined]
 [10] copy at .\broadcast.jl:862 [inlined]
 [11] materialize at .\broadcast.jl:837 [inlined]
 [12] add_constraints(::MosekModel, ::Array{MathOptInterface.SingleVariable,1}, ::Array{MathOptInterface.EqualTo{Float64},1}) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\constraints.jl:175
 [13] add_constraints(::MathOptInterface.Bridges.LazyBridgeOptimizer{MosekModel}, ::Array{MathOptInterface.SingleVariable,1}, ::Array{MathOptInterface.EqualTo{Float64},1}) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Bridges\bridge_optimizer.jl:1133
 [14] copy_constraints(::MathOptInterface.Bridges.LazyBridgeOptimizer{MosekModel}, ::MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}, ::MathOptInterface.Utilities.IndexMap, ::Array{MathOptInterface.ConstraintIndex{MathOptInterface.SingleVariable,MathOptInterface.EqualTo{Float64}},1}, ::Nothing) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\copy.jl:295
 [15] copy_constraints(::MathOptInterface.Bridges.LazyBridgeOptimizer{MosekModel}, ::MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}, ::MathOptInterface.Utilities.IndexMap, ::Array{MathOptInterface.ConstraintIndex{MathOptInterface.SingleVariable,MathOptInterface.EqualTo{Float64}},1}) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\copy.jl:285
 [16] pass_constraints(::MathOptInterface.Bridges.LazyBridgeOptimizer{MosekModel}, ::MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}, ::Bool, ::MathOptInterface.Utilities.IndexMap, ::Array{Type{var"#s313"} where var"#s313"<:MathOptInterface.AbstractScalarSet,1}, ::Array{Array{var"#s291",1} where var"#s291"<:(MathOptInterface.ConstraintIndex{MathOptInterface.SingleVariable,S} where S),1}, ::Array{Type{var"#s320"} where var"#s320"<:MathOptInterface.AbstractVectorSet,1}, ::Array{Array{var"#s315",1} where var"#s315"<:(MathOptInterface.ConstraintIndex{MathOptInterface.VectorOfVariables,S} where S),1}, ::typeof(MathOptInterface.Utilities.copy_constraints), ::typeof(MathOptInterface.set); filter_constraints::Nothing) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\copy.jl:317
 [17] default_copy_to(::MathOptInterface.Bridges.LazyBridgeOptimizer{MosekModel}, ::MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}, ::Bool, ::Nothing) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\copy.jl:474
 [18] #automatic_copy_to#137 at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\copy.jl:21 [inlined]
 [19] #copy_to#3 at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Bridges\bridge_optimizer.jl:293 [inlined]
 [20] attach_optimizer(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\cachingoptimizer.jl:150
 [21] optimize!(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}) at C:\Users\Philip\.julia\packages\MathOptInterface\ZJFKw\src\Utilities\cachingoptimizer.jl:211
 [22] optimize!(::Model, ::Nothing; bridge_constraints::Bool, ignore_optimize_hook::Bool, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at C:\Users\Philip\.julia\packages\JuMP\y5vgk\src\optimizer_interface.jl:139
 [23] optimize! at C:\Users\Philip\.julia\packages\JuMP\y5vgk\src\optimizer_interface.jl:115 [inlined] (repeats 2 times)
 [24] top-level scope at .\timing.jl:174 [inlined]
 [25] top-level scope at .\In[6]:0
 [26] include_string(::Function, ::Module, ::String, ::String) at .\loading.jl:1091

Guessing it doesn’t like

@constraint(model, M[i, i] == P[i] - 1)

for whatever reason.

Ah, no. It’s the bound fix(M[i, j], 0). Just try removing that line.

Here’s the example with SCS

using JuMP, SCS
n = 150
model = Model(SCS.Optimizer)
@variable(model, P[1:n])
@variable(model, M[1:n, 1:n], PSD)
@constraint(model, [i=1:n], M[i, i] == P[i] - 1)
@objective(model, Min, sum(P))
@time optimize!(model)
@show termination_status(model)

This works. Thanks for the help!
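For reference, a quick sanity check on the solution from the example above (every entry of P should come out ≈ 1 and the objective ≈ n, assuming the solve reports OPTIMAL):

@show objective_value(model)
@show maximum(abs.(value.(P) .- 1))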
