Syntax for specifying block diagonal constraints in Convex

Hi community, I am having trouble expressing the following problem in Convex.jl:

maximize    x_1 + x_2 + ... + x_m
subject to  0 ≤ x_i ≤ 1,   i = 1, ..., m
            2\Sigma - diag(x_1 S_1, ..., x_m S_m) ⪰ 0

where \Sigma and S_1, ..., S_m are positive definite matrices of appropriate dimensions. The notation diag(x_1 S_1, ..., x_m S_m) in the constraint denotes the block-diagonal matrix whose i-th diagonal block is x_i S_i.

My code so far:

using LinearAlgebra
using Convex
using Hypatia
using BlockDiagonals

function solve_SDP(Σ::Matrix{Float64}, S::Vector{Matrix{Float64}})
    x = Variable(length(S))
    add_constraint!(x, x ≥ 0)
    add_constraint!(x, x ≤ 1)
    current_blocks = BlockDiagonal(x .* S)
    constraint = Symmetric(2Σ - current_blocks) in :SDP
    problem = maximize(sum(x), constraint)
    solve!(problem, Hypatia.Optimizer)
    return evaluate(x)
end

Σ = 0.5 * Matrix(I, 100, 100) + 0.5 * ones(100, 100)
S = [0.5 * Matrix(I, 10, 10) + 0.5 * ones(10, 10) for _ in 1:10]
solve_SDP(Σ, S)

throws this error

MethodError: no method matching ispos(::Vector{Matrix{Float64}})
Closest candidates are:
  ispos(::Real) at /home/users/bbchu/.julia/packages/Convex/FQF1R/src/constant.jl:6
  ispos(::AbstractVecOrMat{var"#s52"} where var"#s52"<:Real) at /home/users/bbchu/.julia/packages/Convex/FQF1R/src/constant.jl:7

I get the same error if I convert current_blocks to a Matrix{Float64}.

Any help is appreciated.

What’s the full error? You typically can’t drop an arbitrary matrix type into a Convex model.

What if you use current_blocks = Matrix(BlockDiagonal(x .* S))?

Here’s the full stacktrace when I do
current_blocks = Matrix(BlockDiagonal(x .* S))

julia> solve_SDP(Σ, S)
ERROR: MethodError: no method matching ispos(::Vector{Matrix{Float64}})
Closest candidates are:
  ispos(::Real) at ~/.julia/packages/Convex/FQF1R/src/constant.jl:6
  ispos(::AbstractVecOrMat{<:Real}) at ~/.julia/packages/Convex/FQF1R/src/constant.jl:7
Stacktrace:
 [1] _sign(x::Vector{Matrix{Float64}})
   @ Convex ~/.julia/packages/Convex/FQF1R/src/constant.jl:17
 [2] Constant(x::Vector{Matrix{Float64}}, check_sign::Bool) (repeats 2 times)
   @ Convex ~/.julia/packages/Convex/FQF1R/src/constant.jl:37
 [3] broadcasted(#unused#::typeof(*), x::Variable, y::Vector{Matrix{Float64}})
   @ Convex ~/.julia/packages/Convex/FQF1R/src/atoms/affine/multiply_divide.jl:230
 [4] solve_SDP(Σ::Matrix{Float64}, S::Vector{Matrix{Float64}})
   @ Main ./REPL[5]:5
 [5] top-level scope
   @ REPL[8]:1

I also just tried scaling each block one-by-one, which gave a different error

function solve_SDP(Σ::Matrix{Float64}, S::Vector{Matrix{Float64}})
    x = Variable(length(S))
    add_constraint!(x, x ≥ 0)
    add_constraint!(x, x ≤ 1)
    for i in 1:length(S)
        S[i] .*= x[i] # scale block S_i by x_i
    end
    current_blocks = Matrix(BlockDiagonal(S))
    constraint = Symmetric(2Σ - current_blocks) in :SDP
    problem = maximize(sum(x), constraint)
    solve!(problem, Hypatia.Optimizer)
    return evaluate(x)
end

Σ = 0.5 * Matrix(I, 100, 100) + 0.5 * ones(100, 100)
S = [0.5 * Matrix(I, 10, 10) + 0.5 * ones(10, 10) for _ in 1:10]
solve_SDP(Σ, S)

ERROR: MethodError: no method matching ndims(::Type{Convex.DotMultiplyAtom})
Closest candidates are:
  ndims(::Base.Iterators.ProductIterator) at /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/base/iterators.jl:983
  ndims(::Number) at /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/base/number.jl:85
  ndims(::UniformScaling) at /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/stdlib/v1.7/LinearAlgebra/src/uniformscaling.jl:87
  ...
Stacktrace:
 [1] Base.Broadcast.BroadcastStyle(#unused#::Type{Convex.DotMultiplyAtom})
   @ Base.Broadcast ./broadcast.jl:103
 [2] combine_styles(c::Convex.DotMultiplyAtom)
   @ Base.Broadcast ./broadcast.jl:435
 [3] Base.Broadcast.Broadcasted(f::typeof(identity), args::Tuple{Convex.DotMultiplyAtom}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}})
   @ Base.Broadcast ./broadcast.jl:175
 [4] materialize!
   @ ./broadcast.jl:864 [inlined]
 [5] solve_SDP(Σ::Matrix{Float64}, S::Vector{Matrix{Float64}})
   @ Main ./REPL[5]:7
 [6] top-level scope
   @ REPL[8]:1

It looks like there’s no good way to achieve this in Convex.jl. It uses an expression-graph representation of the problem, so it has trouble with an arbitrary matrix type like BlockDiagonal.
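In principle you could assemble the block-diagonal expression by hand, multiplying each S[i] by the scalar expression x[i] and stitching the blocks together with hcat/vcat, but it’s clunky. An untested sketch (blockdiag_expr is just a name I made up):

function blockdiag_expr(x, S)
    m = length(S)
    rows = map(1:m) do i
        # x[i] * S[i] on the diagonal, zero blocks of matching size elsewhere
        blocks = [j == i ? x[i] * S[i] : zeros(size(S[i], 1), size(S[j], 2)) for j in 1:m]
        hcat(blocks...)
    end
    return vcat(rows...)
end

and then constrain 2Σ - blockdiag_expr(x, S) in :SDP on the expression directly.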

Here’s how I’d write it in JuMP:

using JuMP, LinearAlgebra, BlockDiagonals, Hypatia
m, n = 4, 2
model = Model(Hypatia.Optimizer)
Σ = 0.5 * Matrix(I, m, m) + 0.5 * ones(m, m)
S = [0.5 * Matrix(I, n, n) + 0.5 * ones(n, n) for _ in 1:n]
@variable(model, 0 <= x[1:n] <= 1)
blocks = Matrix(BlockDiagonal([x[i] * S[i] for i in 1:n]))
@constraint(model, Symmetric(2 * Σ - blocks) in PSDCone())
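To actually solve for x you’d presumably finish the model off with the objective and a call to optimize!, something like:

@objective(model, Max, sum(x))  # maximize x_1 + ... + x_n
optimize!(model)
value.(x)                       # recover the optimal scalings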

Ah, thank you very much. That totally works for my purposes.

I think I tried JuMP a while ago but somehow decided to stick with Convex.jl. I suppose I’m forced to use JuMP now, but maybe that’s a good thing.
