Is optimization parallel?

Hello, I’m completely new to Julia and wanted to code a simple optimization problem.

For a given sparse n×m matrix A, find a sparse m×m matrix X minimizing the objective

L = 1/2 ‖A − AX‖²_F + β/2 ‖X‖²_F + λ ‖X‖₁,

subject to the restriction diag(X) = 0.

I figured that I can use either JuMP or Convex for this problem. However, this optimization could be parallelized over the columns of X: column j of X only affects column j of AX, so the objective and the diagonal constraint decompose column by column. Is this exploited internally, so I should just optimize the whole matrix at once, or should I parallelize it somehow myself?

Also, how do I choose a solver?

Just model it using JuMP or Convex and throw it at a solver.

SCS is a good option to start with.
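
For instance, getting a JuMP model backed by SCS takes only a few lines (a minimal sketch, assuming both packages are installed):

using JuMP, SCS

model = Model(SCS.Optimizer)  # SCS is invoked when optimize!(model) is called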


To add to Oscar’s answer, both Convex and JuMP just provide ways to formulate the problem for solvers; neither does so in parallel, but many solvers internally use parallelism to actually solve the problem. The JuMP docs maintain a nice list of solvers; all those solvers are generally also compatible with Convex (since both JuMP and Convex lower to the same intermediate layer, MathOptInterface).
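
As a concrete illustration of that interchangeability, swapping the backend is a one-line change (a sketch; Clarabel.jl is used here purely as an example of another conic solver):

using JuMP, SCS, Clarabel

model = Model(SCS.Optimizer)              # formulate the problem once
# ... @variable / @constraint / @objective as usual ...
set_optimizer(model, Clarabel.Optimizer)  # then switch solvers without touching the model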

ConvexTests can also provide some guidance by showing the results of running several solvers on test problems from Convex.jl and SumOfSquares.jl. One should not pay too much attention to the timing results, though, since the tests run on GitHub Actions rather than a proper benchmarking environment, and the problems were not designed for benchmarking. Test failures should likewise be taken with a grain of salt, since different solver parameters might yield correct results, but they are good to be aware of.

Folks have reported memory issues with Convex’s problem formulation for large sparse matrices; one could also try CVXPY (which is pretty easy to use via PyCall.jl or PythonCall.jl).
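
For comparison, a Convex.jl formulation of the same problem might look like this (an untested sketch; main_convex is a hypothetical name, and β and λ are the penalty weights from the original post):

using Convex, SCS

function main_convex(A::AbstractMatrix, β, λ)
    m = size(A, 2)
    X = Variable(m, m)
    objective = 0.5 * sumsquares(vec(A - A * X)) +  # 1/2 ‖A − AX‖²_F
                β / 2 * sumsquares(vec(X)) +        # β/2 ‖X‖²_F
                λ * norm(vec(X), 1)                 # λ ‖X‖₁
    problem = minimize(objective, diag(X) == 0)
    solve!(problem, SCS.Optimizer)
    return evaluate(X)
end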


Something like this should get you started:

using JuMP, SCS

function main(A::AbstractMatrix, β, λ)
    m = size(A, 2)
    model = Model(SCS.Optimizer)
    @variable(model, X[1:m, 1:m])
    # diag(X) == 0
    @constraint(model, [i = 1:m], X[i, i] == 0)
    # t1 >= ||A - AX||^2_F, via the rotated cone: 2 * t1 * 0.5 >= ||vec(y)||^2
    @variable(model, t1 >= 0)
    @expression(model, y, A - A * X)
    @constraint(model, [t1; 0.5; vec(y)] in RotatedSecondOrderCone())
    # t2 >= ||X||^2_F
    @variable(model, t2 >= 0)
    @constraint(model, [t2; 0.5; vec(X)] in RotatedSecondOrderCone())
    # t3 >= ||X||_1, the one-norm of the flattened matrix
    @variable(model, t3 >= 0)
    @constraint(model, [t3; vec(X)] in MOI.NormOneCone(length(X) + 1))
    # Objective: 1/2 t1 + β/2 t2 + λ t3
    @objective(model, Min, 0.5 * t1 + β / 2 * t2 + λ * t3)
    optimize!(model)
    return value.(X)
end
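
Since the objective and the diagonal constraint both decompose over the columns of X, one can also solve m independent per-column subproblems and distribute them across threads. A rough sketch along the same lines as the model above (solve_column and main_threaded are hypothetical names; start Julia with multiple threads, e.g. julia -t auto):

using JuMP, SCS

# Subproblem for column j:
# min 1/2 ||A[:, j] - A * x||^2 + β/2 ||x||^2 + λ ||x||_1  s.t.  x[j] == 0
function solve_column(A::AbstractMatrix, j::Int, β, λ)
    m = size(A, 2)
    model = Model(SCS.Optimizer)
    set_silent(model)  # avoid printing m copies of the solver log
    @variable(model, x[1:m])
    @constraint(model, x[j] == 0)
    @variable(model, t1 >= 0)
    @constraint(model, [t1; 0.5; A[:, j] - A * x] in RotatedSecondOrderCone())
    @variable(model, t2 >= 0)
    @constraint(model, [t2; 0.5; x] in RotatedSecondOrderCone())
    @variable(model, t3 >= 0)
    @constraint(model, [t3; x] in MOI.NormOneCone(m + 1))
    @objective(model, Min, 0.5 * t1 + β / 2 * t2 + λ * t3)
    optimize!(model)
    return value.(x)
end

function main_threaded(A::AbstractMatrix, β, λ)
    m = size(A, 2)
    X = zeros(m, m)
    # Each iteration builds its own model, so there is no shared solver state;
    # writes go to disjoint columns of X.
    Threads.@threads for j in 1:m
        X[:, j] = solve_column(A, j, β, λ)
    end
    return X
end

Whether this beats handing the whole matrix to a single solve depends on the problem size and the solver; conic solvers like SCS already exploit sparsity and may use multithreaded linear algebra internally.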