Help speeding up problem with Convex.jl

I’m currently solving a convex problem with Convex.jl and COSMO.jl. The text output from COSMO indicates that the solve itself is actually quite fast (less than a second), but it takes about 30 seconds to set the problem up with all constraints and the cost function.

All constraints are equalities, each linking 3 components of a 1000-component Variable together (i.e. constraining 3D subvectors of the Variable onto various planes).

I’m currently adding the constraints individually with add_constraint! — am I doing something wrong or could I do this in a way which is easier for Convex.jl to set up? Should I switch away from COSMO.jl?
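
Schematically, the setup looks something like this (placeholder objective and data, not my real code):

```julia
using Convex, COSMO, LinearAlgebra

x = Variable(1000)
problem = minimize(sumsquares(x))   # placeholder objective

# one plane constraint per consecutive 3-component block (placeholder normals and offsets)
for i in 1:3:998
    n = randn(3)
    d = randn()
    add_constraint!(problem, dot(n, x[i:i+2]) == d)
end

solve!(problem, COSMO.Optimizer)
```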

ping @ericphanson @migarstka

It’s hard to offer advice without a reproducible example. Can you share the code?

No, it’s much too complicated…

I suspect what is taking so much time is extracting the Hessian from the objective function.

I don’t think we can offer advice without the code. There could be any number of reasons.

Have you tried JuMP?

Convex works best with vector/matrix constraints; scalar ones are slower for problem formulation (since the internals are not type stable). JuMP works on a scalar level so it could be a better fit as @odow says. The other main difference between the two is that Convex does automatic reformulations and checks that the problem is indeed convex, but you can do those reformulations by hand in JuMP if you need them. (Another difference is Convex supports high precision numeric types but that doesn’t seem relevant here).
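
For a sense of the difference, the same kind of model written at the scalar level in JuMP would look roughly like this (placeholder data, purely to illustrate the modeling style):

```julia
using JuMP, COSMO

model = Model(COSMO.Optimizer)
@variable(model, x[1:1000])

# plane constraints written out scalar-by-scalar (placeholder normals and offsets)
for i in 1:3:998
    n = randn(3)
    d = randn()
    @constraint(model, n[1]*x[i] + n[2]*x[i+1] + n[3]*x[i+2] == d)
end

@objective(model, Min, sum(x[i]^2 for i in 1:1000))
optimize!(model)
```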


I suspect that if you give them all at once with a matrix-vector formulation it might be much faster.

I couldn’t find anything about that in the Convex.jl documentation. How do I add constraints in matrix form?

For example, the constraints in Entropy Maximization · Convex.jl
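
Schematically, stacking all the plane constraints into a single sparse system `A * x == b` would look something like this (placeholder data):

```julia
using Convex, COSMO, SparseArrays

x = Variable(1000)

# row k of A holds the k-th plane normal in its 3 columns (placeholder data)
nplanes = 333
rows = repeat(1:nplanes, inner = 3)
cols = reduce(vcat, [i:i+2 for i in 1:3:997])
vals = randn(3 * nplanes)
A = sparse(rows, cols, vals, nplanes, 1000)
b = randn(nplanes)

problem = minimize(sumsquares(x), [A * x == b])   # placeholder objective
solve!(problem, COSMO.Optimizer)
```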


Thanks, I managed to make that work nicely with a sparse constraint matrix.

The program still spends 12–13 seconds on solve!() while COSMO reports less than 1 second spent solving the problem.

Is there any way to find out and optimize what Convex.jl is doing in those missing 10 seconds?

Good point; the source of the delay appears to be this sorting operation in MathOptInterface.jl:

https://github.com/jump-dev/MathOptInterface.jl/blob/f559840765587598255e647dc144db8b74566d21/src/Utilities/functions.jl#L764

I wonder if it would be possible to skip this step (by pre-sorting or something)
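
(I found this by profiling the solve! call, roughly along these lines:)

```julia
using Profile

# `problem` is the Convex.jl problem built as above; warm up once so compilation
# time isn't included, then profile the second solve
solve!(problem, COSMO.Optimizer)
@profile solve!(problem, COSMO.Optimizer)
Profile.print(format = :flat, sortedby = :count)
```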

Ok, so as far as I can tell, when I call solve!, my constraints pass through a Rube Goldberg machine inside MathOptInterface.jl, at the end of which sort_and_compress! is called.

Along the way, a method of add_constraint() located in a file called universalfallback.jl is called, so I suspect the slowdown comes from some kind of slow fallback algorithm.

A different way to ask is: Are there any Optimizers for Convex.jl which don’t rely on the universalfallback implementation?

It appears that I can avoid the sorting procedure if I arrange my constraints in the right order:

https://github.com/jump-dev/MathOptInterface.jl/blob/ff1c6d3d9c0b83335f74be0ca04beae56ac13d7d/src/Utilities/functions.jl#L776

Is there any documentation about the ideal order of constraints?

The internals of MOI are complicated, which makes it hard to debug or offer suggestions without a reproducible example. Can you not simplify your problem somehow? Hard code the data or read it from a JSON file?

Sorting isn’t related to the order of the constraints themselves, but to the order of the terms within an affine or quadratic constraint.

You’ve hit the nail on the head: the objective function I want to optimize is quite complicated (almost 100 lines of code, not counting the constraints), and it needs a lot of input data. I’ve tried to create an MWE, but when I simplify the objective function, the delay also vanishes.

Your answer explains that: apparently the 10 seconds are spent processing the call graph of the objective.

It seems the only feasible option is to derive the Hessian by hand and feed it directly into COSMO.jl. I had hoped Convex.jl would let me avoid that, but waiting 10 seconds to perform a 1-second optimization is not workable.
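
If I understand COSMO’s native interface correctly, it takes problems of the form minimize ½ x'Px + q'x subject to Ax + b ∈ C, so the hand-derived Hessian would go in directly; roughly (placeholder data):

```julia
using COSMO, SparseArrays, LinearAlgebra

P   = sparse(1.0I, 1000, 1000)    # placeholder for the hand-derived Hessian
q   = zeros(1000)                 # placeholder linear cost term
Aeq = sparse(randn(333, 1000))    # placeholder stacked plane normals
beq = randn(333)

# COSMO expresses the equalities Aeq * x == beq as Aeq*x + b ∈ {0} with b = -beq
constraint = COSMO.Constraint(Aeq, -beq, COSMO.ZeroSet)

model = COSMO.Model()
COSMO.assemble!(model, P, q, [constraint])
result = COSMO.optimize!(model)   # result.x holds the solution
```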

Try JuMP instead. Convex has to do a lot of processing around the objective to prove convexity etc.


That’s quite frustrating… I specifically chose Convex.jl because I know the problem I’m solving is convex. If I have to reimplement, it makes more sense to me to do it directly with COSMO, because then I know I will get the required performance.

If I re-implemented it in JuMP now, who knows whether it would be faster, slower, or whether it would work at all.