Mixing models between JuMP.jl and Convex.jl

I have been using JuMP for the last 2 years. I am now rather comfortable with the syntax and I have successfully solved some very complex models. For these reasons, JuMP is my preferred modeling environment. I often solve models with nonconvex terms.

For one specific problem, I have a base mixed-integer model from a collaborator implemented in JuMP. This model has been sufficiently debugged and we have a nice workflow for solution visualization.

I am trying to solve a variant of this model where the objective is log-sum-exp[1]. All of the constraints are linear and some variables are integer. I have tried a few formulations for the objective, but I always run into numerical trouble. For example, using Pajarito.jl (an outer-approximation MINLP solver) with Gurobi and Ipopt, I often encounter NLP solver failures after a few iterations.

Is there a better way? Probably, given that Convex.jl has a built-in expression for log-sum-exp. But for the reasons above, I would rather not manually translate my linear JuMP model to Convex.jl. Is there a way I can mix the two modeling environments?

[1] Actually the objective (min z) is of the form

z >= y f(x/y), y > 0

where y and z are scalars and x is vector-valued. f( ) is the log-sum-exp function. I am just starting to immerse myself in the convex optimization literature. I think this can be formulated with cones: y f(x/y) is the perspective of f, and the perspective of a convex function is itself convex.

Yes, that term can be encoded using exponential cones (although I’m not sure if Convex.jl supports the perspective operation of logsumexp), and Pajarito will be the best way I’m aware of to solve mixed-integer convex problems with exponential cones.

Off the top of my head: if you have z >= y * logsumexp(x/y), divide both sides by y, exponentiate, divide by exp(z/y), and then multiply through by y again. You get
y >= y*sum(exp.(x/y - z/y)), which can be expressed with exponential cones by introducing an auxiliary variable for each exp term.
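For concreteness, here is a sketch of that encoding in current JuMP syntax (which has since gained exponential-cone support). The dimension `n` and the small lower bound on `y` are illustrative assumptions; `MOI.ExponentialCone()` is the set {(u, v, w) : v * exp(u/v) <= w, v > 0}:

```julia
using JuMP

model = Model()
n = 3                          # length of x (illustrative)
@variable(model, x[1:n])
@variable(model, y >= 1e-6)    # enforce y > 0
@variable(model, z)
@variable(model, t[1:n] >= 0)  # one auxiliary variable per exp term

# y * exp((x[i] - z) / y) <= t[i] for each i, via the exponential cone
@constraint(model, [i in 1:n], [x[i] - z, y, t[i]] in MOI.ExponentialCone())

# Summing the auxiliary variables recovers y >= y * sum(exp.(x/y - z/y))
@constraint(model, sum(t) <= y)
```

Minimizing `z` subject to these constraints (plus the original linear and integrality constraints) then gives the epigraph form of the perspective objective; a conic solver that handles exponential cones would be needed for the continuous relaxations.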

Not really, besides extracting the full constraint matrix from JuMP and using it in an A*x == b type constraint in Convex.jl. At this point you could also generate the raw conic-form problem and call Pajarito directly. We’re planning on adding support for modeling exponential cones in JuMP over the summer. It will be close to the low-level format without the sort of automatic transformations which Convex.jl can perform.

Thanks, Miles. Reach out to me once you’ve added the exponential cone support to JuMP. I am happy to beta test.

Adding some additional variable bounds fixed the issues with restoration failure in Ipopt. I can now solve the problem using Pajarito (as a MINLP solver).