I have been using JuMP for the last 2 years. I am now rather comfortable with the syntax and I have successfully solved some very complex models. For these reasons, JuMP is my preferred modeling environment. I often solve models with nonconvex terms.

For one specific problem, I have a base mixed-integer model from a collaborator implemented in JuMP. This model has been thoroughly debugged, and we have a nice workflow for solution visualization.

I am trying to solve a variant of this model where the objective is log-sum-exp[1]. All of the constraints are linear, and some variables are integer. I have tried a few formulations of the objective, but I always run into numerical trouble. For example, using Pajarito.jl (as an outer-approximation MINLP solver) with Gurobi and Ipopt, I often encounter NLP solver failures after a few iterations.
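For concreteness, here is the kind of conic epigraph formulation I have in mind as an alternative to a smooth NLP formulation. This is only a sketch (data and solver omitted), assuming JuMP's `MOI.ExponentialCone` support as described in the JuMP docs:

```julia
# Sketch: exponential-cone epigraph formulation of
#   minimize logsumexp(x) = log(sum_i exp(x_i))
using JuMP
import MathOptInterface as MOI

n = 3
model = Model()
@variable(model, x[1:n])
@variable(model, t)        # epigraph variable: t >= logsumexp(x)
@variable(model, u[1:n])   # u[i] >= exp(x[i] - t)

# sum_i exp(x[i] - t) <= 1  is equivalent to  t >= logsumexp(x).
@constraint(model, sum(u) <= 1)

# MOI.ExponentialCone() is {(a, b, c) : b * exp(a / b) <= c, b > 0},
# so [x[i] - t, 1, u[i]] enforces exp(x[i] - t) <= u[i].
@constraint(model, [i = 1:n], [x[i] - t, 1, u[i]] in MOI.ExponentialCone())

@objective(model, Min, t)
```

With this form the model stays in JuMP, and a conic MIP solver (or Pajarito over a conic continuous solver) could be used instead of an NLP-based outer-approximation loop.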

Is there a better way? Probably, given that Convex.jl has a built-in `logsumexp` atom. But for the reasons mentioned above, I would rather not manually translate my linear JuMP model to Convex.jl. Is there a way I can mix the two modeling environments?

[1] Actually the objective (min z) is of the form

z >= y f(x/y), y > 0

where y and z are scalars and x is vector-valued, and f() is the log-sum-exp function. I am just starting to immerse myself in the convex optimization literature, but I believe this can be formulated with cones: y f(x/y) is the perspective of f, and the perspective of a convex function is convex.
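Spelling out my reading of standard conic-modeling references (please correct me if I have this wrong), the epigraph constraint on the perspective of log-sum-exp reduces to exponential-cone constraints:

```latex
z \ge y \log\Bigl(\sum_i e^{x_i / y}\Bigr)
\;\Longleftrightarrow\;
\sum_i e^{(x_i - z)/y} \le 1
\;\Longleftrightarrow\;
\exists\, u :\; \sum_i u_i \le y,\quad
(x_i - z,\; y,\; u_i) \in K_{\exp} \;\; \forall i,
```

where K_exp = {(a, b, c) : b e^{a/b} <= c, b > 0} is the exponential cone. The second equivalence follows by multiplying both sides of the inequality by y > 0 and introducing u_i >= y e^{(x_i - z)/y}.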