Optimization taking too long

Hi everyone,

I am solving an optimization problem with many variables in JuMP using the Ipopt solver. The convergence time is extremely sensitive to one of my constraints:

- if I use a linear functional form in that constraint, convergence takes about 3 seconds;
- if I use a square root functional form, it takes about 260 seconds.

Does anyone have experience with other solvers that might improve convergence time with functions like the square root?

Any help would be very much appreciated!
Thanks,
Miguel.

The derivative of the square root blows up near zero; in general you want to optimize over functions whose first and second derivatives are bounded (otherwise convergence may be very slow or fail completely, depending on the algorithm and the problem).
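A common workaround is to lift the square root out of the constraint: introduce an auxiliary variable t with t ≥ 0 and t² == x, so that sqrt(x) is replaced by t and the reformulated constraint is polynomial, with derivatives that stay bounded on any bounded domain. This is only a sketch under assumed variable names and an assumed constraint sqrt(x) ≤ 2 — not the original poster's model:

```julia
using JuMP, Ipopt

# Hypothetical model: x, t and the bound 2.0 are placeholders.
model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x >= 0)
@variable(model, t >= 0)       # auxiliary variable standing in for sqrt(x)
@constraint(model, t^2 == x)   # defines t = sqrt(x) with polynomial derivatives
@constraint(model, t <= 2.0)   # replaces the constraint sqrt(x) <= 2.0
@objective(model, Max, x)
optimize!(model)
```

The quadratic equality is nonconvex, so Ipopt only guarantees a local solution, but the derivatives it sees are now well behaved everywhere, including at x = 0.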


Thanks for the answer, but I also use the following function:

e^(a+bx)/(1+e^(a+bx))

whose first and second derivatives do not blow up, and it is equally slow, so I am guessing the nonlinearity is slowing things down for some other reason. Maybe someone has experience with a nonlinear solver that has proven fast for them.
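One trick that sometimes helps with a logistic term like this is to lift it into auxiliary variables: set z = a + b·x and define y through y·(1 + e^z) == e^z, so y equals the logistic value but the solver never differentiates the ratio directly. A hedged sketch, with made-up data a, b and a made-up objective (recent JuMP versions accept exp directly in @constraint; older ones need @NLconstraint):

```julia
using JuMP, Ipopt

# Hypothetical sketch: a and b are fixed data, x and y are decision variables.
a, b = 0.5, 1.0
model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x)
@variable(model, 0 <= y <= 1)                   # y will equal the logistic value
@variable(model, z)
@constraint(model, z == a + b * x)              # linear lifting of the exponent
@constraint(model, y * (1 + exp(z)) == exp(z))  # y = e^z / (1 + e^z), without the ratio
@objective(model, Min, (y - 0.5)^2)             # placeholder objective: drive y toward 0.5
optimize!(model)
```

Whether this speeds anything up depends on the model; the point is that the cleared-denominator form avoids a quotient whose Hessian can be poorly scaled.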

We need more details to say anything sensible. How many dimensions? What kind of constraints? Smooth? How are you computing derivatives? How expensive is the objective function to evaluate?


I understand the information is vague. My question is whether people have a common prior on which solvers in Julia generally work better with these kinds of nonlinear constraints.

It’s hard to offer advice without a reproducible example. Too often, the answer is “it depends.”