Changing Ipopt options

I have a nonlinear program that Ipopt solves to optimality. After making some changes to the problem, the solver can no longer find the optimal solution and iterates until it hits the maximum number of iterations.

Is there any option in Ipopt that I can change or tune to solve the problem?

There are a whole bunch of Ipopt options that you could change (Ipopt: Ipopt Options) from JuMP (Models · JuMP). It is difficult to say which options would be useful without knowing the problem.

Some ideas:

  • You can always start by increasing the number of iterations - maybe Ipopt just needs more iterations to find a solution.
  • You can also look at detailed output from Ipopt by changing the print_level option - maybe that will give some guidance. For instance, look at what is happening to your objective - does it keep changing, or does it get stuck somewhere?
  • You can also try giving Ipopt a different initial guess (Variables · JuMP) - in nonlinear optimisation a good starting point can help a lot. (A sketch of setting all three from JuMP follows this list.)
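
For reference, a minimal sketch of setting those three things from JuMP. The attribute names max_iter and print_level are standard Ipopt options; the variable is just a placeholder, and on older JuMP versions set_optimizer_attribute replaces set_attribute:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# Allow more iterations (Ipopt's default max_iter is 3000).
set_attribute(model, "max_iter", 10_000)
# Increase Ipopt's verbosity (default print_level is 5; the range is 0-12).
set_attribute(model, "print_level", 7)
# Provide an initial guess through the variable's start value.
@variable(model, x, start = 1.0)
```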

At the same time - what do the changes do? Do you add more constraints? Are you sure that your problem has a solution after the changes are introduced?

Thanks for your attention.
The objective seems to be “oscillating” …

The changes turn a linear constraint into a nonlinear one, and I also add a regularization term to the objective function to perform feature selection.

Does Ipopt have different methods or something like that among the options?

Not really. Ipopt assumes problems are smooth and twice-differentiable. If your problem violates those assumptions, things still generally work, but sometimes it won't converge (try minimizing abs(x) starting at x = 1, for example).
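
A minimal sketch of that failure mode, assuming a recent JuMP where abs is accepted directly in @objective (older versions need @NLobjective):

```julia
using JuMP, Ipopt

# Toy nonsmooth problem: |x| is not differentiable at the minimizer x = 0,
# so Ipopt may oscillate around it instead of converging cleanly.
model = Model(Ipopt.Optimizer)
@variable(model, x, start = 1.0)
@objective(model, Min, abs(x))
optimize!(model)
```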

Coincidentally, your example is telling me what the problem is!
The term that I add to my objective function contains abs …

Do you think changing the solver can help me?

I tried to use MadNLP:

Do we have gradient-based solvers for NLP in Julia?

You can reformulate your problem into a smooth problem.
Replace \min_{x, y} f(x) + |y| with \min_{x, y, a} f(x) + a subject to a \ge y, a \ge -y.
Keep Ipopt :wink:
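
A sketch of that reformulation in JuMP, with a placeholder objective f(x) = (x - 2)^2 standing in for the actual problem:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)
@variable(model, a)            # epigraph variable standing in for |y|
@constraint(model, a >= y)
@constraint(model, a >= -y)
# Smooth objective: a replaces the nonsmooth |y| term.
# f(x) = (x - 2)^2 is a placeholder, not the poster's actual f.
@objective(model, Min, (x - 2)^2 + a)
optimize!(model)
```

At the optimum, a is pushed down onto max(y, -y) = |y|, so the smooth problem has the same solution as the original.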

Thank you @cvanaret for your suggestion
I'm not sure if I can implement that in my NLP, as the term I've mentioned is the L1 norm of a vector.

Actually, I'm trying to perform feature selection with the "lasso" method, and the abs comes from there.

Sure, that works too: use my suggestion componentwise on \min_{x, y} f(x) + \|y\|_1 = \min_{x, y} f(x) + \sum_{i=1}^n |y_i|.
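
For illustration, a JuMP sketch of the componentwise trick for a lasso-style penalty; the loss, the vector length n, and the weight λ are all placeholder assumptions, not from the thread:

```julia
using JuMP, Ipopt

n = 5                       # assumed vector length
λ = 0.1                     # assumed lasso weight (not specified in the thread)
model = Model(Ipopt.Optimizer)
@variable(model, x[1:n])
@variable(model, y[1:n])
@variable(model, a[1:n])    # a[i] is an upper bound on |y[i]|
@constraint(model, [i in 1:n], a[i] >= y[i])
@constraint(model, [i in 1:n], a[i] >= -y[i])
# Placeholder smooth f(x); the L1 penalty becomes the smooth term λ * sum(a).
@objective(model, Min, sum((x[i] - y[i])^2 for i in 1:n) + λ * sum(a))
optimize!(model)
```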

Thanks a lot @cvanaret :smiley: :smiley: :smiley: :pray:
