Hyperparameters to try when getting ITERATION_LIMIT or NUMERICAL_ERROR in nonlinear optimization

Generally, what hyperparameters should one try when getting the two exit codes above? I tried increasing max_iter to 10,000, which noticeably slowed down the optimization, but I’m still getting ITERATION_LIMIT. One thing I’m not 100% sure about is whether Ipopt runs a first-order or a second-order method by default.
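For reference, this is how I raised the limit, using the standard JuMP attribute call (max_iter is a documented Ipopt option):

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "max_iter", 10_000)  # up from Ipopt's default of 3,000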

Note: I’m running the optimization multiple times, and depending on the initialization/objective parameter values I get either ITERATION_LIMIT or NUMERICAL_ERROR.

Hoping there are some generic fixes to try before I provide more details about my specific use case. Code:

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
  
tolerance = 1e-7
# Shift the bounds inward by `tolerance` so iterates stay strictly inside (0, 1)
@variable(model, tolerance <= x[1:4] <= 1 - tolerance)
x0 = rand(.......
set_start_value.(x, x0)
  
@constraint(model, ........)
  
# my_objective(x...) = .........
  
register(model, :mobj, length(x), my_objective; autodiff = true)
@NLobjective(model, Min, mobj(x...))
  
optimize!(model)
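In case it helps to reproduce the behaviour, this is how I distinguish the two outcomes after the optimize!(model) call above (standard JuMP status queries, nothing model-specific):

status = termination_status(model)  # MOI.ITERATION_LIMIT or MOI.NUMERICAL_ERROR in my runs
println(status)
println(raw_status(model))          # the solver's own, more detailed message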

Answering my own question: I solved this by using NLopt (with LD_SLSQP) instead of Ipopt. It seems to work much better out of the box, which is nicer for newcomers. It also allowed me to remove the tolerance hack on the variable bounds.

using JuMP, NLopt

model = Model(NLopt.Optimizer)
set_optimizer_attribute(model, "algorithm", :LD_SLSQP)
  
@variable(model, 0 <= x[1:4] <= 1)
x0 = rand(.......
set_start_value.(x, x0)
  
@constraint(model, ........)
  
# my_objective(x...) = .........
  
register(model, :mobj, length(x), my_objective; autodiff = true)
@NLobjective(model, Min, mobj(x...))
  
JuMP.optimize!(model)
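If SLSQP itself ever needs reining in, NLopt's stopping criteria are exposed through the same attribute mechanism. A sketch (these lines would go before the JuMP.optimize!(model) call above; the attribute names come from the NLopt.jl wrapper, and the values are illustrative, not tuned):

set_optimizer_attribute(model, "maxeval", 10_000)  # cap on objective evaluations
set_optimizer_attribute(model, "ftol_rel", 1e-8)   # stop once the relative change in the objective is this small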

If your model is twice-differentiable, Ipopt will use exact Hessians. Edit: see the correction in the post below.

Since you left out the definitions of your functions, we can’t say why Ipopt is struggling. Good that it worked with SLSQP anyway!


Just one small correction: Ipopt won’t use exact Hessians if you pass a multivariate user-defined function: https://github.com/jump-dev/JuMP.jl/issues/1198.
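To make that concrete: if the objective can be written algebraically inside @NLobjective, JuMP's own AD supplies exact Hessians to Ipopt, whereas a registered multivariate function only provides gradients, in which case Ipopt's quasi-Newton mode is the usual fallback. A minimal sketch with a toy quadratic objective (hessian_approximation is a documented Ipopt option):

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x[1:4] <= 1)

# Algebraic objective: JuMP's AD gives Ipopt exact second derivatives
@NLobjective(model, Min, sum((x[i] - 0.5)^2 for i in 1:4))

# With a registered multivariate function instead, only gradients exist,
# so switch Ipopt to its limited-memory (L-BFGS) Hessian approximation:
# set_optimizer_attribute(model, "hessian_approximation", "limited-memory")

optimize!(model)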
