I have the following code:
using JuMP, Ipopt

model = Model(with_optimizer(Ipopt.Optimizer))
@variable(model, x, start = 500)
@variable(model, y, start = 500)
@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
optimize!(model)
println("x = ", value(x), " y = ", value(y))
and I want to set a maximum CPU time for the Ipopt solver and limit the number of iterations. How can I do this? Where do I set these parameters in Ipopt? Could anyone help me, please?
See the Ipopt parameters here: https://www.coin-or.org/Ipopt/documentation/node40.html
You’re looking for the max_cpu_time and max_iter Ipopt options.
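Something like the following should work (a sketch using the JuMP 0.19-era with_optimizer syntax from the question; max_cpu_time is in seconds, and the 30.0 / 500 limits below are arbitrary example values):

```julia
using JuMP, Ipopt

# Ipopt options can be passed as keyword arguments when constructing the
# optimizer. max_cpu_time caps the solver's CPU time in seconds, and
# max_iter caps the number of iterations (both are documented Ipopt options).
model = Model(with_optimizer(Ipopt.Optimizer, max_cpu_time = 30.0, max_iter = 500))
@variable(model, x, start = 500)
@variable(model, y, start = 500)
@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
optimize!(model)
println("x = ", value(x), " y = ", value(y))
```

If either limit is hit, Ipopt stops and reports the corresponding termination status instead of converging.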
I’ve made a PR updating the documentation: https://github.com/JuliaOpt/Ipopt.jl/pull/173
Thank you so much for the help!
How can I get the solve time after the optimization?
I defined const MOI = JuMP.MathOptInterface with the Ipopt solver and tried:
MOI.get(model, MOI.SolveTime()), which gives the error: ArgumentError: ModelLike of type Ipopt.Optimizer does not support accessing the attribute MathOptInterface.SolveTime()
getsolvetime(model), which gives UndefVarError: getsolvetime not defined
Neither of them works.
Could you help me please?
I believe that there is no way to get that information because IPOPT does not export it. What we did in NLPModelsIpopt.jl is to run IPOPT, capture the output, and then parse the result.
using NLPModels, NLPModelsIpopt
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [500.0; 500.0])
output = ipopt(nlp)
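The returned output is an execution-stats object, so the parsed solve time should then be readable from it. A minimal sketch, assuming the elapsed_time field of NLPModels' GenericExecutionStats:

```julia
# Assuming `output` is the GenericExecutionStats returned by ipopt(nlp),
# the parsed solve time should be available as a field (assumed: elapsed_time).
println("solve time: ", output.elapsed_time, " s")
```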
For what it’s worth, https://github.com/JuliaOpt/Ipopt.jl/pull/170 implements the
SolveTime() attribute in Ipopt. It just puts a timer around the solve call.