EXIT: Restoration Failed! with JuMP (0.21.4) and Ipopt (0.6.5)

I am really puzzled and frustrated: no matter how simple the problem is, like the one below, Ipopt returns
EXIT: Restoration Failed!

using JuMP
using Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)

@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
optimize!(model)

What did I do wrong? Please help!

Your code works fine with JuMP (0.21.5) and Ipopt (0.6.5). Maybe you can update JuMP to version 0.21.5.


Yes, there was an issue with MUMPS. Please update.
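For reference, updating can be done from the standard library package manager; a minimal sketch (assuming JuMP is already installed in the active environment):

```julia
# Upgrade JuMP from the Julia package manager and confirm the version.
using Pkg

Pkg.update("JuMP")   # upgrades JuMP (and compatible dependencies) to the latest release
Pkg.status()         # verify that JuMP is now at v0.21.5 or later
```

The same can be done interactively from the REPL package mode (`]` then `up JuMP`).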



Just want to report back that the issue got resolved, though I am not sure exactly what fixed it.

First, I updated to JuMP (0.21.5) as advised, but still had the same issue.

Then I tried to update and rebuild all packages, but got errors from the CUDA* packages about a missing library.

So I removed everything CUDA*-related, installed the latest NVIDIA CUDA library, reinstalled the CUDA* packages, and rebuilt all packages.

After that, the Ipopt issue seems to have gone away.


Hi, the EXIT: Restoration Failed issue is now gone, but I am getting a strange optimal solution with the following example:

using JuMP
using Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, α, start = 1.0)
@variable(model, β, start = 1.0)

x = [1. 2.; 3. -1.; 5. 3.; 6. 7.]

f(α, β) = maximum(sum([α β] .* x, dims=2))
g(α, β) = minimum(sum([α β] .* x, dims=2))
JuMP.register(model, :f, 2, f, autodiff=true)
JuMP.register(model, :g, 2, g, autodiff=true)
@NLobjective(model, Max, f(α, β) / g(α, β))
optimize!(model)


Number of objective function evaluations = 368
Number of objective gradient evaluations = 12
Number of equality constraint evaluations = 0
Number of inequality constraint evaluations = 0
Number of equality constraint Jacobian evaluations = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations = 0
Total CPU secs in IPOPT (w/o function evaluations) = 0.171
Total CPU secs in NLP function evaluations = 0.004

EXIT: Optimal Solution Found.

julia> value(α)
julia> value(β)
julia> objective_value(model)

However, this is definitely not a good solution: even at the starting values α = β = 1, the objective is already 6.5. What is wrong with my formulation?
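For what it's worth, the 6.5 figure can be checked by evaluating the objective by hand at the starting point (plain Julia, no solver involved):

```julia
# Evaluate f, g, and the ratio at the starting point α = β = 1.
x = [1. 2.; 3. -1.; 5. 3.; 6. 7.]

f(α, β) = maximum(sum([α β] .* x, dims=2))  # largest row sum of α*x₁ + β*x₂
g(α, β) = minimum(sum([α β] .* x, dims=2))  # smallest row sum

println(f(1.0, 1.0))                # row sums are [3, 2, 8, 13], so f = 13.0
println(g(1.0, 1.0))                # g = 2.0
println(f(1.0, 1.0) / g(1.0, 1.0))  # 13 / 2 = 6.5
```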

Ipopt is a local solver for problems that are smooth and twice differentiable (it guarantees a global optimum only when the problem is convex). Your problem doesn't meet these requirements: maximum and minimum are not differentiable everywhere, and the objective has a divide-by-zero issue when g(α, β) = 0.

You should consider other ways of formulating this problem, e.g., as a MIP maximizing f(α, β) - g(α, β) subject to g(α, β) >= 0.000001, using the reformulation of max from Section 9 (Mixed integer optimization) of the MOSEK Modeling Cookbook 3.3.0.
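A minimal sketch of that big-M MIP reformulation, assuming the HiGHS solver and illustrative bounds on α and β (both are assumptions not in the original advice: f - g scales linearly with (α, β), so some bounds are needed to keep the maximization bounded, and the big-M constant must be valid for those bounds):

```julia
# Big-M MIP sketch: maximize (max row sum) - (min row sum), keeping all row
# sums (hence g) at least 1e-6. HiGHS and the bounds on α, β are assumptions.
using JuMP, HiGHS

x = [1. 2.; 3. -1.; 5. 3.; 6. 7.]
n = size(x, 1)
M = 1_000.0  # big-M constant, valid given the variable bounds below

model = Model(HiGHS.Optimizer)
@variable(model, -10 <= α <= 10)
@variable(model, -10 <= β <= 10)
@variable(model, t)             # will equal f(α, β) = max row sum
@variable(model, s)             # will equal g(α, β) = min row sum
@variable(model, z[1:n], Bin)   # z[i] = 1 selects the maximizing row
@variable(model, w[1:n], Bin)   # w[i] = 1 selects the minimizing row

@expression(model, r[i = 1:n], α * x[i, 1] + β * x[i, 2])  # row sums

@constraint(model, [i = 1:n], t >= r[i])                 # t is an upper bound...
@constraint(model, [i = 1:n], t <= r[i] + M * (1 - z[i]))# ...attained at the selected row
@constraint(model, sum(z) == 1)
@constraint(model, [i = 1:n], s <= r[i])                 # s is a lower bound...
@constraint(model, [i = 1:n], s >= r[i] - M * (1 - w[i]))# ...attained at the selected row
@constraint(model, sum(w) == 1)
@constraint(model, [i = 1:n], r[i] >= 1e-6)              # keeps g(α, β) away from zero

@objective(model, Max, t - s)
optimize!(model)
```

After solving, `value(α)`, `value(β)`, `value(t)`, and `value(s)` recover the solution, with `value(t) / value(s)` giving the corresponding ratio.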


Very nice, really appreciated! Exactly what I have been looking for! :slight_smile: