Hi, the EXIT: Restoration Failed issue is now gone, but I am getting a strange optimal solution with the following example:
using JuMP
using Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, α, start = 1.0)
@variable(model, β, start = 1.0)

x = [1.0 2.0; 3.0 -1.0; 5.0 3.0; 6.0 7.0]

# f and g return the largest and smallest weighted row sums of x
f(α, β) = maximum(sum([α β] .* x, dims = 2))
g(α, β) = minimum(sum([α β] .* x, dims = 2))

JuMP.register(model, :f, 2, f, autodiff = true)
JuMP.register(model, :g, 2, g, autodiff = true)

@NLobjective(model, Max, f(α, β) / g(α, β))
optimize!(model)
Output:
Number of objective function evaluations = 368
Number of objective gradient evaluations = 12
Number of equality constraint evaluations = 0
Number of inequality constraint evaluations = 0
Number of equality constraint Jacobian evaluations = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations = 0
Total CPU secs in IPOPT (w/o function evaluations) = 0.171
Total CPU secs in NLP function evaluations = 0.004
EXIT: Optimal Solution Found.
Result:
julia> value(α)
-6.616112462703816e14
julia> value(β)
2.2053711899565838e14
julia> objective_value(model)
0.08333330892901637
However, this is definitely not a good solution: even at the start values α = β = 1, the objective is already 6.5 (see the quick check below). What is wrong with my formulation?
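For reference, this is the quick sanity check I mean: evaluating the same f and g at the start point in plain Julia, outside the model, with the same x as above:

x = [1.0 2.0; 3.0 -1.0; 5.0 3.0; 6.0 7.0]
f(α, β) = maximum(sum([α β] .* x, dims = 2))  # largest weighted row sum
g(α, β) = minimum(sum([α β] .* x, dims = 2))  # smallest weighted row sum
f(1.0, 1.0) / g(1.0, 1.0)                     # 13.0 / 2.0 = 6.5

So the reported objective of about 0.083 is far below the value at the starting point, even though this is a maximization.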