Random Error: Unsupported feature Hess

Hello,

I’m writing code that uses a user-defined gradient function and solving the model with Ipopt.
The problem is that I need to run the file several times before it finally works.
Most of the time it just gives me this error:

ERROR: LoadError: Unsupported feature Hess

But when it randomly works, it works as expected.

This behavior can be reproduced by using the sample code in the NLP README page:

using JuMP, Ipopt

model = Model(with_optimizer(Ipopt.Optimizer))

my_square(x) = x^2
my_square_prime(x) = 2x
my_square_prime_prime(x) = 2

my_f(x, y) = (x - 1)^2 + (y - 2)^2
function ∇f(g, x, y)
    g[1] = 2 * (x - 1)
    g[2] = 2 * (y - 2)
end

JuMP.register(model, :my_f, 2, my_f, ∇f)
JuMP.register(model, :my_square, 1, my_square, my_square_prime,
              my_square_prime_prime)

@variable(model, x[1:2] >= 0.5)
@NLobjective(model, Min, my_f(x[1], my_square(x[2])))
JuMP.optimize!.(model);
objetivo = JuMP.objective_value.(model)
print(objetivo)
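In case it helps narrow things down, here is the same model with my_f registered using autodiff = true instead of the hand-written gradient. This is just a sketch of that variant — I haven’t confirmed whether it avoids the error:

```julia
using JuMP, Ipopt

model = Model(with_optimizer(Ipopt.Optimizer))

my_f(x, y) = (x - 1)^2 + (y - 2)^2

# Let JuMP differentiate my_f automatically instead of
# supplying the gradient by hand.
JuMP.register(model, :my_f, 2, my_f, autodiff = true)

@variable(model, x[1:2] >= 0.5)
@NLobjective(model, Min, my_f(x[1], x[2]))

# Plain (non-broadcast) calls, since model is a single Model.
JuMP.optimize!(model)
println(JuMP.objective_value(model))
```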

The stack trace is always the same:

ERROR: LoadError: Unsupported feature Hess
Stacktrace:
 [1] error(::String) at .\error.jl:33
 [2] initialize(::JuMP.NLPEvaluator, ::Array{Symbol,1}) at C:\Users\Seman\.julia\packages\JuMP\MuwSj\src\nlp.jl:317
 [3] optimize!(::Ipopt.Optimizer) at C:\Users\Seman\.julia\packages\Ipopt\OLtKb\src\MOI_wrapper.jl:614
 [4] optimize!(::MathOptInterface.Utilities.CachingOptimizer{Ipopt.Optimizer,MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}}) at C:\Users\Seman\.julia\packages\MathOptInterface\o3zZH\src\Utilities\cachingoptimizer.jl:165
 [5] optimize!(::MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{Ipopt.Optimizer,MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}},MathOptInterface.Bridges.AllBridgedConstraints{Float64}}) at C:\Users\Seman\.julia\packages\MathOptInterface\o3zZH\src\Bridges\bridgeoptimizer.jl:73
 [6] optimize!(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{JuMP.JuMPMOIModel{Float64}}}) at C:\Users\Seman\.julia\packages\MathOptInterface\o3zZH\src\Utilities\cachingoptimizer.jl:165
 [7] #optimize!#77(::Bool, ::Bool, ::Function, ::Model, ::Nothing) at C:\Users\Seman\.julia\packages\JuMP\MuwSj\src\optimizer_interface.jl:124
 [8] _broadcast_getindex at C:\Users\Seman\.julia\packages\JuMP\MuwSj\src\optimizer_interface.jl:101 [inlined]
 [9] getindex at .\broadcast.jl:515 [inlined]
 [10] copy at .\broadcast.jl:766 [inlined]
 [11] materialize(::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{0},Nothing,typeof(optimize!),Tuple{Base.RefValue{Model}}}) at .\broadcast.jl:756
 [12] top-level scope at none:0
 [13] include at .\boot.jl:317 [inlined]
 [14] include_relative(::Module, ::String) at .\loading.jl:1044
 [15] include(::Module, ::String) at .\sysimg.jl:29
 [16] include(::String) at .\client.jl:392
 [17] top-level scope at none:0
in expression starting at c:\Users\Seman\Downloads\gonopmac-master(1)\gonopmac-master\exercicio\teste.jl:22

I know we redirected you to post here, but please cross-reference whenever the same issue is posted in multiple locations (https://github.com/JuliaOpt/JuMP.jl/issues/1735). The bug you found will be fixed by https://github.com/JuliaOpt/JuMP.jl/pull/1738.
