# Ifelse statement leads to a length(hess_I) > 0 AssertionError

Hey there,

I’m new to JuMP, and I need to solve an optimization problem that I haven’t been able to get running.

Let’s say I have a circle with a given radius of 5, and a point on that circle is described by its a and b components. The a component has a positive cost of 1, while b has a negative cost of -2 when a given binary variable x is true, and 0 when x is false. I would like to find the a and b components that maximize my revenue, so I would describe my problem as:

```
Max:        -2 * b * x + a
Subject to: c^2 - (a^2 + b^2) = 0
            c - 5 = 0
```

If I leave my x variable fixed to 1, I’m able to run the code:

```julia
using JuMP, Juniper, Ipopt, GLPK
optimizer = Juniper.Optimizer
params = Dict{Symbol, Any}()
params[:nl_solver] = with_optimizer(Ipopt.Optimizer, print_level=0)
params[:mip_solver] = with_optimizer(GLPK.Optimizer)
m = Model(with_optimizer(optimizer, params))
@variable(m, a)
@variable(m, b)
@variable(m, c == 5)
@variable(m, x == 1, Bin)
@NLconstraint(m, c^2 == a^2 + b^2)
@objective(m, Max, a - 2 * b * x)
optimize!(m)
```

I get the values a = 2.2360679774997894, b = -4.47213595499958 and obj = 11.180339887498949.
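Those values match what I’d expect analytically (my own sanity check, not solver output): with x fixed at 1 the problem is max a - 2b over the circle a² + b² = 25, whose maximizer is the radius-5 point in the direction of the objective gradient (1, -2):

```julia
# With x = 1 the problem is: max a - 2b  s.t.  a^2 + b^2 == 25.
# The maximizer is the radius-5 point in the direction of the gradient (1, -2):
r = 5
a, b = r .* (1, -2) ./ sqrt(1^2 + (-2)^2)   # a = sqrt(5), b = -2*sqrt(5)
obj = a - 2 * b                              # 5 * sqrt(5) ≈ 11.1803
```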

But now let’s say I want x to be true when a is smaller than 3:

```julia
using JuMP, Juniper, Ipopt, GLPK
optimizer = Juniper.Optimizer
params = Dict{Symbol, Any}()
params[:nl_solver] = with_optimizer(Ipopt.Optimizer, print_level=0)
params[:mip_solver] = with_optimizer(GLPK.Optimizer)
m = Model(with_optimizer(optimizer, params))
@variable(m, a)
@variable(m, b)
@variable(m, c == 5)
@variable(m, x, Bin)
@NLconstraint(m, c^2 == a^2 + b^2)
@NLconstraint(m, x == ifelse(a >= 3, 0, 1))
@objective(m, Max, a - 2 * b * x)
optimize!(m)
```

Now I get an AssertionError:

```
ERROR: AssertionError: length(hess_I) > 0
Stacktrace:
 [1] JuMP._FunctionStorage(::Array{JuMP._Derivatives.NodeData,1}, ::Array{Float64,1}, ::Int64, ::JuMP._Derivatives.Coloring.IndexedSet, ::Bool, ::Array{JuMP._SubexpressionStorage,1}, ::Array{Int64,1}, ::Array{JuMP._Derivatives.Linearity,1}, ::Array{Set{Tuple{Int64,Int64}},1}, ::Array{Array{Int64,1},1}, ::Dict{MathOptInterface.VariableIndex,Int64}) at /home/daniel/.julia/packages/JuMP/MsUSY/src/nlp.jl:283
 [2] initialize(::NLPEvaluator, ::Array{Symbol,1}) at /home/daniel/.julia/packages/JuMP/MsUSY/src/nlp.jl:407
 [3] optimize!(::Ipopt.Optimizer) at /home/daniel/.julia/packages/Ipopt/ruIXY/src/MOI_wrapper.jl:795
 [4] optimize!(::MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}) at /home/daniel/.julia/packages/MathOptInterface/A2UPd/src/Bridges/bridge_optimizer.jl:225
 [5] optimize!(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}) at /home/daniel/.julia/packages/MathOptInterface/A2UPd/src/Utilities/cachingoptimizer.jl:189
 [6] #optimize!#78(::Bool, ::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(optimize!), ::Model, ::OptimizerFactory) at /home/daniel/.julia/packages/JuMP/MsUSY/src/optimizer_interface.jl:141
 [7] optimize! at /home/daniel/.julia/packages/JuMP/MsUSY/src/optimizer_interface.jl:111 [inlined]
 [8] #optimize_get_status_backend#7(::OptimizerFactory, ::typeof(Juniper.optimize_get_status_backend), ::Model) at /home/daniel/.julia/packages/Juniper/i4qvA/src/util.jl:280
 [9] #optimize_get_status_backend at ./Base.jl:0 [inlined]
 [10] solve_root_model!(::Juniper.JuniperProblem) at /home/daniel/.julia/packages/Juniper/i4qvA/src/model.jl:63
 [11] optimize!(::Juniper.Optimizer) at /home/daniel/.julia/packages/Juniper/i4qvA/src/MOI_wrapper/MOI_wrapper.jl:331
 [12] optimize!(::MathOptInterface.Bridges.LazyBridgeOptimizer{Juniper.Optimizer}) at /home/daniel/.julia/packages/MathOptInterface/A2UPd/src/Bridges/bridge_optimizer.jl:225
 [13] optimize!(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}) at /home/daniel/.julia/packages/MathOptInterface/A2UPd/src/Utilities/cachingoptimizer.jl:189
 [14] #optimize!#78(::Bool, ::Bool, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(optimize!), ::Model, ::Nothing) at /home/daniel/.julia/packages/JuMP/MsUSY/src/optimizer_interface.jl:141
 [15] optimize! at /home/daniel/.julia/packages/JuMP/MsUSY/src/optimizer_interface.jl:111 [inlined] (repeats 2 times)
 [16] top-level scope at none:1
```

Am I missing something?

Many thanks!

EDIT 1: Forgot to say I’m using Julia v1.3.0, GLPK v0.12.0, Juniper v0.5.2, Ipopt v0.6.1, and JuMP v0.20.1.

I can’t solve your problem for the moment, but may be able to put you on the right path at least.

AFAIK, you can’t use `ifelse` since it’s not in JuMP’s list of known derivative rules. However, you should be able to create a user-defined function for this purpose.

Problem is, I can’t get that to work when Juniper is in use (I’ve never played with it before personally).

If I were to drop that dependency, this is what it would look like:

```julia
using JuMP
using Ipopt

# Wrap the non-differentiable ifelse in a user-defined function
less_than_three(a) = ifelse(a >= 3, 0., 1.)

m = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
register(m, :less_than_three, 1, less_than_three, autodiff=true)
@variable(m, a)
@variable(m, b)
@variable(m, c == 5)
@variable(m, x)
@NLconstraint(m, c^2 == a^2 + b^2)
@NLconstraint(m, x == less_than_three(a))
@objective(m, Max, a - 2 * b * x)
optimize!(m)
```

That’s not precisely what you need, of course. Turning Juniper and your other solvers back on, and making `x` binary again (with the 0 and 1 in `less_than_three` changed to `Int`), is something I expected to work, but instead the user-defined function no longer registers.

I’m unsure how to set up Juniper to allow this, but hopefully that gets you started on a solution, or points someone else in the right direction to assist further.

Update: Actually, I found the solution buried in Juniper. You need to register the function in both Juniper AND JuMP for it to work. Also, JuMP expects `Float64` as the return type of a user-defined function, but that will be cast to a `Bin` automatically.

So this should be the solution:

```julia
using JuMP, Juniper, Ipopt, GLPK

less_than_three(a) = ifelse(a >= 3, 0., 1.)

optimizer = Juniper.Optimizer
params = Dict{Symbol, Any}()
params[:nl_solver] = with_optimizer(Ipopt.Optimizer, print_level=0)
params[:mip_solver] = with_optimizer(GLPK.Optimizer)
# Register the function with Juniper...
params[:registered_functions] = [Juniper.register(:less_than_three, 1, less_than_three, autodiff=true)]
m = Model(with_optimizer(optimizer, params))
# ...AND with JuMP
JuMP.register(m, :less_than_three, 1, less_than_three, autodiff=true)
@variable(m, a)
@variable(m, b)
@variable(m, c == 5)
@variable(m, x, Bin)
@NLconstraint(m, c^2 == a^2 + b^2)
@NLconstraint(m, x == less_than_three(a))
@objective(m, Max, a - 2 * b * x)
optimize!(m)
```

`ifelse` is supported. This is a bug, thanks for reporting. I opened https://github.com/JuliaOpt/JuMP.jl/issues/2115.

By the way, it’s generally a much better idea to model discrete nonlinearities like this with integer variables instead of `ifelse`. I’ll leave the details as a small exercise or for someone else to fill in.
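Roughly, the idea is a big-M coupling between `x` and the condition `a < 3`. An untested sketch (M = 10 is an assumed bound here, valid since |a| ≤ c = 5; note the boundary case a = 3 is ambiguous either way):

```julia
using JuMP, Juniper, Ipopt, GLPK

optimizer = Juniper.Optimizer
params = Dict{Symbol, Any}()
params[:nl_solver] = with_optimizer(Ipopt.Optimizer, print_level=0)
params[:mip_solver] = with_optimizer(GLPK.Optimizer)
m = Model(with_optimizer(optimizer, params))
@variable(m, a)
@variable(m, b)
@variable(m, c == 5)
@variable(m, x, Bin)
@NLconstraint(m, c^2 == a^2 + b^2)
# Big-M coupling replacing x == ifelse(a >= 3, 0, 1):
#   x == 0  forces  a >= 3
#   x == 1  forces  a <= 3
M = 10
@constraint(m, a >= 3 - M * x)
@constraint(m, a <= 3 + M * (1 - x))
@objective(m, Max, a - 2 * b * x)
optimize!(m)
```

This keeps the discrete logic in linear constraints that the branch-and-bound layer can reason about, instead of hiding it from the derivative machinery inside `ifelse`.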


Hey, @Libbum! Thank you very much. I think I can keep developing my model from your solution.

Hey, @miles.lubin! Thank you!

I figured that was the case, since I’d seen an ifelse solution suggested elsewhere on this forum.

I’m too new to linear and nonlinear programming to figure out how to handle it. Hopefully a good soul out there is looking for a small exercise.