I would like to solve a bilevel optimization problem in a dumb way.
Consider the following (as an example),
which is similar to the example on the BilevelJuMP documentation page about nonlinear models. (I changed it slightly to match my use case, where the lower-level objective also depends on x.)
This can be solved using BilevelJuMP.jl as explained on that page, with the following code:
using BilevelJuMP, Ipopt
model = BilevelModel(Ipopt.Optimizer, mode = BilevelJuMP.ProductMode(1e-5))
@variable(Upper(model), x >= 2)
@variable(Lower(model), 3 <= y <= 5)
# upper
@NLobjective(Upper(model), Min, x^4 - sin(y))
@NLconstraint(Upper(model), x^3 + y^3 <= 1000)
# lower
@objective(Lower(model), Min, y^2 + x)
optimize!(model)
value.([x,y])
(BilevelJuMP accepts nonlinear objectives and constraints only in the upper-level problem; in the lower-level problem it accepts only linear or quadratic objectives and constraints.)
I would like to try to solve this problem using only JuMP, in the dumbest possible way: writing a function that, for every value of x, solves the lower-level problem and returns the corresponding y, and then plugging this into the upper-level optimization.
Something along the lines of this block of code:
# by hand
using JuMP, Ipopt

function lower(x)
    lowerModel = Model(Ipopt.Optimizer)
    set_silent(lowerModel)
    @variable(lowerModel, y)
    @constraint(lowerModel, y <= 5)
    @constraint(lowerModel, 3 <= y)
    @objective(lowerModel, Min, y^2 + x)
    optimize!(lowerModel)
    return value(y)
end
upperModel = Model(Ipopt.Optimizer)
register(upperModel, :lower, 1, lower; autodiff = true)
@variable(upperModel, x)
@constraint(upperModel, x >= 2)
@NLconstraint(upperModel, x^3 + lower(x)^3 <= 1000)
@NLobjective(upperModel, Min, x^4 - sin(lower(x)))
optimize!(upperModel)
value(x)
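Incidentally, for this toy example the inner solve can be checked by hand: the lower objective y^2 + x is increasing in y on [3, 5], and x only shifts it by a constant, so the argmin is y = 3 for every x. A closed-form stand-in (my own naming, just for sanity-checking — the real use case has no closed form) would be:

```julia
# Closed-form stand-in for the inner solve in this toy example (my naming):
# argmin over 3 <= y <= 5 of y^2 + x is y = 3 for every x, because the
# objective is increasing in y on [3, 5] and x is only a constant shift.
lower_closed(x) = 3.0
```

A plain function like this registers without trouble, which suggests the problem is specifically with differentiating through the inner `optimize!` call.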
However, this doesn't work: I have never been able to register the function `lower`
in the upper model, so I cannot proceed. It returns the error:
julia> register(upperModel, :lower, 1, lower; autodiff = true)
ERROR: Unable to register the function :lower.
Stacktrace:
[1] error(s::String)
@ Base ./error.jl:35
[2] _validate_register_assumptions(f::typeof(lower), name::Symbol, dimension::Int64)
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/operators.jl:327
[3] MathOptInterface.Nonlinear._UnivariateOperator(op::Symbol, f::Function)
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/operators.jl:340
[4] register_operator(registry::MathOptInterface.Nonlinear.OperatorRegistry, op::Symbol, nargs::Int64, f::Function)
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/operators.jl:399
[5] register_operator
@ ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/model.jl:217 [inlined]
[6] register(model::Model, op::Symbol, dimension::Int64, f::Function; autodiff::Bool)
@ JuMP ~/.julia/packages/JuMP/jZvaU/src/nlp.jl:716
[7] top-level scope
@ REPL[135]:1
caused by: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
Closest candidates are:
(::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:200
(::Type{T})(::T) where T<:Number at boot.jl:772
(::Type{T})(::AbstractChar) where T<:Union{AbstractChar, Number} at char.jl:50
...
Stacktrace:
[1] convert(#unused#::Type{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ Base ./number.jl:7
[2] setproperty!(x::AffExpr, f::Symbol, v::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ Base ./Base.jl:39
[3] add_to_expression!(aff::AffExpr, other::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ JuMP ~/.julia/packages/JuMP/jZvaU/src/aff_expr.jl:424
[4] add_to_expression!(quad::QuadExpr, other::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ JuMP ~/.julia/packages/JuMP/jZvaU/src/quad_expr.jl:308
[5] +(lhs::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1}, rhs::QuadExpr)
@ JuMP ~/.julia/packages/JuMP/jZvaU/src/operators.jl:64
[6] +(lhs::QuadExpr, rhs::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ JuMP ~/.julia/packages/JuMP/jZvaU/src/operators.jl:366
[7] add_mul(a::QuadExpr, b::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ MutableArithmetics ~/.julia/packages/MutableArithmetics/h0wjj/src/MutableArithmetics.jl:26
[8] operate(::typeof(MutableArithmetics.add_mul), ::QuadExpr, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ MutableArithmetics ~/.julia/packages/MutableArithmetics/h0wjj/src/interface.jl:206
[9] operate_fallback!!(::MutableArithmetics.IsNotMutable, ::typeof(MutableArithmetics.add_mul), ::QuadExpr, ::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ MutableArithmetics ~/.julia/packages/MutableArithmetics/h0wjj/src/interface.jl:586
[10] operate!!(op::typeof(MutableArithmetics.add_mul), x::QuadExpr, args::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ MutableArithmetics ~/.julia/packages/MutableArithmetics/h0wjj/src/rewrite.jl:89
[11] macro expansion
@ ~/.julia/packages/MutableArithmetics/h0wjj/src/rewrite.jl:322 [inlined]
[12] macro expansion
@ ~/.julia/packages/JuMP/jZvaU/src/macros.jl:1586 [inlined]
[13] lower(x::ForwardDiff.Dual{ForwardDiff.Tag{typeof(lower), Float64}, Float64, 1})
@ Main ./REPL[133]:7
[14] derivative
@ ~/.julia/packages/ForwardDiff/PcZ48/src/derivative.jl:14 [inlined]
[15] _validate_register_assumptions(f::typeof(lower), name::Symbol, dimension::Int64)
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/operators.jl:321
[16] MathOptInterface.Nonlinear._UnivariateOperator(op::Symbol, f::Function)
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/operators.jl:340
[17] register_operator(registry::MathOptInterface.Nonlinear.OperatorRegistry, op::Symbol, nargs::Int64, f::Function)
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/operators.jl:399
[18] register_operator
@ ~/.julia/packages/MathOptInterface/tWT4o/src/Nonlinear/model.jl:217 [inlined]
[19] register(model::Model, op::Symbol, dimension::Int64, f::Function; autodiff::Bool)
@ JuMP ~/.julia/packages/JuMP/jZvaU/src/nlp.jl:716
[20] top-level scope
@ REPL[135]:1
Is there a workaround to make this work?
I know this is idiotic, but in a case where the lower-level problem is nonlinear yet in principle very easy, it may still be possible to solve it even in this dumb way.
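One direction I can imagine (an untested sketch — `dlower` is my own helper name, and I am not sure how Ipopt behaves when no second derivative is available) is to bypass the ForwardDiff check by registering `lower` together with a hand-coded finite-difference derivative:

```julia
# Untested sketch: supply the first derivative by central finite differences,
# so JuMP never tries to push ForwardDiff dual numbers through the inner
# Ipopt solve. `dlower` is my own name, not part of the JuMP API.
dlower(x) = (lower(x + 1e-6) - lower(x - 1e-6)) / 2e-6
register(upperModel, :lower, 1, lower, dlower)
```

But I don't know whether this is the intended way to handle a function that is not ForwardDiff-compatible.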
Thanks for the help.