Hi Everyone,
I have an optimization problem similar to this one, except my objective is now the maximum element in a vector (which I want to minimize). Can JuMP handle this as an objective, and if so, what’s the best way to implement this?
Could you introduce an auxiliary variable, say c, add a set of conditions x_i \leq c for each variable x_i, and set the objective as \min c?
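In JuMP, a minimal sketch of that reformulation (assuming x is a vector of variables already attached to model):
@variable(model, c)                                  # auxiliary upper bound on every element of x
@constraint(model, [i in eachindex(x)], x[i] <= c)   # x_i <= c for all i
@objective(model, Min, c)                            # minimizing c minimizes the maximum element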
We have special support for this with MOI.NormInfinityCone,
so you can do
@variable(model, c)
@constraint(model, [c; x] in MOI.NormInfinityCone(length(x) + 1))
@objective(model, Min, c)
This will be reformulated into a formulation equivalent to what @barucden suggests, unless you use a solver such as Hypatia, which supports this cone natively.
Thanks to both @barucden and @blegat! I also have a vector of variables @variable(model, υ[1:(T+1)]), and I want a constraint that the number of elements > 0 is at most some maximum value. I tried introducing indicator variables @variable(model, v[1:(T+1)], Bin)
and adding the constraints:
@constraint(model, [t=1:(T+1)], v[t] --> {υ[t]==υ_max})
@constraint(model, [t=1:(T+1)], !v[t] --> {υ[t]==0})
@constraint(model, sum(v) ≤ v_max)
My code gives the error Unable to use IndicatorToMILPBridge because element 2 in the function has a non-finite domain: 0.0 + 1.0 MOI.VariableIndex(2003). Can I check that what I'm doing above is correct?
The complete example is below:
using JuMP
using Ipopt
using Juniper
using Plots
## Settings
β = 0.5 # infectivity rate
γ = 0.25 # recovery rate
υ_max = 0.5 # maximum intervention
v_max = 10.0 # maximum duration of an intervention
silent = false
t0 = 0.0 # start time
tf = 100.0 # final time
δt = 0.1 # timestep
T = Int(tf/δt) # number of timesteps
S₀ = 0.99 # initial susceptible population
I₀ = 0.01 # initial infected population
## Set up model
ipopt = optimizer_with_attributes(Ipopt.Optimizer, "print_level"=>0)
optimizer = optimizer_with_attributes(Juniper.Optimizer, "nl_solver"=>ipopt)
model = Model(optimizer)
## Declare variables
@variable(model, S[1:(T+1)])
@variable(model, I[1:(T+1)])
@variable(model, υ[1:(T+1)])
@variable(model, v[1:(T+1)], Bin)
## Initial conditions
@constraint(model, S[1]==S₀)
@constraint(model, I[1]==I₀)
## Constraints on variables
@constraint(model, [t=2:(T+1)], 0 ≤ S[t] ≤ 1)
@constraint(model, [t=2:(T+1)], 0 ≤ I[t] ≤ 1)
## Constraints on control parameters
@constraint(model, [t=1:(T+1)], v[t] --> {υ[t]==υ_max})
@constraint(model, [t=1:(T+1)], !v[t] --> {υ[t]==0})
@constraint(model, δt*sum(v) ≤ v_max)
## Define auxiliary variables for infection and recovery
@NLexpression(model, infection[t=1:T], (1-exp(-(1 - υ[t]) * β * I[t] * δt)) * S[t])
@NLexpression(model, recovery[t=1:T], (1-exp(-γ*δt)) * I[t]);
## Set up timesteps
@NLconstraint(model, [t=1:T], S[t+1] == S[t] - infection[t])
@NLconstraint(model, [t=1:T], I[t+1] == I[t] + infection[t] - recovery[t])
## Minimize maximum of infection by defining an auxiliary variable
@variable(model, I_peak)
@constraint(model, 0.0 ≤ I_peak ≤ 1.0)
@constraint(model, [t=1:(T+1)], I[t] ≤ I_peak)
## Objective function
@objective(model, Min, I_peak)
if silent
set_silent(model)
end
optimize!(model)
The indicator constraints are reformulated using a big-M formulation, so the bridge needs bounds on the variables υ,
e.g.
@variable(model, ... <= υ[1:(T+1)] <= ...)
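For example, assuming the intervention should stay between 0 and υ_max as elsewhere in your model:
@variable(model, 0 <= υ[1:(T+1)] <= υ_max)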
Thanks! How do I specify the bounds on the vector of variables? I tried @variable(model, 0.0 ≤ υ[1:(T+1)] ≤ 1.0) when declaring, but now I get an error ERROR: BoundsError: attempt to access 4005-element Vector{Float64} at index [1:6007]. How does setting bounds when declaring a variable differ from adding a constraint later on?
Weird, it’s working for me
julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> @variable(model, 0.0 ≤ υ[1:(T+1)] ≤ 1.0)
4-element Vector{VariableRef}:
υ[1]
υ[2]
υ[3]
υ[4]
If you write @constraint(model, 0 <= u[1] <= 1)
it is sent to the solver as an affine constraint, not as bounds on the variable. You can use set_lower_bound
and set_upper_bound
to set bounds after the variables have been constructed.
When rewriting indicator constraints with the big-M formulation, we don't derive bounds on the variables from affine constraints; we only use the variable bounds explicitly set by the user.
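A minimal sketch of the difference (the variable names are just for illustration):
using JuMP
model = Model()
@variable(model, 0 <= x <= 1)      # bounds stored on the variable itself; the big-M bridge can use them
@variable(model, y)
@constraint(model, 0 <= y <= 1)    # affine interval constraint; not treated as variable bounds
@variable(model, z)
set_lower_bound(z, 0.0)            # same effect as declaring bounds in @variable
set_upper_bound(z, 1.0)
has_lower_bound(y)                 # false
has_lower_bound(z)                 # true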
Sorry, I wasn’t being clear: declaring the bounds on the variables works, but I now get an error when solving. Here is the code:
using JuMP
using Ipopt
using Juniper
β = 0.5 # infectivity rate
γ = 0.25 # recovery rate
υ_max = 0.5 # maximum intervention
v_max = 10.0 # maximum duration of an intervention
silent = false
t0 = 0.0 # start time
tf = 100.0 # final time
δt = 0.1 # timestep
T = Int(tf/δt) # number of timesteps
S₀ = 0.99 # initial susceptible population
I₀ = 0.01 # initial infected population
## Set up model
ipopt = optimizer_with_attributes(Ipopt.Optimizer, "print_level"=>0)
optimizer = optimizer_with_attributes(Juniper.Optimizer, "nl_solver"=>ipopt)
model = Model(optimizer)
## Declare variables
@variable(model, 0 ≤ S[1:(T+1)] ≤ 1)
@variable(model, 0 ≤ I[1:(T+1)] ≤ 1)
@variable(model, 0 ≤ υ[1:(T+1)] ≤ υ_max)
@variable(model, v[1:(T+1)], Bin)
## Initial conditions
@constraint(model, S[1]==S₀)
@constraint(model, I[1]==I₀)
## Constraints on control parameters
@constraint(model, [t=1:(T+1)], v[t] --> {υ[t] ≤ υ_max})
@constraint(model, [t=1:(T+1)], !v[t] --> {υ[t]==0})
@constraint(model, δt*sum(v) ≤ v_max)
## Define auxiliary variables for infection and recovery
@NLexpression(model, infection[t=1:T], (1-exp(-(1 - υ[t]) * β * I[t] * δt)) * S[t])
@NLexpression(model, recovery[t=1:T], (1-exp(-γ*δt)) * I[t]);
## Set up timesteps
@NLconstraint(model, [t=1:T], S[t+1] == S[t] - infection[t])
@NLconstraint(model, [t=1:T], I[t+1] == I[t] + infection[t] - recovery[t])
## Minimize maximum of infection by defining an auxiliary variable
@variable(model, 0 ≤ I_peak ≤ 1)
@constraint(model, [t=1:(T+1)], I[t] ≤ I_peak)
## Objective function
@objective(model, Min, I_peak)
optimize!(model)
The model only has 4005 variables, but it looks like the solver is trying to access more:
nl_solver : MathOptInterface.OptimizerWithAttributes(Ipopt.Optimizer, Pair{MathOptInterface.AbstractOptimizerAttribute, Any}[MathOptInterface.RawOptimizerAttribute("print_level") => 0])
feasibility_pump : false
log_levels : [:Options, :Table, :Info]
#Variables: 4005
#IntBinVar: 1001
Obj Sense: Min
******************************************************************************
This program contains Ipopt, a library for large-scale nonlinear optimization.
Ipopt is released as open source code under the Eclipse Public License (EPL).
For more information visit https://github.com/coin-or/Ipopt
******************************************************************************
ERROR: BoundsError: attempt to access 4005-element Vector{Float64} at index [1:6007]
Stacktrace:
[1] throw_boundserror(A::Vector{Float64}, I::Tuple{UnitRange{Int64}})
@ Base ./abstractarray.jl:737
[2] checkbounds
@ ./abstractarray.jl:702 [inlined]
[3] _copyto_impl!(dest::Vector{Float64}, doffs::Int64, src::Vector{Float64}, soffs::Int64, n::Int64)
@ Base ./array.jl:374
[4] copyto!
@ ./array.jl:368 [inlined]
[5] copyto!
@ ./array.jl:388 [inlined]
[6] _reverse_mode(d::MathOptInterface.Nonlinear.ReverseAD.NLPEvaluator, x::Vector{Float64})
@ MathOptInterface.Nonlinear.ReverseAD ~/.julia/packages/MathOptInterface/aJZbq/src/Nonlinear/ReverseAD/reverse_mode.jl:57
[7] eval_constraint_jacobian(d::MathOptInterface.Nonlinear.ReverseAD.NLPEvaluator, J::SubArray{…}, x::Vector{…})
@ MathOptInterface.Nonlinear.ReverseAD ~/.julia/packages/MathOptInterface/aJZbq/src/Nonlinear/ReverseAD/mathoptinterface_api.jl:226
[8] eval_constraint_jacobian(evaluator::MathOptInterface.Nonlinear.Evaluator{…}, J::SubArray{…}, x::Vector{…})
@ MathOptInterface.Nonlinear ~/.julia/packages/MathOptInterface/aJZbq/src/Nonlinear/evaluator.jl:165
[9] eval_constraint_jacobian(model::Ipopt.Optimizer, values::Vector{Float64}, x::Vector{Float64})
@ Ipopt ~/.julia/packages/Ipopt/bqp63/src/MOI_wrapper.jl:750
[10] (::Ipopt.var"#eval_jac_g_cb#6"{…})(x::Vector{…}, rows::Vector{…}, cols::Vector{…}, values::Vector{…})
@ Ipopt ~/.julia/packages/Ipopt/bqp63/src/MOI_wrapper.jl:828
[11] _Eval_Jac_G_CB(n::Int32, x_ptr::Ptr{…}, ::Int32, ::Int32, nele_jac::Int32, iRow::Ptr{…}, jCol::Ptr{…}, values_ptr::Ptr{…}, user_data::Ptr{…})
@ Ipopt ~/.julia/packages/Ipopt/bqp63/src/C_wrapper.jl:0
[12] IpoptSolve(prob::IpoptProblem)
@ Ipopt ~/.julia/packages/Ipopt/bqp63/src/C_wrapper.jl:442
[13] optimize!(model::Ipopt.Optimizer)
@ Ipopt ~/.julia/packages/Ipopt/bqp63/src/MOI_wrapper.jl:962
[14] optimize!(b::MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer})
@ MathOptInterface.Bridges ~/.julia/packages/MathOptInterface/aJZbq/src/Bridges/bridge_optimizer.jl:367
[15] solve_root_incumbent_model(jp::Juniper.JuniperProblem)
@ Juniper ~/.julia/packages/Juniper/HBPrQ/src/model.jl:58
[16] optimize!(model::Juniper.Optimizer)
@ Juniper ~/.julia/packages/Juniper/HBPrQ/src/MOI_wrapper/MOI_wrapper.jl:301
[17] optimize!
@ ~/.julia/packages/MathOptInterface/aJZbq/src/Bridges/bridge_optimizer.jl:367 [inlined]
[18] optimize!
@ ~/.julia/packages/MathOptInterface/aJZbq/src/MathOptInterface.jl:122 [inlined]
[19] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{…})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/aJZbq/src/Utilities/cachingoptimizer.jl:321
[20] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::@Kwargs{})
@ JuMP ~/.julia/packages/JuMP/7rBNn/src/optimizer_interface.jl:595
[21] optimize!(model::Model)
@ JuMP ~/.julia/packages/JuMP/7rBNn/src/optimizer_interface.jl:546
I can reproduce, so I’ll take a look: [Nonlinear.ReverseAD] BoundsError from user-code · Issue #2523 · jump-dev/MathOptInterface.jl · GitHub
Here’s how I would write your model though:
using JuMP
using Ipopt
using Juniper
β = 0.5
γ = 0.25
υ_max = 0.5
v_max = 10.0
silent = false
t0 = 0.0
tf = 100.0
δt = 0.1
T = round(Int, tf / δt)
S₀ = 0.99
I₀ = 0.01
ipopt = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)
optimizer = optimizer_with_attributes(Juniper.Optimizer, "nl_solver" => ipopt)
model = Model(optimizer)
@variables(model, begin
0 <= S[1:(T+1)] <= 1
0 <= I[1:(T+1)] <= 1
0 <= υ[1:(T+1)] <= υ_max
v[1:(T+1)], Bin
0 <= I_peak <= 1
end)
fix(S[1], S₀; force = true)  # force = true because S[1] already has bounds
fix(I[1], I₀; force = true)
@expressions(model, begin
infection[t in 1:T], (1 - exp(-(1 - υ[t]) * β * I[t] * δt)) * S[t]
recovery[t in 1:T], (1 - exp(-γ * δt)) * I[t]
end)
@constraints(model, begin
[t in 1:T+1], υ[t] <= υ_max * v[t]
δt * sum(v) <= v_max
[t in 1:T], S[t+1] == S[t] - infection[t]
[t in 1:T], I[t+1] == I[t] + infection[t] - recovery[t]
[t in 1:T+1], I[t] <= I_peak
end)
@objective(model, Min, I_peak)
optimize!(model)
Thanks @odow, it’s good to know the idiomatic way to do these things. Happy to know I’m not making an obvious mistake!
I have identified a bug that happens when you mix the legacy nonlinear interface with a particular set of bridges (like the indicator-to-MILP bridge that is being used here).
Out of curiosity, this is a particular case of a low order-value optimization problem. One interesting property of these problems is that, although they are non-differentiable, the non-differentiability is “benign”: you can define the objective function in the straightforward way and, at each point, use the derivative of the component at which the maximum is attained. Standard derivative-based optimization methods then enjoy convergence properties similar to those for ordinary differentiable problems.
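A toy sketch of that idea (purely illustrative, not part of the thread): minimize f(x) = max_i f_i(x) for a few quadratics f_i by always stepping along the gradient of whichever component is currently the maximum.
# Minimize f(x) = maximum_i f_i(x), where f_i(x) = 0.5 * ||x - c_i||^2.
# At each iterate we differentiate only the currently maximal component,
# which is the "benign" non-smooth rule described above.
function minimize_pointwise_max(centers; step = 0.05, iters = 500)
    x = zeros(length(first(centers)))
    for _ in 1:iters
        vals = [0.5 * sum((x .- c) .^ 2) for c in centers]
        k = argmax(vals)                    # index of the active component
        x = x .- step .* (x .- centers[k])  # gradient step on f_k only
    end
    return x
end

centers = [[-1.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
x_opt = minimize_pointwise_max(centers)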