I’m in a situation where I’d like to solve a lot of small nonlinear programming problems as quickly as possible. I’ve been surprised to find that JuMP model construction is much slower than I expected. I have a very small problem with 4 variables and 3 constraints, but JuMP model construction takes >4.6 seconds:
import Ipopt
import JuMP
import MathOptInterface: MAX_SENSE
@time begin
    model = JuMP.Model(Ipopt.Optimizer)
    x_lo = JuMP.@variable(model, lower_bound=-2.0, upper_bound=0.1, start=0.1)
    x_hi = JuMP.@variable(model, lower_bound=0.1, upper_bound=2.0, start=0.1)
    y_lo = JuMP.@variable(model, lower_bound=-2.0, upper_bound=1.1, start=1.1)
    y_hi = JuMP.@variable(model, lower_bound=1.1, upper_bound=2.0, start=1.1)
    JuMP.add_NL_constraint(model, :(1.0 - ($(x_lo) * $(x_lo) + $(y_lo) * $(y_lo)) <= 0.0))
    JuMP.add_NL_constraint(model, :(0.0 - $(x_lo) <= 0.0))
    JuMP.add_NL_constraint(model, :(0.0 - $(y_lo) <= 0.0))
    JuMP.set_NL_objective(model, MAX_SENSE, :(($(x_hi) - $(x_lo)) * ($(y_hi) - $(y_lo))))
end
Granted, this may look a bit odd because I’m programmatically constructing these problems.
Ultimately, my goal is to be able to construct and solve a problem this small in <10ms. What is taking so long here?
odow
June 17, 2021, 9:47pm
JuMP has a time-to-first-solve issue:
https://jump.dev/JuMP.jl/dev/tutorials/Getting%20started/performance_tips/#The-"time-to-first-solve"-issue
If you have to solve lots of small problems, you only pay the ~4 seconds once, on the first solve. The rest are very quick:
julia> import Ipopt
julia> using JuMP
julia> function main()
           model = JuMP.Model(Ipopt.Optimizer)
           x_lo = JuMP.@variable(model, base_name="x_lo", lower_bound=-2.0, upper_bound=0.1, start=0.1)
           x_hi = JuMP.@variable(model, base_name="x_hi", lower_bound=0.1, upper_bound=2.0, start=0.1)
           y_lo = JuMP.@variable(model, base_name="y_lo", lower_bound=-2.0, upper_bound=1.1, start=1.1)
           y_hi = JuMP.@variable(model, base_name="y_hi", lower_bound=1.1, upper_bound=2.0, start=1.1)
           JuMP.add_NL_constraint(model, :(1.0 - ($(x_lo) * $(x_lo) + $(y_lo) * $(y_lo)) <= 0.0))
           JuMP.add_NL_constraint(model, :(0.0 - $(x_lo) <= 0.0))
           JuMP.add_NL_constraint(model, :(0.0 - $(y_lo) <= 0.0))
           JuMP.set_NL_objective(model, JuMP.MOI.MAX_SENSE, :(($(x_hi) - $(x_lo)) * ($(y_hi) - $(y_lo))))
           return model
       end
main (generic function with 1 method)
julia> @time main();
4.928089 seconds (1.51 M allocations: 86.285 MiB, 0.68% gc time, 99.97% compilation time)
julia> @time main();
0.000409 seconds (1.94 k allocations: 113.305 KiB)
julia> @time main();
0.000316 seconds (1.94 k allocations: 113.305 KiB)
I see… My problems do not all have the same structure (different numbers of constraints, etc). Does the warmup still help then?
odow
June 17, 2021, 9:53pm
4
My problems do not all have the same structure (different numbers of constraints, etc). Does the warmup still help then?
It should be fine. The cost is compiling functions based on their types, not on the numerical structure of the problems. Try it out and report any problems.
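A minimal sketch of that point, assuming the same JuMP/Ipopt versions as above (`build` is a hypothetical helper, not part of either API): the model size depends on `n`, but every call after the first reuses the already-compiled methods.

```julia
import Ipopt
import JuMP

# Build a problem whose number of variables and constraints depends on n.
# The types involved (Model, VariableRef, Expr) are the same for every n,
# so compilation happens only on the first call.
function build(n)
    model = JuMP.Model(Ipopt.Optimizer)
    x = JuMP.@variable(model, [1:n], lower_bound = 0.0, start = 0.5)
    for i in 1:n
        JuMP.add_NL_constraint(model, :($(x[i]) * $(x[i]) <= 1.0))
    end
    return model
end

@time build(3)   # pays the one-time compilation cost
@time build(10)  # different structure, but no recompilation
```

If this holds for your problem generator, timing `build` with several different sizes should show only the first call being slow.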
odow
June 17, 2021, 9:59pm
I’ll also say that if you only have quadratic expressions, you can use @objective and @constraint. No need for the NL stuff.
julia> import Ipopt
julia> using JuMP
julia> function main2()
           model = Model(Ipopt.Optimizer, bridge_constraints = false)
           @variables(model, begin
               -2.0 <= x_lo <= 0.1, (start = 0.1)
               0.1 <= x_hi <= 2.0, (start = 0.1)
               -2.0 <= y_lo <= 1.1, (start = 1.1)
               1.1 <= y_hi <= 2.0, (start = 1.1)
           end)
           @constraints(model, begin
               1 <= x_lo^2 + y_lo^2
               x_lo >= 0
               y_lo >= 0
           end)
           @objective(model, Max, (x_hi - x_lo) * (y_hi - y_lo))
           return model
       end
main2 (generic function with 1 method)
julia> @time main2();
4.910041 seconds (1.24 M allocations: 70.148 MiB, 0.65% gc time, 99.97% compilation time)
julia> @time main2();
0.000204 seconds (1.38 k allocations: 97.477 KiB)
julia> @time main2();
0.000206 seconds (1.38 k allocations: 97.477 KiB)
Thanks! Some of my problems will automatically be constructed with nonlinear constraints, so I don’t know ahead of time whether a given problem will be purely quadratic. But this is still good to know!
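One way to handle that mixed case, sketched here as an assumption rather than a recommendation from the thread (the `is_quad` flags and `exprs` input are hypothetical, produced by your own problem generator): branch per constraint between the fast quadratic path and the NL path.

```julia
import Ipopt
import JuMP

# Hypothetical helper: `exprs` holds one entry per constraint, and `is_quad[i]`
# says whether entry i is an affine/quadratic JuMP expression (usable with
# @constraint) or a Julia Expr for add_NL_constraint.
function build_mixed(exprs, is_quad)
    model = JuMP.Model(Ipopt.Optimizer)
    for (expr, quad) in zip(exprs, is_quad)
        if quad
            # expr is a JuMP expression; interpolation into the macro is allowed
            JuMP.@constraint(model, expr <= 0)
        else
            # expr is a Julia Expr with variables already interpolated
            JuMP.add_NL_constraint(model, :($(expr) <= 0.0))
        end
    end
    return model
end
```

Both branches compile once on first use, so the warmup argument above applies to each path independently.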