New to Julia, so I apologize if this is simple…
I am evaluating new technologies to potentially replace SAS for some decision support tools. One tool in particular runs a non-linear optimization of decent (but not huge) size. For the record, it solves in a few seconds (~5) in SAS, and that time is acceptable to users.
Looking into JuMP, I started from the example in the documentation. While simplistic, it uses an interior-point algorithm, which is what we use in SAS, so it gives me an apples-to-apples comparison.
```julia
using Ipopt
using JuMP

model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

optimize!(model)
println("x = ", value(x), " y = ", value(y))

# adding a (linear) constraint
@constraint(model, x + y == 10)
optimize!(model)
println("x = ", value(x), " y = ", value(y))
```
Running this I get:

```
$ time julia nlopt.jl
This program contains Ipopt, a library for large-scale nonlinear optimization.
Ipopt is released as open source code under the Eclipse Public License (EPL).
For more information visit http://projects.coin-or.org/Ipopt
x = 0.9999999999999899 y = 0.9999999999999792
x = 2.701147124098218 y = 7.2988528759017814
```
Creating the same problem in SAS (code just in case someone wants it):
```sas
proc optmodel;
    var x init 0;
    var y init 0;
    min m = (1-x)**2 + 100 * (y - x**2)**2;
    solve with nlp;
    print x;
    print y;
    con x + y = 10;
    solve with nlp;
    print x;
    print y;
quit;
```
```
$ time sas nlopt.sas
```
That's 45 seconds in Julia vs. 0.39 seconds in SAS. I'd love to save clients the SAS licensing fees, but I'm not sure I can justify that big a slowdown. What can I do to speed this up?
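One thing I noticed while testing: timing the whole `julia nlopt.jl` invocation includes Julia startup plus first-call compilation of JuMP and Ipopt, not just the solve. A sketch of how to check the split (same model as above; the exact numbers will vary by machine and package versions) is to time each `optimize!` call from inside the session with `@time`:

```julia
using Ipopt
using JuMP

model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

# First call: includes JuMP/Ipopt method compilation overhead
@time optimize!(model)

# Second call on the same model: closer to the pure solve time
@time optimize!(model)
```

If the second `@time` is small, the gap with SAS is mostly one-time compilation cost rather than solver performance.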