# JuMP vs SAS

New to Julia, so I apologize if this is simple…

I am evaluating new technologies to potentially replace SAS for some decision support tools. One in particular is a non-linear optimization of decent (but not huge) size. For the record, that runs in a few seconds (~5) and that time is acceptable to users.

Looking into JuMP, I used the example in the documentation. While simplistic, it uses an interior point algorithm, which is what we use in SAS, so this gives me an apples-to-apples comparison.

```julia
using Ipopt
using JuMP

model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)

@NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
optimize!(model)
println("x = ", value(x), " y = ", value(y))

@constraint(model, x + y == 10)
optimize!(model)
println("x = ", value(x), " y = ", value(y))
```

Running this I get:

```text
$ time julia nlopt.jl

This program contains Ipopt, a library for large-scale nonlinear optimization.
Ipopt is released as open source code under the Eclipse Public License (EPL).

x = 0.9999999999999899 y = 0.9999999999999792
x = 2.701147124098218 y = 7.2988528759017814

real    0m45.834s
user    0m0.000s
sys     0m0.000s
```

Creating this problem in SAS (code included just in case someone wants it):

```sas
proc optmodel;

var x init 0;
var y init 0;

min m=(1-x)**2 + 100 * (y - x**2)**2;
solve with nlp;
print x;
print y;

con x + y=10;
solve with nlp;
print x;
print y;

quit;
```

Run time:

```text
$ time sas nlopt.sas

real    0m0.385s
user    0m0.000s
sys     0m0.015s
```

45 seconds in Julia vs 0.39 seconds in SAS. I’d love to save clients the SAS licensing fees, but I’m not sure I can justify that big a slowdown. What can I do to speed this up?


You’re largely measuring compilation time rather than execution time. First, try timing the solve itself using `BenchmarkTools` to see how that compares to SAS.

If it then turns out to be compilation time, your options depend on your use case. For example, if it’s an option to keep a Julia process running, compilation isn’t really an issue (it’s a one-time cost). If you need to cold-start every time, you can look into precompiling via `PackageCompilerX`.
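As a rough sketch, building a custom system image with JuMP and Ipopt baked in might look like the following. This assumes `PackageCompilerX`’s `create_sysimage` function; check its README for the exact, still-evolving API, and note the file names here are only placeholders:

```julia
using PackageCompilerX

# Bake JuMP and Ipopt (and their compiled code) into a custom sysimage.
# `precompile_execution_file` points at a script that exercises the code
# paths you care about, e.g. the nlopt.jl model from the first post.
create_sysimage([:JuMP, :Ipopt];
                sysimage_path = "jump_sysimage.so",
                precompile_execution_file = "nlopt.jl")
```

You would then start Julia with `julia --sysimage jump_sysimage.so nlopt.jl`, which should skip most of the compilation cost on each run.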

EDIT: FWIW, when benchmarking your example with `BenchmarkTools` I get roughly 0.01 seconds, so this should be competitive with SAS.


To add to the previous answer: if your use case is to launch a single script from the command line, then JuMP won’t be competitive with SAS due to the start-up and compilation costs.

If the script is to be run often, we typically recommend that you start a single Julia instance, pay the compilation once, and keep the instance running serving requests.

Ignoring the compilation costs, JuMP should be virtually the same speed as SAS.


Thanks. I’m not sure of the technology stack at this point. This would be a service but how it gets called is open for debate. Getting smart about the precompilation is something we will have to look at.


@Dominic_Pazzula

1. Welcome to Julia Discourse!
2. When people benchmark program speeds, they usually put the program inside a function and do a separate warm-up run, to separate “compile time” from “run time”.

Consider the following:

```julia
using BenchmarkTools
using Ipopt
using JuMP

function f()
    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    @variable(model, x, start = 0.0)
    @variable(model, y, start = 0.0)

    @NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
    optimize!(model)
    println("x = ", value(x), " y = ", value(y))

    @constraint(model, x + y == 10)
    optimize!(model)
    println("x = ", value(x), " y = ", value(y))
end

@time f() # warmup: 15.044558 seconds (44.22 M allocations: 2.180 GiB, 7.24% gc time)
@time f() # 0.012105 seconds (2.12 k allocations: 145.898 KiB)
@benchmark f()

function g()
    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    @variable(model, x, start = 0.0)
    @variable(model, y, start = 0.0)

    @NLobjective(model, Min, (5 - x)^2 + 100 * (y - x^2)^2)
    optimize!(model)
    println("x = ", value(x), " y = ", value(y))

    @constraint(model, x + y == 10)
    optimize!(model)
    println("x = ", value(x), " y = ", value(y))
end

@time g() # 0.077067 seconds (97.75 k allocations: 4.977 MiB)
@time g() # 0.023970 seconds (2.50 k allocations: 162.836 KiB)
@benchmark g()
```

You only have to pay the compilation price the first time in a Julia session. The core developers are well aware of this, and reducing compiler latency is a top priority.
I personally only minded it during my first few weeks with Julia.


@Dominic_Pazzula
Can you try benchmarking the optimization in SAS again (w/ a few runs) to compare times w/ Julia?

SAS:

```text
ELAPSE : 5.60199999809265 seconds over 500 iterations
AVERAGE: 0.01120399999618
```

Julia:

```julia
function model1()
    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    @variable(model, x, start = 0.0)
    @variable(model, y, start = 0.0)

    @NLobjective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)
    optimize!(model)
    #println("x = ", value(x), " y = ", value(y))
end

@benchmark model1()
```

```text
minimum time: 5.224 ms (0.00% GC)
median time:  6.867 ms (0.00% GC)
mean time:    7.500 ms (0.00% GC)
maximum time: 31.579 ms (0.00% GC)
samples:      665
evals/sample: 1
```

So Julia is faster than SAS: 7.5 ms vs 11.2 ms on average. I tried to make the SAS side apples to apples by redefining the objective each iteration without starting/stopping `PROC OPTMODEL`, and by turning off all output.

As a frequent SAS user, let me just say that I am completely not surprised at these results.

This would be helpful for my needs. Can you point to some background information on how to “serve requests”? I’m mainly used to running in the REPL or calling Julia code from C, but this alternative would be better.


It depends heavily on the application/how you intend to deploy/etc.

I guess the main options are:

1. Have Julia periodically poll a directory for new files. When you want to run a job, write a new data file to the directory, wait for the process to poll the directory.
2. Another option, which is probably possible but which I’ve never tried, is connecting from C to Julia via a socket: https://docs.julialang.org/en/v1/manual/networking-and-streams/#A-simple-TCP-example-1
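For the socket option, a minimal sketch of a long-running Julia process serving solve requests over TCP might look like this (using the `Sockets` standard library; the one-line-in, one-line-out protocol and the port number are just illustrative):

```julia
using Sockets

# Long-running Julia process: pay the compilation cost once at startup,
# then serve solve requests over TCP.
server = listen(ip"127.0.0.1", 2000)
while true
    sock = accept(server)           # block until a client connects
    @async begin
        request = readline(sock)    # read one line describing the job
        # ... build and solve the JuMP model from `request` here ...
        println(sock, "done: ", request)
        close(sock)
    end
end
```

A C client would then connect to port 2000, write its request line, and read the response, with the Julia process staying warm between requests.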

Thanks!