# Why is the ODE solver so slow?

using DifferentialEquations
using Plots; gr()
using Dates

T1 = now()

f(u,p,t) = 0.98u
u0 = 1.0
tspan = (0.0,10.0)
prob = ODEProblem(f,u0,tspan)

sol = solve(prob)

T2 = now()

println(T2-T1)


This code takes 9–14 seconds in Julia run via VSCode. I’ve run two-dimensional ODEs in MATLAB which took ~1 s. Please advise.

P.S. With plot generation, it takes around 2 minutes.

It compiles the first time it runs. Solving a second time is much faster:

julia> println(T2-T1)
1526 milliseconds

julia> T1 = now(); sol2 = solve(ODEProblem(f,u0,(0.0, 100.0))); println(now()-T1);
1 millisecond


On a more recent version of Julia, even the first run that compiles was about 1.5 seconds for me, although that’s still slower than your 1s for MATLAB.


I put your code in a file and the differential equation solving into a function:

using DifferentialEquations

f(u,p,t) = 0.98u

function bench()
    u0 = 1.0
    tspan = (0.0,10.0)
    prob = ODEProblem(f,u0,tspan)
    sol = solve(prob)
end
bench()


If I now run:

@time include("solve.jl")


I get the output:

 7.717988 seconds (14.98 M allocations: 1.093 GiB, 4.67% gc time, 16.49% compilation time)


which is indeed pretty long.

If you load the package DifferentialEquations first:

@time using DifferentialEquations
@time include("solve.jl")


I get 6.1s for loading the package and 1.0s for executing the code.

So the problem here is the load time of the package, which is needed only once per Julia session.
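To see this within a single session, you can time the `using` statement itself; this is just an illustrative sketch (timings are machine-dependent, so none are shown):

```julia
using Dates

t1 = now()
using DifferentialEquations      # first load in this session: several seconds
println("first `using`:  ", now() - t1)

t1 = now()
using DifferentialEquations      # no-op: the package is already loaded
println("second `using`: ", now() - t1)
```

The second `using` returns almost immediately, because the load cost is paid only once per session.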

You can make this a bit faster by using the package OrdinaryDiffEq instead:

using OrdinaryDiffEq

f(u,p,t) = 0.98u

function bench()
    u0 = 1.0
    tspan = (0.0,10.0)
    prob = ODEProblem(f,u0,tspan)
    sol = solve(prob, Tsit5())
end
bench()


This takes 5.0s in total, 4.2s of this is the package load time.

MATLAB loads its core packages when it starts, which is different from what Julia does.

My machine:

julia> versioninfo()
Julia Version 1.9.0
Commit 8e630552924 (2023-05-07 11:25 UTC)
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 8 × Intel(R) Core(TM) i7-10510U CPU @ 1.80GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-14.0.6 (ORCJIT, skylake)
Threads: 1 on 8 virtual cores


Try Julia 1.9 and see if it improves things for you.

By the way, solving the differential equation after loading the package and compiling the code is fast:

julia> @time bench()
0.000078 seconds (140 allocations: 8.891 KiB)


I wouldn’t exactly encourage new users to get on Julia master, but…

julia> @time using OrdinaryDiffEq
1.663054 seconds (2.98 M allocations: 184.085 MiB, 5.81% gc time, 0.84% compilation time)

julia> versioninfo()
Julia Version 1.10.0-DEV.1288
Commit d55314c05e (2023-05-12 18:54 UTC)
Platform Info:
  OS: Linux (x86_64-redhat-linux)
  CPU: 8 × 11th Gen Intel(R) Core(TM) i7-1165G7 @ 2.80GHz


Thanks for replying. Well, in my case, every time I ran the code the run time was the same. I was running `julia File.jl`.

When Elrod said the second run is faster than the first, he meant within one Julia session; the code block he posted has the `julia>` prompt in front of its lines, so it was run in the Julia REPL. When you run `julia File.jl` at the command prompt, it opens a session, runs the file, then closes the session. When a session closes, all the JIT-compiled code is thrown away, so the next `julia File.jl` run has to redo the compilation and takes the same time.

Julia is still pretty unusual for using JAOT compilation, so it’s a common pain point for people who are used to easily saving compilation once for future runs. However, this system does give an unusual combination of flexibility and efficiency. Let’s say you make a package with a generic method for 100 users each with their own custom data type. With AOT compilation, you need to compile that method 100 times for each type, and each user needs to download 100x more code than they would personally ever use. With JAOT compilation, each user only compiles the method for their own type. This is part of why Julia is said to be very composable; people can mix custom methods and types from different packages via interfaces, and they mostly only compile for their own mix.
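The per-type compilation described above is easy to observe directly; here is a minimal sketch (the function name `double` is invented for this example):

```julia
# A generic method is compiled once per concrete argument type.
double(x) = 2x

@time double(1.0)   # first call with a Float64: includes compile time
@time double(2.0)   # already compiled for Float64: microseconds
@time double(1)     # first call with an Int: compiles a fresh specialization
```

The first and third calls show compilation overhead; the second shows the already-compiled fast path.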

But back to the downsides: this does skew the Julia workflow toward working within a REPL session for a long time, rather than running scripts and executables from the command line. For users who need to run the same code in different sessions and would benefit from saving some of their own compilation, PackageCompiler.jl is a good tool. Package developers who know that their users will often use a specific mix of methods and types (like numerical code on floating-point numbers) can “precompile” that code, which serves the same purpose as compilation in AOT-compiled languages. In fact, you probably noticed precompilation occur when installing and using some packages for the first time; that work is saved and loaded when you import the packages in later sessions.
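As a hedged sketch of the PackageCompiler.jl route (the API shown is `create_sysimage` from PackageCompiler v2; the sysimage file name is arbitrary, and building takes several minutes):

```julia
# Build a custom system image with OrdinaryDiffEq baked in, so future
# sessions skip most of its load and compile cost.
using PackageCompiler
create_sysimage([:OrdinaryDiffEq]; sysimage_path="ode_sysimage.so")
# Start later sessions with:
#   julia --sysimage ode_sysimage.so
```

Check the PackageCompiler.jl documentation for current options before relying on this.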


This is faster than restarting MATLAB every time, I assume.


Take a look at Nice workflows for using and developing Julia 1.9+

With Julia 1.9 and using an environment, that should be greatly reduced. (It still won’t take 1 s, for the reasons above.)

Does it actually take only one second to run matlab File.m from the command line? Despite using Matlab for decades, I’ve actually never tried running Matlab scripts like that.


For simple scripts, Julia starts about 45 times faster:

ufechner@ufryzen:~/repos/TurbineModels$ time julia -e "println(1+1)"
2

real    0m0,090s
user    0m0,110s
sys     0m0,263s

ufechner@ufryzen:~/repos/TurbineModels$ time matlab -batch "1+1"

ans =

2

real    0m4,053s
user    0m2,386s
sys     0m0,458s


And using the original example and the following Julia script (ode1.jl):

using OrdinaryDiffEq

f(y,p,t) = 0.98y
y0 = 1.0
tspan = (0.0,10.0)
prob = ODEProblem(f, y0, tspan)

sol = solve(prob, Tsit5())
println(sol.u)


Executed with:

time julia -e "include(\"ode1.jl\")"


and the following Matlab script, named ode1.m:

tspan = [0 10];
y0 = 1;
[t,y] = ode45(@(t,y) 0.98*y, tspan, y0);
y


Executed with:

time matlab -batch "run ode1.m"


I get 4.219 s for MATLAB and 2.975 s for Julia, so Julia is still faster by a factor of about 1.4, even if you include the package load time.


Yes, it does!

I meant that even more complicated code in MATLAB takes at most 1 s (to be on the safe side). Just now I ran a much more complicated code, and MATLAB reported a time of 00:00:00, so it’s probably in milliseconds; the same goes for Python.

@soldin It is not clear to me what you mean. If you are just talking about the execution time of solving the ODE, that is only 0.078 ms in Julia:

julia> @time bench()
0.000078 seconds (140 allocations: 8.891 KiB)


Julia is a just-in-time compiled language, so the first execution of a function in a new session includes the compile time, but for large computations this is irrelevant. So please be clear about what you are talking about: the start-up time including the loading of the required packages, or the execution time of your own code or of functions like the ODE solver.

To make my point more clear, the following example:

using OrdinaryDiffEq

function bench()
    f(y,p,t) = 0.98y
    y0 = 1.0
    tspan = (0.0,10.0)
    prob = ODEProblem(f, y0, tspan)

    sol = solve(prob, Tsit5())
    println(sol.u)
end

@time bench()
@time bench()


As output you get:

julia> include("ode2.jl")
...
0.373430 seconds (1.13 M allocations: 75.310 MiB, 4.18% gc time, 99.51% compilation time)
...
0.000139 seconds (319 allocations: 19.859 KiB)


The first timing includes the compilation time; the second does not.


I’m afraid I’m just as confused now. How did you run that code, from within Matlab, or does that time include Matlab startup?

Yes, I see that now. Perhaps on my end (VSCode), it’s compiling every time the code is run.

I ran another code which had 25 coupled equations (5x5 grid) in the Matlab IDE.

The point is that if you want a fair comparison, you either have to run both the Julia and the Matlab code from within their IDE/REPL, or include startup time for both.


I see.