ODE.jl not efficient for Monte Carlo problem?

My code solves a Monte Carlo problem for a large ensemble of particles interacting with a field. I use Newton's equations to evolve the system, so numerically I need an ODE solver. Some pieces of my code are as follows:

using ODE  # provides ode45

function Eq1(t, u)
    (x, y, px, py) = u  # unpack state: positions (x, y) and momenta (px, py)

    r = sqrt( x^2 + y^2 + s1 )

    x_prime = px
    y_prime = py

    px_prime = -2x/r^3
    py_prime = -2y/r^3
    [x_prime; y_prime; px_prime; py_prime]
end


function Init2()
    x = Array(Float64, ne)
    y = similar(x)
    px = similar(x)
    py = similar(x)
    rand1 = rand(ne)
    x .= R0_init .* sin.(2pi .* rand1)
    y .= R0_init .* cos.(2pi .* rand1)
    rand1 = rand(ne)
    px .= p0_init .* sin.(2pi .* rand1)
    py .= p0_init .* cos.(2pi .* rand1)
    println(R0_init)
    println(p0_init)
    println(s1)
    for i in 1:ne
        tout, yout = ode45(Eq1, [x[i], y[i], px[i], py[i]], [0.0; 100.0])#; points =:specified)        
        x[i] = yout[2][1]
        y[i] = yout[2][2]
        px[i] = yout[2][3]
        py[i] = yout[2][4]
        if mod(i, 1000) == 0
            println("i=$(i)")
        end
    end
end

When I set the parameters as

R0_init=0.5
p0_init=0.7869
s1 = 0.5
ne = 5000

the @time Init2() command gives the following result:

i=1000
i=2000
i=3000
i=4000
i=5000

14.935459 seconds (267.48 M allocations: 8.997 GB, 4.63% gc time)

I have solved similar problems with Fortran and MATLAB, and both are much more efficient. I guess I am using the package wrongly, but I can find very few details on the package's GitHub page. Can anyone tell me what the problem is and how I can fix it?

Try DifferentialEquations.jl

On the Julia package listing page, I find

Julia v0.4: 0.3.0 (3 months ago) / Tests fail.
Julia v0.5: 1.3.0 (6 days ago) / Tests fail.
Julia v0.6: 1.3.0 (6 days ago) / Tests fail.

I took the time to convert this to DifferentialEquations.jl

using DifferentialEquations

function Eq1(t,u,du)
    (x, y, px, py) = u

    r = sqrt( x^2 + y^2 + s1 )

    du[1] = px #x_prime
    du[2] = py #y_prime

    du[3] = -2x/r^3 #px_prime
    du[4] = -2y/r^3 #py_prime
    nothing
end

const R0_init=0.5
const p0_init=0.7869
const s1 = 0.5
const ne = 5000

function Init2()
  x = Array(Float64, ne)
  y = similar(x)
  px = similar(x)
  py = similar(x)
  rand1 = rand(ne)
  x .= R0_init .* sin.(2pi .* rand1)
  y .= R0_init .* cos.(2pi .* rand1)
  rand1 = rand(ne)
  px .= p0_init .* sin.(2pi .* rand1)
  py .= p0_init .* cos.(2pi .* rand1)
  println(R0_init)
  println(p0_init)
  println(s1)
  u0 =  [x[1], y[1], px[1], py[1]]
  prob = ODEProblem(Eq1,u0,(0.0, 100.0)) # Dummy allocate u0
  for i in 1:ne
      prob.u0[1] = x[i]; prob.u0[2] = y[i]; prob.u0[3] = px[i]; prob.u0[4] = py[i] # Best to not allocate
      sol = solve(prob, save_timeseries=false,dense=false) # Only need solution at the end
      x[i] = sol[2,1]
      y[i] = sol[2,2]
      px[i] = sol[2,3]
      py[i] = sol[2,4]
      if mod(i, 1000) == 0
          println("i=$(i)")
      end
  end
end

@time Init2()
@time Init2()

The second timing (so without compilation) gives me

0.998843 seconds (1.83 M allocations: 97.715 MB, 1.90% gc time)

This has one patch which helps performance (it's in the tag below). On the current release you'll see closer to 2 seconds. Your code on my computer runs in 14 seconds via ODE.jl (and it hits some minstep errors every once in a while? This is using ODE.ode45 with PR49). So using DifferentialEquations.jl should be a very large speedup.

For reference, how fast was MATLAB and Fortran? Maybe this can be optimized a little bit more.

I just put a few tags in. When these are merged the newest version will be out and tests shouldn’t fail on PkgEvaluator.

The “fail” on the current release is just due to a syntax change in Sundials, so the test fails but nothing in the package actually fails (see the test log for details).

4 Likes

Also, one thing not mentioned here is that Monte Carlo problems are embarrassingly parallel. You may want to exploit this. In StochasticDiffEq.jl (part of DifferentialEquations.jl, for the stochastic differential equation solvers) there are functions for parallelizing the Monte Carlo simulation across a cluster. The code is actually pretty simple and has been tested to work well on 500+ cores on the XSEDE Comet cluster:

https://github.com/JuliaDiffEq/StochasticDiffEq.jl/blob/master/src/stochastic_utils.jl#L27

You can probably easily adapt that idea to your problem (it's just pmap, and then a conversion because pmap doesn't do inference correctly here). Also, this is the third time I've heard of someone doing this kind of Monte Carlo simulation over initial conditions / parameters, so maybe there should be a whole repo for parallelizing these kinds of calls (or a more complex function that handles a bunch of different use cases). File an issue on DifferentialEquations.jl if you're interested in having something like this "out of the box".
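To make that concrete, here is a rough sketch of the pmap approach applied to this problem. The solve_one helper, the worker count, and the rebuilt initial conditions are my own illustrative choices, not part of any package API:

# Hypothetical sketch: parallelize the Monte Carlo loop over initial conditions
# with pmap. (On newer Julia versions you need `using Distributed` first.)
addprocs(4)                          # add worker processes; adjust to your machine

@everywhere using DifferentialEquations
@everywhere const s1 = 0.5           # same field parameter as above

@everywhere function Eq1(t, u, du)   # same equations of motion as above
    x, y, px, py = u
    r = sqrt(x^2 + y^2 + s1)
    du[1] = px; du[2] = py
    du[3] = -2x/r^3; du[4] = -2y/r^3
    nothing
end

# Solve one trajectory from a given initial condition and return the final state.
@everywhere function solve_one(u0)
    prob = ODEProblem(Eq1, u0, (0.0, 100.0))
    sol = solve(prob, save_timeseries=false, dense=false)
    sol[end]
end

# Build the ensemble of initial conditions (same distribution as in Init2)
# and map the solver over it in parallel.
R0_init = 0.5; p0_init = 0.7869; ne = 5000
r1 = rand(ne); r2 = rand(ne)
u0s = [[R0_init*sin(2pi*r1[i]), R0_init*cos(2pi*r1[i]),
        p0_init*sin(2pi*r2[i]), p0_init*cos(2pi*r2[i])] for i in 1:ne]
finals = pmap(solve_one, u0s)        # may need a convert, since pmap's return type is loosely inferred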

2 Likes

I have a more general comment on this problem: what you are solving is a Hamiltonian ODE (with a separable Hamiltonian). It might be a good idea to replace the RK45 with a symplectic integrator with a fixed time step. Most likely, that will lead to more accurate results and be cheaper at the same time.
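To sketch the idea (this is hand-rolled code, not the interface of GeomDAE.jl or DifferentialEquations.jl; the function names and the step size dt are arbitrary choices): the Hamiltonian here is H = (px^2 + py^2)/2 - 2/sqrt(x^2 + y^2 + s1), and a fixed-step Störmer-Verlet (velocity Verlet) integrator for it looks roughly like this:

const s1 = 0.5   # same field parameter as above

# Force from the potential V(x, y) = -2/sqrt(x^2 + y^2 + s1)
force(x, y) = (r = sqrt(x^2 + y^2 + s1); (-2x/r^3, -2y/r^3))

# One particle, fixed step dt: half kick, full drift, half kick per step.
function verlet(x, y, px, py, dt, nsteps)
    fx, fy = force(x, y)
    for step in 1:nsteps
        px += 0.5dt*fx; py += 0.5dt*fy
        x  += dt*px;    y  += dt*py
        fx, fy = force(x, y)
        px += 0.5dt*fx; py += 0.5dt*fy
    end
    x, y, px, py
end

# Example: one trajectory from t = 0 to t = 100 with dt = 1e-3 (arbitrary choice).
xf, yf, pxf, pyf = verlet(0.5, 0.0, 0.0, 0.7869, 1e-3, 100_000)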

In that case you could also have a look at GeomDAE.jl (https://github.com/JuliaGNI/GeometricIntegrators.jl), which might give you another order of magnitude or so in performance. Moreover, you can set all your initial conditions at once, so you save the loop over particles.

Although, as the main author of that package, I would not necessarily advise you to do so right now. The package is under heavy development so that breaking changes occur rather regularly. Then again, the basic interface for defining and solving equations should be rather stable.

3 Likes

If GeomDAE.jl adds the common interface API and, say, implements an algorithm Symplectic, then the only line that needs to change in my code is:

sol = solve(prob, save_timeseries=false,dense=false)

to

sol = solve(prob, Symplectic(),save_timeseries=false,dense=false)

(and add a using statement). I opened an issue to discuss this.

Thanks for the post!
Would you know of any standard literature on formal frameworks for properties and proofs of integrators?
Thanks!

Yes, certainly. From my point of view, these are the most important ones:

  • Ernst Hairer, Christian Lubich and Gerhard Wanner. Geometric Numerical Integration. Springer, 2006.
  • Ernst Hairer, Syvert P. Nørsett and Gerhard Wanner. Solving Ordinary Differential Equations I: Nonstiff Problems. Springer, 1993.
  • Ernst Hairer and Gerhard Wanner. Solving Ordinary Differential Equations II: Stiff and Differential-Algebraic Problems. Springer, 1996.
  • John C. Butcher. Numerical Methods for Ordinary Differential Equations. Wiley, 2016.
4 Likes

This is golden. Thanks a lot!