Hi All,

I'm pretty new to Julia and have been playing around with a few of the packages that are available, specifically Agents.jl and linking it to DifferentialEquations.jl. I've been following the example given here:
Running that example as is, I get decent performance from @btime: for 10,000 agents, @btime reports a simulation time of approximately 765 ms, and I see about 3% memory usage on my desktop, which has 32 GB of RAM. However, I see a lot of warnings from DifferentialEquations.jl about an interruption because "Large MaxIters needed", or something to that effect.

Nevertheless, I started playing around to see what the performance would be like if I made the DifferentialEquations.jl integrator specific to each agent. In the example above, it is quite a simple change: the :stock and :i values of the model properties in the function initialise_diffeq become dictionaries:
model = ABM(
    Fisher;
    properties = Dict(
        :stock => Dict{Int, Float64}(),
        :max_population => max_population,
        :min_threshold => min_threshold,
        :i => Dict{Int, OrdinaryDiffEq.ODEIntegrator}()
    )
)
Then, below the for-loop that adds the agents, I put another for-loop that assigns a stock value and an integrator to each agent ID:
for id in allids(model)
    prob = OrdinaryDiffEq.ODEProblem(fish_stock!, [stock], (0.0, Inf), [max_population, 0.0])
    integrator = OrdinaryDiffEq.init(prob, OrdinaryDiffEq.Tsit5(); advance_to_tstop = true)
    model.stock[id] = stock
    model.i[id] = integrator
end
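(One variant I have been meaning to compare, but have not yet profiled, is initialising each integrator without per-step saving or dense output, since as far as I understand the defaults keep the full solution history inside each integrator. Treat the keyword choice below as my assumption rather than something taken from the example.)

# Untested variant: ask init not to store the solution history for each agent.
integrator = OrdinaryDiffEq.init(
    prob, OrdinaryDiffEq.Tsit5();
    advance_to_tstop = true,
    save_everystep = false,  # don't keep every internal step
    dense = false            # skip the dense interpolant
)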
Then the function model_diffeq_step! is changed to loop over all the agent IDs:
function model_diffeq_step!(model)
    for id in allids(model)
        # We step 364 days with this call.
        OrdinaryDiffEq.step!(model.i[id], 364.0, true)
        # Only allow fishing if stocks are high enough
        model.i[id].p[2] = model.i[id].u[1] > model.min_threshold ? model.agents[id].yearly_catch : 0.0
        # Notify the integrator that conditions may be altered
        OrdinaryDiffEq.u_modified!(model.i[id], true)
        # Then apply our catch modifier
        OrdinaryDiffEq.step!(model.i[id], 1.0, true)
        # Store yearly stock in the model for plotting
        model.stock[id] = model.i[id].u[1]
        # And reset for the next year
        model.i[id].p[2] = 0.0
        OrdinaryDiffEq.u_modified!(model.i[id], true)
    end
end
There are no other changes to the example given above. Running this through @btime with 10,000 agents removes the DifferentialEquations.jl warnings about Large MaxIters, but it takes a lot longer to complete: @btime reports about 1 min 30 s, which is a significant increase on the 765 ms reported previously. The main issue, though, is memory usage: it peaks at about 50% of my RAM (although it fluctuates significantly). Given that an equivalent piece of C/C++ code uses less than 0.1% of my memory, 50% is huge.
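For reference, this is roughly how I am timing both versions; the agent stepping function name and the 20-year horizon below are placeholders for whatever the linked example uses, not something I am quoting verbatim:

using BenchmarkTools

# Placeholder names: substitute the agent step function and number of years from the example.
@btime Agents.step!(m, agent_diffeq_step!, model_diffeq_step!, 20) setup = (m = initialise_diffeq())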
Although this example is incredibly contrived, I'm envisaging a scenario where each agent may require a copy of the same ODE but with different parameters. I thought an ODEIntegrator would be relatively lightweight, but it doesn't appear to be, although I suppose that could be related to the choice of algorithm (I did try RK4 but found the same behaviour).
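To make that use case concrete, here is a sketch of what I mean; the per-agent parameter is made up purely for illustration and is not part of the example:

# Hypothetical: every agent integrates the same fish_stock! ODE, but with its
# own carrying capacity, so the problems differ only in their parameter values.
for id in allids(model)
    p = [max_population * (0.5 + rand()), 0.0]  # made-up per-agent parameter
    prob = OrdinaryDiffEq.ODEProblem(fish_stock!, [stock], (0.0, Inf), p)
    model.i[id] = OrdinaryDiffEq.init(prob, OrdinaryDiffEq.Tsit5(); advance_to_tstop = true)
end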
As I’m new to Julia, I’m struggling to understand the performance here and how to improve it. Chances are I can make changes to the model setup, but without understanding the performance issues, it’s a bit difficult to see how to do that.
Thanks in advance.