I have a module that I’ve developed (it’s a simulation for a scientific experiment). The module basically looks like this:
```julia
module thvaccine

using Distributions
using Parameters
using Random
using DataFrames
using Distributed

import Base: show

include("./parameters.jl")

export P, humans, function1, function2

const P = ModelParameters()
const humans = Array{Human}(undef, gridsize)

## function1(), function2(), and so on

end
```
Basically, my module depends on other modules (the `using` statements), it has an `include` statement that brings in functions and types defined in another file, it has some exports, and it has a constant `humans` array of 10,000 elements of a `Human` type (with `sizeof(thvaccine.Human) == 64`).
My question is about general memory usage. When I launch Julia with `julia --project=. --startup-file=no`, it uses about 85 megabytes of memory. Then, after I import my module with `using thvaccine`, memory has gone up to 232 megabytes.

What is causing the memory to go up so significantly? I have basically one “global” variable, `humans`, and `sizeof(humans)` returns 80000, so it’s only 0.08 megabytes.
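One thing worth noting when double-checking that number: for an array of a mutable struct, `sizeof` only counts the 8-byte references stored in the array, while `Base.summarysize` follows those references and counts the pointed-to objects too. A minimal sketch (the `Human` fields here are made up for illustration):

```julia
# Illustrative stand-in for the real Human type; the actual fields differ.
mutable struct Human
    age::Int
    health::Int
end

humans = Array{Human}(undef, 10_000)
for i in eachindex(humans)
    humans[i] = Human(0, 0)
end

# sizeof counts only the array's storage: 10_000 references * 8 bytes
println(sizeof(humans))            # 80000 on a 64-bit system

# summarysize also follows the references to the Human objects themselves
println(Base.summarysize(humans))
```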
Is my understanding correct that the memory usage I am seeing comes from all the dependent packages, the variables defined in my functions (i.e., stack variables?), the compiled code, and general overhead?
The reason I am concerned about this is that I am trying to run multiple simulations in parallel. If I run `@everywhere using thvaccine`, I can’t saturate my cores because I don’t have enough memory.
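For context, the parallel setup looks roughly like this (the worker count and the `simulate` stub are placeholders standing in for `@everywhere using thvaccine` and the real simulation functions):

```julia
using Distributed

addprocs(2)  # illustrative worker count; in practice one per core

# In the real code this is `@everywhere using thvaccine`, which loads the
# module (and all of its dependencies) separately on every worker process.
@everywhere simulate(seed) = sum(rand(100))  # placeholder for one simulation run

# Each independent simulation is dispatched to a worker.
results = pmap(simulate, 1:10)
```

Since every worker is a separate OS process, each one pays the full per-process cost of loading the module, which is why the memory footprint multiplies with the worker count.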