While developing a package, I noticed that a simulation that used to run without problems now takes a whopping 30+ GB of memory. The script is rather simple: I load some packages and then execute a `let` block. There should be no variables leaked into global scope, since everything is enclosed within the `let` block. Yet after the simulation finishes, Julia keeps holding on to this amount of memory. The usual way to analyze allocations is to track them, but I'm more interested in why they aren't being garbage collected at the end of the simulation (even after `GC.gc()`).
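For reference, this is roughly how I check the memory situation after the run (a minimal sketch; `run_simulation` here is just a placeholder for the `let` block in the actual script below):

```julia
# Sketch of the post-run memory check; run_simulation is a placeholder
# for the simulation body in the full script.
let
    run_simulation()
end

GC.gc(true)   # force a full collection
GC.gc(true)   # second pass to pick up objects freed by finalizers

# Bytes the GC still considers live -- far below what the OS reports for the process
println("gc_live_bytes: ", Base.gc_live_bytes() / 2^30, " GiB")
# Peak resident set size as reported by the OS (this never decreases)
println("maxrss:        ", Sys.maxrss() / 2^30, " GiB")
```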
The packages it depends on do use some kind of caching, but the `Base.summarysize`s of those caches only add up to about 600 MB.
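This is roughly how I arrived at that figure (the cache names below are stand-ins for the actual caches the dependencies keep, not real identifiers):

```julia
# Rough estimate of the dependencies' cache sizes; SomePkg.CACHE and
# OtherPkg.CACHE stand in for the actual package-level caches.
cache_bytes = Base.summarysize(SomePkg.CACHE) + Base.summarysize(OtherPkg.CACHE)
println("caches: ", cache_bytes / 2^20, " MiB")   # adds up to roughly 600 MiB
```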
Here is the script in question (I have simplified the simulation so it is runnable on my laptop, but the memory usage is still far too high). To run it, one needs
I'm primarily interested in the memory usage after the simulation is done, as I would at the very least expect that to be manageable.