I have a simulation written in Julia that I have been developing and running for a few years. It is a rather complex simulation with runtimes measured in days on multicore machines, and it relies mainly on Flux and OptimizationOptimJL, in addition to LinearAlgebra and StaticArrays.
After upgrading to 1.11.x I noticed that the memory use of my simulations more than doubled (when using Distributed, every worker process eventually doubles in size, but the same happens when running a single process). The increase in memory use is gradual from the start to the end of the simulation, with no sudden jumps that I can see. The outputs of the simulations under release and lts are otherwise identical.
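For context, the memory numbers come from periodic bookkeeping roughly like the following (a simplified sketch, not the actual code; the real version logs to a file):

```julia
# Simplified sketch of the per-step memory logging; `step` is just the
# outer-loop counter in my code.
function log_memory(step)
    GC.gc()  # force a collection so live-heap numbers are comparable between runs
    println("step ", step,
            ": max RSS = ", round(Sys.maxrss() / 2^20; digits = 1), " MiB",
            ", GC live = ", round(Base.gc_live_bytes() / 2^20; digits = 1), " MiB")
end
```

Both the resident set size and the live GC heap grow steadily on 1.11.3, while they stay flat on 1.10.7.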
I have tried to dump the sizes of any arrays or other structures I use as the simulations run, but again, there is no difference between release and lts.
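The size dumping is essentially along these lines (again a simplified sketch; `tracked` is a placeholder for the structures I actually monitor):

```julia
# Simplified sketch of the size dumping; `tracked` stands in for the
# actual arrays/structs followed in the real code.
function dump_sizes(step, tracked::NamedTuple)
    for (name, obj) in pairs(tracked)
        println("step ", step, ": ", name, " = ", Base.summarysize(obj), " bytes")
    end
end

# e.g. dump_sizes(step, (; weights, buffers, history))
```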
This leads me to think there is a memory leak somewhere in the release version of Julia. As of now I am running release=1.11.3 and lts=1.10.7.
How should I investigate this further? Are there any flags or hidden options that can make Julia check for missing deallocations? A minimal working example is not really feasible here, as the code base is quite large, but I would be happy to show some of the core loops since they are of a manageable size.
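If it is useful, I could instrument a shortened run with the allocation profiler and GC logging, along these lines (an untested sketch; `run_simulation!` and `short_config` are placeholders for my own driver function and a reduced configuration):

```julia
using Profile

GC.enable_logging(true)  # print GC pause/heap information to stderr (Julia >= 1.8)

# Sample a small fraction of allocations during a shortened run.
Profile.Allocs.@profile sample_rate = 0.01 run_simulation!(short_config)

results = Profile.Allocs.fetch()
println("sampled allocations: ", length(results.allocs))
# The results could then be visualized, e.g. with PProf.jl.
```

I am just not sure whether this kind of allocation profiling is the right tool for a growth that only shows up over hours or days, which is why I am asking what the recommended approach is.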