Hello,
I have a simulation that runs in real time for an indefinite period of time (until it is stopped).
I solve my equation system every 50 ms. One solution needs about 800 calls to the inner residual function. This works fine.
The problem I have is that the time needed for the garbage collection is continuously increasing.
I do an incremental garbage collection once per time step like this:
t_gc_tot = @elapsed GC.gc(false)   # incremental (non-full) collection
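For context, this is roughly how the real-time loop looks with the per-step collection and a log of the GC time; `solve_step!`, `sim`, and `n_steps` are hypothetical stand-ins for the actual solve call and run length:

```julia
const STEP_BUDGET = 0.05                  # 50 ms real-time budget per step

gc_times = Float64[]
for step in 1:n_steps                     # n_steps: placeholder run length
    t_solve  = @elapsed solve_step!(sim)  # placeholder for the actual 50 ms solve
    t_gc_tot = @elapsed GC.gc(false)      # incremental (non-full) collection
    push!(gc_times, t_gc_tot)             # track how the GC time grows over the run
    if t_solve + t_gc_tot > STEP_BUDGET
        @warn "Step overran the 50 ms budget" step t_solve t_gc_tot
    end
end
```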
After 300 s of simulation time, the garbage collection already takes about 12 ms per step, and if you extrapolate, after 1200 s the simulation will fail because most of each 50 ms step is spent on GC.
What can I do to solve this issue?
I have zero allocations in my own simulation code, but the solver (the IDA solver from Sundials with the GMRES linear solver) allocates a lot.
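To see how much of that ends up on the Julia heap per step, one could measure the allocation of a single solver call (a rough sketch; `step_solver!` and `integrator` are placeholders for the actual Sundials.jl call and integrator object):

```julia
# Bytes allocated by one solver step; if this number grows over the run,
# the solver side is accumulating state rather than reusing its buffers.
bytes_this_step = @allocated step_solver!(integrator, 0.05)
```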
It would be important to understand whether the amount of memory stays constant or also keeps increasing. GC time should only increase as a function of heap_size and the size of the object graph. So this sounds like something is allocating more and more memory but never releasing any of it.
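A rough way to check that: force a full collection every so often and record the live heap size. If the number keeps climbing, something is retaining memory; if it stays flat, only the GC cost itself is growing. (`step` is an assumed loop counter, and `Base.gc_live_bytes()` reports the GC's current estimate of live memory.)

```julia
if step % 1000 == 0
    GC.gc()   # full collection, only so the measurement reflects truly live memory
    @info "heap check" step live_bytes = Base.gc_live_bytes()
end
```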