I have written a piece of code consisting of hundreds of lines, with everything defined in functions. When running this code, I basically calculate a temperature profile along a loop over time. After running this code and checking with @timev, I get:
0.556749 seconds (254.11 k allocations: 1.163 GiB, 37.83% gc time)
elapsed time (ns): 556748781
gc time (ns):      210599823
bytes allocated:   1249258184
pool allocs:       149066
non-pool GC allocs: 105042
GC pauses:         55
full collections:  1
I was shocked to see the amount of memory allocated and the number of allocations. What I noticed is that if I run the loop with twice the number of time steps, the allocated memory doubles as well. Hence, the more time steps I run, the more memory is allocated.
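For context, here is a minimal sketch of the kind of pattern that typically causes allocations to scale with the number of time steps (I have not seen the actual code, so `evolve_allocating` and `evolve_inplace!` are hypothetical names, and the simple diffusion-style update on a periodic loop is just an assumed stand-in for the real temperature update):

```julia
# Hypothetical pattern that allocates fresh arrays every time step,
# so total allocation grows linearly with nsteps.
function evolve_allocating(T0, nsteps)
    T = copy(T0)
    for _ in 1:nsteps
        # Broadcasting and circshift each build new arrays every iteration.
        T = T .+ 0.1 .* (circshift(T, 1) .- 2 .* T .+ circshift(T, -1))
    end
    return T
end

# In-place variant: preallocate two buffers once and reuse them,
# so allocation no longer grows with the number of time steps.
function evolve_inplace!(T, tmp, nsteps)
    n = length(T)
    for _ in 1:nsteps
        @inbounds for i in 1:n
            ip = i == n ? 1 : i + 1   # periodic neighbours along the loop
            im = i == 1 ? n : i - 1
            tmp[i] = T[i] + 0.1 * (T[ip] - 2T[i] + T[im])
        end
        T, tmp = tmp, T               # swap buffers instead of copying
    end
    return T
end
```

If the real code looks like the first version, rewriting it like the second (preallocating work arrays outside the time loop and updating in place) should make the allocation count roughly independent of the number of time steps.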
My questions are: is the allocated memory the total amount of memory that my script requires while running? Or is memory overwritten at each time step? And what do “pool allocs”, “non-pool GC allocs”, and “GC pauses” mean? I cannot find this information on the internet.
Thanks for helping!