Is it normal for memory usage to accumulate when running loops (many iterations, including nested loops) inside a function in Julia? I'm using Julia 0.4.5, and my understanding is that Julia has garbage collection that should reclaim this memory.
I wrote a toy example of my code structure to illustrate the problem I encountered when running a large-scale simulation.
```julia
include("fcts.jl")

function sim(seed)
    # dummy = 0
    A = [12,5,6,8,15,9]; B = [4,12,19,5]; C = [10,3,9,17]
    for iA = 1:length(A), iB = 1:length(B), iC = 1:length(C)
        a = A[iA]*10^2; b = B[iB]*10^2; c = C[iC]*10^2
        srand(seed)
        M = randn(a,b)
        # tic(); Method1(); T11 = toq()
        # tic(); Method2(); T12 = toq()
        # tic(); Method3(); T13 = toq()
        N = randn(a,c)  ## measure corresponding time
        K = randn(b,c)  ## measure corresponding time
        ## save results for each iteration in case the simulation stops b/c of an error
        # filePath = "..."
        # fp = open(filePath*string(myid())*".csv", "a")
        # writecsv(fp, hcat(T11,T12,T13,...,seed))
        # close(fp)
    end
end
```
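For context, one common way to cut down on allocation churn in a loop like this is to preallocate one buffer at the maximum sizes and reuse views into it, instead of calling `randn(a,b)` fresh on every iteration. The sketch below is illustrative only: `Mbuf` is a hypothetical name, and it assumes `Method1`/`Method2`/`Method3` could be adapted to accept a view.

```julia
# Minimal sketch (hypothetical restructuring), Julia 0.4 syntax.
function sim_prealloc(seed)
    A = [12,5,6,8,15,9]; B = [4,12,19,5]; C = [10,3,9,17]
    amax, bmax = maximum(A)*10^2, maximum(B)*10^2
    Mbuf = Array(Float64, amax, bmax)          # one large buffer, allocated once
    for iA = 1:length(A), iB = 1:length(B), iC = 1:length(C)
        a = A[iA]*10^2; b = B[iB]*10^2
        srand(seed)
        M = sub(Mbuf, 1:a, 1:b)                # view into the buffer, no fresh matrix
        for i in eachindex(M)                  # fill the view in place
            M[i] = randn()
        end
        # ... pass M to Method1/Method2/Method3 as before ...
        gc()                                   # force a collection each iteration
    end
end
```

The explicit `gc()` call is a blunt workaround sometimes used on 0.4-era Julia when large temporaries accumulate faster than the collector reclaims them.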
My original simulation runs in parallel with pmap(sim, 1:30). But when I monitor memory usage with the "top" command in a terminal, I see memory accumulating gradually, and the job eventually gets killed when it exceeds the limit.
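To localize where the growth comes from, it may help to first run a single seed serially and look at the allocation totals that `@time` reports on Julia 0.4, before involving the workers. The wrapper around pmap below is a hypothetical workaround, not a confirmed fix:

```julia
# Diagnostic sketch: serial run first, to see bytes allocated and GC time.
@time sim(1)

# Hypothetical variant: force a collection on each worker after every task.
pmap(s -> (r = sim(s); gc(); r), 1:30)
```

If the serial run already shows gigabytes allocated per call, the problem is allocation inside the loop rather than anything specific to pmap.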
Could anyone tell me if there is an obvious problem that causes this heavy memory usage?