I implemented a complex neural net using Knet/AutoGrad. Then I realized that the RAM usage of my experiment grows slowly but steadily during training, even though it is a GPU experiment. With simpler models I don’t have this problem (memory usage stays constant after a while). I have located the section that causes the problem, but I still need to debug it to find out what I am doing wrong and fix it. I have tried the following so far (with no luck yet):
calling gc() at relevant points
running valgrind to check for leaks
calling whos() periodically and analyzing memory usage
I just need some ideas or tools for debugging this kind of memory issue in Julia in general. Thanks a lot!
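One general approach is to instrument the training loop itself: force a full collection after each epoch and log the process's peak resident set size, so you can see which part of the loop makes it grow. A minimal sketch (assuming Julia ≥ 1.0, where `gc()` and `whos()` have become `GC.gc()` and `varinfo()`; `step!` here is a hypothetical stand-in for your training step):

```julia
# Hedged sketch: log memory per epoch to localize where growth happens.
function train_debug!(model, data; epochs=10)
    for epoch in 1:epochs
        for batch in data
            step!(model, batch)      # hypothetical per-batch training step
        end
        GC.gc()                      # force a full collection before measuring
        mb = Sys.maxrss() / 2^20     # peak resident set size, in MiB
        println("epoch $epoch: maxrss = $(round(mb, digits=1)) MiB")
    end
end
```

If `maxrss` keeps climbing even right after `GC.gc()`, something is still reachable (e.g. a growing global container or a closure capturing arrays) rather than garbage the collector is merely slow to reclaim. You can also wrap suspect sections in `@allocated` to compare allocation counts between iterations.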
I had a similar-looking problem where memory use seemed to grow linearly with time. I managed to simplify the code to the point where I could reproduce the “memory leak” with a 3-line for loop. This was instrumental in helping me identify the problem.
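For illustration only (this is not the original code), a minimal reproduction often ends up looking like a loop that keeps results reachable, e.g.:

```julia
# Illustrative sketch: a tiny loop whose RSS grows linearly, because every
# iteration pushes a fresh array into a container that is never emptied.
history = []
for i in 1:10_000
    push!(history, randn(1000))   # each array stays reachable forever
end
# Keeping only what you need (e.g. push!(history, sum(randn(1000)))) or
# calling empty!(history) periodically makes the growth disappear.
```

Shrinking the real code until a loop this small still shows the growth is usually the fastest way to see which reference is keeping memory alive.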
Linking my solution on the off-chance that it’s relevant.