Here is a minimal example that reproduces my problem on a fresh JupyterLab kernel: `x = rand(2^29); varinfo()`
shows `x` at 4.000 GiB. But `x = nothing`
does not release this memory. varinfo now reports `x` at 0 bytes, yet the 4 GiB is still showing up in top and my system monitor.
The Jupyter notebook interface also stores a reference to the output value of every cell, so your giant array might still be stored as `Out[n]`, where n is the number of the cell in which you computed `x`.
You can do `empty!(Out)` to clear that dictionary manually (I don't know if this is actually a recommended thing to do, but it seems to work). Or you can restart the Jupyter kernel, but I assume that's not what you actually want.
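A minimal sketch of that cleanup, assuming an IJulia notebook session where `Out` is the per-cell output cache mentioned above (note that even after all references are dropped, the allocator may not return pages to the OS right away, which could explain what top shows):

```julia
x = nothing   # drop your own binding to the array
empty!(Out)   # clear IJulia's per-cell output cache (the Out[n] entries)
GC.gc()       # request a garbage-collection pass; freed memory may still
              # be held by the allocator rather than returned to the OS
```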
So if the last expression in a cell is `nothing`, then there would not be a reference to the full data read in from the CSV?
EDIT: I should probably just check this myself, but unfortunately my computer is currently unusable because it is busy processing some large CSV files in a Jupyter notebook.
Yeah, as far as I know that's correct. The `Out` dictionary just holds the value of the last expression in the cell (the same way all Julia blocks implicitly return the value of their last expression).
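A short sketch of the difference, assuming the CSV.jl/DataFrames.jl workflow mentioned above (the file name `big.csv` is a placeholder):

```julia
using CSV, DataFrames

# If this is the cell's last expression, the whole DataFrame is also
# cached as Out[n], keeping the data alive even after `df = nothing`:
df = CSV.read("big.csv", DataFrame)

# Ending the cell with `nothing` instead means the cell's last expression
# is `nothing`, so no reference to the data ends up in Out:
df = CSV.read("big.csv", DataFrame)
nothing
```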