How to release memory from a Jupyter notebook?

Here is a minimal example that reproduces my problem on a new JupyterLab kernel:
x = rand(2^29); varinfo()
shows the 4.000 GiB size of x. But
x = nothing
does not release this memory. varinfo() now reports x as essentially zero bytes, but the memory still shows up in top and my system monitor.

What am I missing here? Thanks!

You can manually invoke the garbage collector with GC.gc(). It should release the memory.
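A minimal sketch of that workflow, reusing the 2^29 size from the example above (on a machine with less RAM, use a smaller array):

```julia
# Allocate a large array, drop the only reference, then force a collection.
x = rand(2^29)   # ~4 GiB of Float64 values
x = nothing      # the array is now unreachable, but not yet freed
GC.gc()          # run a full garbage collection to return the memory
```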

You’re right! Thanks so much.

The Jupyter notebook interface also stores a reference to the output value of every cell, so your giant array might still be stored as Out[n] where n is the cell number in which you computed x.

You can do empty!(Out) to clear that dictionary manually (I don’t know if this is actually a recommended thing to do, but it seems to work). Or you can restart the Jupyter kernel, but I assume that’s not what you actually want.
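For example, inside an IJulia session (Out is the dictionary IJulia maintains, so this only works in the notebook kernel, not in a plain REPL):

```julia
# Cell 1: displaying the value also saves a reference as Out[1].
x = rand(2^29)

# A later cell: drop your own binding AND the saved output, then collect.
x = nothing
empty!(Out)   # clear IJulia's dictionary of saved cell outputs
GC.gc()
```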

Sorry just to be clear, this is only about the output of a cell? I.e. if I have

df = CSV.read("myhugefile.csv", DataFrame)

data = df[1:10, 1:5]

df = nothing

in a cell then there would not be a reference to the full data read in from csv?

EDIT: I should probably just check this myself, but unfortunately my computer is currently unusable because it is processing some large CSV files in a Jupyter notebook :)

Yeah, as far as I know that’s correct. The Out dictionary just holds the value of the last expression in the cell (the same way all Julia blocks implicitly return the value of their last expression).
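A quick way to convince yourself with plain arrays (range indexing in Julia copies by default, and df[1:10, 1:5] on a DataFrame copies as well):

```julia
# Range indexing copies, so the slice does not keep the parent alive.
big   = rand(10^6)
small = big[1:10]    # independent 10-element copy, not a view
big   = nothing      # the full array is now unreachable
GC.gc()              # ...and can be reclaimed; small is unaffected
length(small)        # → 10
```

If you had used @view big[1:10] instead, the slice would share memory with the parent array and keep the whole thing alive.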

Yes, it is documented: Using IJulia · IJulia