Understanding when GC.gc() does and does not free memory

Consider the two code chunks below. When run in VS Code, the first fails to free memory as reported by htop, while the second frees it successfully. This is not just a reporting issue: if I allocate another large array in the single-cell version, I can crash my computer by exhausting free memory, whereas this does not happen in the two-cell version. Any ideas what is going on?

For a script, I cannot follow the two-cell strategy, so how can I force garbage collection?
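For a script there is no cell boundary to exploit, but you can still be more explicit about collection. The sketch below is an illustration under two assumptions not stated in the thread: `GC.gc(true)` requests a full (non-incremental) collection, and on Linux with glibc you can additionally ask the C allocator to hand freed pages back to the OS via `malloc_trim`, which is what tools like htop actually observe. The `ccall` is glibc-specific, so it is guarded here and should not be used on other platforms.

```julia
# Sketch (assumptions: Linux + glibc for the malloc_trim call).
tseries = zeros(1024, 1024, 5, 2000)
# ... work with tseries ...
tseries = nothing        # drop the only reference so the array is unreachable
GC.gc(true)              # request a full, non-incremental collection

# Optionally ask glibc's allocator to return freed pages to the OS,
# so the drop becomes visible in htop. glibc-only; guarded accordingly.
if Sys.islinux()
    ccall(:malloc_trim, Cint, (Csize_t,), 0)
end
```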

```julia
## does not free memory
tseries = zeros(1024, 1024, 5, 2000)   # ~78 GiB of Float64s
tseries = nothing
GC.gc()
sleep(5)
GC.gc()
tseries = zeros(1024, 1024, 5, 2000)   # crashes computer

## Two cells queued sequentially (no delay): does free memory
tseries = zeros(1024, 1024, 5, 2000)
tseries = nothing
GC.gc()
##
GC.gc()
tseries = zeros(1024, 1024, 5, 2000)   # works fine
```

Did you try the first version (single cell) from the Julia REPL, i.e. outside of VS Code? For me, on Julia 1.5.3, the first GC.gc() call definitely reclaims the unused array, judging by my system's memory usage.
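One way to check reclamation independently of htop (which reports what the process holds from the OS, not what the GC considers live) is `Base.gc_live_bytes()`, available in recent Julia versions. A rough sketch, using a much smaller array so it is safe to run:

```julia
a = zeros(1024, 1024, 64)          # ~512 MiB of Float64s
GC.gc()
before = Base.gc_live_bytes()      # live heap bytes, including the array
a = nothing                        # make the array unreachable
GC.gc()
after = Base.gc_live_bytes()       # should drop by roughly the array's size
println((before - after) / 2^20, " MiB reclaimed")
```

If `after` drops by roughly the array's size while htop shows little change, the GC did free the memory and the difference is the allocator holding on to pages rather than a collection failure.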