I’m working on some data analysis, and I don’t have as much RAM as I’d ideally like. I’m building a variety of datasets by joining several of them together, etc. I’d like to get a list of all objects in the global environment, filter them by type, then calculate the size of each dataset, and output a table, so I can manually decide which datasets to delete partway through my analysis…
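Something like this is roughly what I have in mind (a minimal sketch, assuming the datasets I care about are all DataFrames and that `Base.summarysize` is a reasonable measure of their size):

```julia
using DataFrames

# Rough sketch: list every DataFrame bound in the global environment (Main),
# measure each one with Base.summarysize, and return a table sorted largest-first.
function dataset_sizes(::Type{T} = DataFrame) where {T}
    syms = filter(names(Main; all = true)) do s
        isdefined(Main, s) && getfield(Main, s) isa T
    end
    tbl = DataFrame(name = String.(syms),
                    size_MiB = [Base.summarysize(getfield(Main, s)) / 2^20 for s in syms])
    return sort!(tbl, :size_MiB, rev = true)
end

dataset_sizes()          # or e.g. dataset_sizes(Matrix) to look at another type
```

(I know `varinfo()` from InteractiveUtils prints a similar table for everything in Main, but as far as I can tell it doesn’t let me filter by type or give me the result as a table I can work with.)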
eh, don’t re-invent GC… and you likely can’t clean up variables this way anyway.
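about the most you can do for a global is drop the reference and let the GC reclaim it, something like (sketch, `big_df` is just a placeholder name):

```julia
# You can't delete a global binding in Julia; the usual workaround is to
# rebind it so nothing references the data anymore, then let GC reclaim it.
big_df = nothing   # big_df stands in for whatever dataset you're done with
GC.gc()            # optionally force a collection right away
```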
if you need to do this kind of gymnastics you probably just want to be doing more things lazily / on the fly instead of dumping every dataset into an in-memory representation (a DataFrame)
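e.g. with CSV.jl you can stream rows via `CSV.Rows` instead of materializing the whole file; rough sketch below (the path and the `:price` column are made up):

```julia
using CSV

# Stream the file row by row with CSV.Rows instead of loading it as a DataFrame;
# only the running totals live in memory. File name and :price column are hypothetical.
function mean_price(path)
    total, n = 0.0, 0
    for row in CSV.Rows(path; types = Dict(:price => Float64))
        total += row.price
        n += 1
    end
    return total / n
end

mean_price("big_file.csv")
```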
It’s just hundreds of lines of reading in CSV files as datasets, merging them, plotting things, running a few linear regressions, plotting other things… etc. etc.
so is this actually a problem then? are you running into OOM? there’s no reason to prematurely free up memory if your system memory usage (as seen by the OS) is still low
Yeah, systemd-oomd is killing codium and all its sub-processes (i.e. julia). At the moment Julia is using 54% of my available RAM, and swap is up to about 4 GB out of 9.