In R there is a setting, utils::memory.limit(20000), that lets R use more memory than the physical RAM if needed.
The swap file on the SSD is much slower, but sometimes it is the only option if you want to perform a complex calculation, for example fitting a regression model on a large dataset with many variables.
For example, right now I need to use the MixedModels package to fit a large dataset, and I can only work with around 100,000 rows even though I have 2 million. The same goes for complex Bayesian or survival analysis packages.
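For reference, this is the kind of call I mean (a minimal sketch with MixedModels.jl; the formula and column names are made up):

```julia
using DataFrames, MixedModels

# Hypothetical data: at 2 million rows this kind of table exhausts my RAM
n = 100_000
df = DataFrame(y    = randn(n),
               x    = randn(n),
               subj = rand(string.(1:1000), n))

# Fit a linear mixed model with a random intercept per subject
fm = fit(MixedModel, @formula(y ~ 1 + x + (1 | subj)), df)
```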
Is there any option like that in Julia on Windows?
Yes, I asked about that some weeks ago, but for now it is not able to work with packages such as MixedModels. They told me that only linear operations have been implemented.
Windows should (as all modern operating systems) handle this situation automatically, i.e. partially or fully swapping program memory out of RAM. By default, R uses only RAM (afaik), hence the option. However, Windows is notorious for its poor performance once the limits of RAM are exceeded.
The cheapest option over the long term is to add more RAM. Other options are (if possible) using a server with lots of RAM, or tweaking data types to reduce memory use to the extreme, i.e. lower-precision floats (e.g. Float16), unsigned integers, etc.
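To illustrate the last point, here is a rough sketch of how much the element type alone changes memory use (sizes are approximate):

```julia
# Ten million values at decreasing precision
x64 = rand(10_000_000)   # Vector{Float64}, ~80 MB
x32 = Float32.(x64)      # Vector{Float32}, ~40 MB
x16 = Float16.(x64)      # Vector{Float16}, ~20 MB, but very low precision

Base.summarysize(x64)    # ≈ 80_000_000 bytes
Base.summarysize(x32)    # ≈ 40_000_000 bytes
Base.summarysize(x16)    # ≈ 20_000_000 bytes
```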
How can I trick/force Julia into using the disk pagefile as if it were RAM?
What is the Julia equivalent of R's utils::memory.limit(100000)?
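The closest thing I have found so far is the Mmap standard library, which backs an array with a file on disk so the OS pages it in and out on demand (a minimal sketch; the file name is made up), but that is not the automatic behavior I am asking about:

```julia
using Mmap

# Create an ~8 GB array backed by a file instead of RAM;
# the OS pages chunks in and out as they are accessed
io = open("scratch.bin", "w+")
A = Mmap.mmap(io, Vector{Float64}, 10^9)

A[1] = 1.0        # pages are materialized lazily on access
Mmap.sync!(A)     # flush dirty pages back to the file
close(io)
```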
I’m trying the code posted by palday linked above, and when I generate datasets that need less than 16 GB of memory it works. But when they need more, it says “Out of memory”, even though I have set a 128 GB pagefile on my SSD.
It seems Julia doesn’t want to use it.