Tell Julia to use more memory than physical RAM, swapping

Hello.

In R there is a setting, utils::memory.limit(20000), that lets R use more memory than the physical RAM if needed.
The swap file on the SSD is much slower, but sometimes it is the only option if you want to perform a complex calculation, for example fitting a regression model on a large dataset with many variables.
For example, right now I need to use the MixedModels package to fit a large dataset, and I can only work with around 100,000 rows, but I have 2 million. The same goes for complex Bayesian or survival-analysis packages.

Is there any option like that on Julia with Windows?

I don’t know about a memory limit, but perhaps you would be interested in OnlineStats, which is designed to analyze datasets larger than memory.
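The idea behind OnlineStats can be sketched without the package itself: keep a constant-size running state and update it one chunk at a time, so peak memory is one chunk no matter how large the full dataset is. Below is Welford's online algorithm for mean and variance; the RunningStats type is my own illustration, not the OnlineStats API.

```julia
# Sketch of the streaming idea: O(1) state updated per observation
# (Welford's online algorithm), instead of materialising all rows.
mutable struct RunningStats
    n::Int
    mean::Float64
    m2::Float64      # sum of squared deviations from the running mean
end
RunningStats() = RunningStats(0, 0.0, 0.0)

function update!(s::RunningStats, x::Real)
    s.n += 1
    delta = x - s.mean
    s.mean += delta / s.n
    s.m2 += delta * (x - s.mean)
    return s
end

variance(s::RunningStats) = s.m2 / (s.n - 1)

# Stream 1 million values in chunks of 10_000 - peak memory is one chunk.
s = RunningStats()
for _ in 1:100
    chunk = randn(10_000)          # stand-in for rows read from disk
    foreach(x -> update!(s, x), chunk)
end
s.mean, variance(s)                # close to (0.0, 1.0) for standard-normal data
```

OnlineStats packages this pattern up for many statistics, but (as noted below) not for the nonlinear fitting that MixedModels does.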


Yes, I asked about that some weeks ago, but for now it is not able to work with packages such as MixedModels. I was told only linear operations have been implemented.

Windows should (as all modern operating systems) handle this situation automatically, i.e. partially or fully swapping program memory out of RAM. By default, R uses only RAM (AFAIK), hence the option. However, Windows is notorious for its poor performance once the limits of RAM are exceeded.

The cheapest option over the long term is to add more RAM :wink: Other options are (if possible) using a server with lots of RAM, or tweaking data types to reduce memory use to the extreme, i.e. lower-precision floats (e.g. Float16), unsigned integers, etc.
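To see how much the element type alone buys, here is a back-of-the-envelope sketch (the 2-million-by-50 shape is just an assumed example matching the dataset size mentioned above):

```julia
# How much the element type matters for a 2-million-row, 50-column
# numeric dataset (shape is illustrative).
rows, cols = 2_000_000, 50

bytes(T) = rows * cols * sizeof(T)   # raw storage for a dense array of T
gib(b)   = b / 2^30                  # bytes -> GiB

println("Float64: ", round(gib(bytes(Float64)); digits=2), " GiB")  # 8 bytes/value
println("Float32: ", round(gib(bytes(Float32)); digits=2), " GiB")  # 4 bytes/value
println("Float16: ", round(gib(bytes(Float16)); digits=2), " GiB")  # 2 bytes/value
println("UInt8:   ", round(gib(bytes(UInt8));   digits=2), " GiB")  # e.g. coded categories
```

Halving the element width halves the footprint, though note that Float16 arithmetic is slow on most CPUs and loses precision, so it is mainly a storage format.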


Any news about this?
Many functions produce an error if they need more memory than the available RAM. For example:
https://github.com/JuliaStats/MixedModels.jl/issues/160#issuecomment-535957211

How can I trick/force Julia into using the disk pagefile as if it were RAM?
What is the equivalent command on Julia to R’s utils::memory.limit(100000)?

Julia uses the normal OS swapping behaviour.
If it runs out of RAM then either:

  1. You have exceeded the size of your swap/virtual memory (this is an OS setting), or
  2. You have attempted to allocate a single object larger than RAM (e.g. a 100 million by 100 million dense Array), or
  3. There is a bug. If so, the discussion belongs in a GitHub issue.

There is no such setting in Julia, because Julia always does this.
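While there is no limit to raise, Julia does let you inspect what the OS reports, which helps tell cases 1 and 2 apart before a big computation. A small sketch using the standard Sys functions (the 4 GB figure is just an assumed example):

```julia
# Julia has no memory.limit equivalent because it never caps itself below
# what the OS grants; you can, however, inspect what the OS reports.
total = Sys.total_memory()   # physical RAM in bytes
free  = Sys.free_memory()    # currently free physical RAM in bytes

println("Total RAM: ", round(total / 2^30; digits=1), " GiB")
println("Free RAM:  ", round(free  / 2^30; digits=1), " GiB")

# Rough pre-flight check before a big allocation (a sketch, not exact -
# the OS may still satisfy a larger request from swap):
needed = 4 * 10^9            # e.g. a hypothetical 4 GB array
if needed > total
    @warn "Allocation exceeds physical RAM; expect heavy swapping or OutOfMemoryError"
end
```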


I’m trying the code posted by palday linked above, and when I generate datasets that need less than 16 GB of memory it works. But when they need more it says “Out of memory”, even though I have set a 128 GB pagefile on my SSD.
It seems Julia doesn’t want to use it.
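One workaround that reliably spills to disk is backing a big array with a memory-mapped file via the Mmap standard library; the OS then pages it in and out regardless of RAM size. This is a sketch (the file name is illustrative), and it only helps for arrays you allocate yourself, not for allocations made internally by MixedModels:

```julia
# Back a large matrix by a file on disk instead of RAM (Mmap stdlib).
# The OS pages the data in and out on demand, so the matrix can be far
# larger than physical memory. The file name is just for this demo.
using Mmap

dims = (100_000, 100)                          # ~80 MB here; scale up as needed
io = open("bigmatrix.bin", "w+")
A = Mmap.mmap(io, Matrix{Float64}, dims)       # file-backed, initially zeros

A[1, 1] = 42.0                                 # reads/writes go through the page cache
first_val = A[1, 1]
Mmap.sync!(A)                                  # flush dirty pages to disk
close(io)
rm("bigmatrix.bin")                            # clean up the demo file
```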

Then it’s probably option 3: a bug.
Open an issue on:
