JuliaDB out-of-memory computations

Good morning,

I have been trying to use JuliaDB as an R replacement for data wrangling. I have been impressed by the load speeds of the binary files saved by JuliaDB, but there is one thing I can't quite understand: if JuliaDB's `load` is intended for larger-than-RAM files, how can I save those files in the first place? To be able to `load` them I first have to save them with JuliaDB, which seems to mean loading them into RAM.
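Roughly speaking, this is the workflow I have in mind; the file names are just placeholders, and the initial `loadtable` step is where I would expect to run out of memory:

```julia
using JuliaDB

# Ingest a CSV into an in-memory table (this step needs the data to fit in RAM).
t = loadtable("big.csv")

# Save it in JuliaDB's binary format so later loads are fast.
save(t, "big_table.jdb")

# Reload the binary copy; this is the fast `load` I would like to use
# for larger-than-RAM data.
t2 = load("big_table.jdb")
```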

Thanks in advance.

Interesting. Perhaps you want to test out disk.frame (GitHub - DiskFrame/disk.frame: Fast Disk-Based Parallelized Data Manipulation Framework for Larger-than-RAM Data) and let me know how it compares to JuliaDB?

Is there a particular package for out-of-memory operations in R that you use?

DataFrames is a good choice if you don't need out-of-memory capabilities.
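For in-memory work the usual pattern is CSV.jl plus DataFrames.jl, something along these lines (assuming a reasonably recent CSV.jl; the file name is only a placeholder):

```julia
using CSV, DataFrames

# Read the whole file into memory as a DataFrame (fine whenever it fits in RAM).
df = CSV.read("data.csv", DataFrame)

# Ordinary in-memory wrangling from here on.
first(df, 5)
```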