JuliaDB out-of-memory computations


#1

Good morning,

I have been trying to use JuliaDB as an R replacement for data wrangling. I have been impressed by the load speed of binary files saved by JuliaDB, but I can't quite understand one thing: if JuliaDB's `load` is intended for files larger than RAM, how can I save them in the first place? To be able to `load` them, I first have to save them with JuliaDB, which seems to mean loading them into RAM.
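If I understand the out-of-core section of the docs correctly, `loadtable` can ingest a directory of CSVs and write binary chunks to disk as it goes (via the `output` and `chunks` keywords), so the whole table never has to sit in RAM at once. A minimal sketch of what I mean, with the directory names, worker count, and chunk count all as placeholders:

```julia
using Distributed
addprocs(4)                # workers that ingest chunks in parallel
@everywhere using JuliaDB

# Parse the CSVs in "csvdir" chunk by chunk, writing each chunk to
# the binary directory "tbl_bin" instead of keeping it all in memory.
t = loadtable("csvdir"; output = "tbl_bin", chunks = 8)

# Later sessions can load the saved binary chunks directly.
t = load("tbl_bin")
```

Is that the intended workflow, or does saving still require holding the full table in RAM at some point?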

Thanks in advance.


#2

Interesting. Perhaps you want to test out disk.frame (https://github.com/xiaodaigh/disk.frame) and let me know how it compares to JuliaDB?


#3

Is there a particular package for out-of-memory operations in R that you use?

DataFrames is a good choice if you don't need out-of-memory capabilities.
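For the in-memory route, the usual pattern is CSV.jl into a DataFrame; a minimal sketch (the file name is just a placeholder):

```julia
using CSV, DataFrames

# Parse the whole file into an in-memory DataFrame -- fine when it fits in RAM.
df = CSV.read("data.csv", DataFrame)
```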