Hi, how are you?
I have a dictionary where all the values are dataframes. I want to be able to save/export this dictionary of dataframes and import it in another file.
What's the best way to do this?
Realistically, save each DataFrame as a .csv in a subfolder, then write a function that reads all the .csv files back into DataFrames and builds a dict from them, something like the sketch below.
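A minimal sketch of that approach (the folder layout, function names, and the assumption that the dict keys are strings are mine, not from the thread):

using CSV, DataFrames

# Write each DataFrame in the dict to <dir>/<key>.csv
function save_dfs(dfs::Dict{String,DataFrame}, dir::AbstractString)
    mkpath(dir)
    for (name, df) in dfs
        CSV.write(joinpath(dir, name * ".csv"), df)
    end
end

# Read every .csv in <dir> back into a Dict keyed by file name (without extension)
function load_dfs(dir::AbstractString)
    files = filter(f -> endswith(f, ".csv"), readdir(dir))
    return Dict(splitext(f)[1] => CSV.read(joinpath(dir, f), DataFrame) for f in files)
end

One caveat with CSV round-trips is that column types (dates, categorical columns, etc.) are re-inferred on read, so they may not come back exactly as they were saved.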
Assuming you only need to access these data from another Julia session and not from another platform, you could just save the whole dictionary in a JLD2 file. Or is use of JLD2 not recommended with DataFrames? If so, why is that?
@Julia1: You can use the JLD2 package (GitHub - JuliaIO/JLD2.jl: HDF5-compatible file format in pure Julia) to save and load files. Install it from the REPL with
]add JLD2
Then you can use it like this:
using JLD2, FileIO
X = Dict(1 => [0,0,0],
         2 => [1,0,0],
         3 => [5,3,0],
         4 => [6,5,1])
@save "example.jld2" X
and you can load the saved variables with the following command:
@load "example.jld2"
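The same pattern works for a dict of DataFrames; here is a sketch with made-up data and file names. Note that DataFrames (and any other package whose types are stored in the file) needs to be loaded before @load so JLD2 can reconstruct the objects:

using JLD2, DataFrames

dfs = Dict("a" => DataFrame(x = 1:3, y = rand(3)),
           "b" => DataFrame(x = 4:6, y = rand(3)))

@save "dataframes.jld2" dfs

# in the other file / session:
using JLD2, DataFrames
@load "dataframes.jld2" dfs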
In my, admittedly not very extensive, experience: JLD2 seems to work reliably when the files are not too large and/or when the files only contain “built-in” Julia objects.
I just had a case where I saved a Dict where each entry was a simple Dict{Symbol, Matrix{Float64}}. I have not found a way to reliably read those files from disk when the outer Dict has a couple of thousand entries; a number of different errors were thrown.
Small files containing user defined objects have so far worked reliably.
Again, others likely have more extensive experience. Perhaps my experience is an outlier. I would love to know about tricks that make loading reliable. It would save me a lot of headaches.
For now I switched to saving everything that I cannot lose with JSON3.jl.
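For what it's worth, this is the kind of round trip I mean (the file name and contents are made up); JSON works best for plain containers of numbers and strings, so DataFrames would need to be converted to and from column vectors first:

using JSON3

# A made-up example: a Dict of plain vectors survives the round trip cleanly.
d = Dict(:a => [1.0, 2.0, 3.0], :b => [4.0, 5.0])

open("backup.json", "w") do io
    write(io, JSON3.write(d))   # serialize to JSON text and write it out
end

# Read the text back and materialize it as the same concrete type.
d2 = JSON3.read(read("backup.json", String), Dict{Symbol,Vector{Float64}})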