Let’s say I have a matrix stored in a file, and I want to add columns to it over time. I want to add them in chunks so that I’m not opening and closing the file all the time, and I also want compression.
HDF5 allows me to do this for basic types (e.g. eltype `Float64`) with an initial `d_create` followed by some `set_dims!` magic (see the HDF5.jl docs and search for `set_dims!`). Of course, I can do exactly the same for JLD files, because they are basically HDF5.
Excerpt from the HDF5.jl doc:

```julia
b = d_create(fid, "b", Int, ((1000,), (-1,)), "chunk", (100,))  # -1 is equivalent to typemax(Hsize)
set_dims!(b, (10000,))
b[1:10000] = collect(1:10000)
```
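To make the column-wise case concrete, here is a minimal sketch of what I have in mind for a plain `Float64` matrix, using the same old-style `d_create`/`set_dims!` API as in the excerpt. The file name, dataset name, and chunk sizes are arbitrary, and the `"compress", 3` property pair is my assumption for requesting deflate compression:

```julia
using HDF5

h5open("data.h5", "w") do fid
    # 100 rows, initially 0 columns, unlimited number of columns (-1);
    # chunked by blocks of 10 whole columns, with (assumed) gzip level 3
    d = d_create(fid, "A", Float64, ((100, 0), (100, -1)),
                 "chunk", (100, 10), "compress", 3)
    ncols = 0
    for step in 1:5
        newcols = rand(100, 10)
        set_dims!(d, (100, ncols + 10))     # grow the dataset by 10 columns
        d[:, ncols+1:ncols+10] = newcols    # write only the new columns
        ncols += 10
    end
end
```

This is exactly the pattern I would like to reproduce for a JLD file with a non-basic eltype.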
However, let’s say the matrix has eltype `Complex128`. Plain HDF5 doesn’t know about complex numbers, and this is exactly where JLD is convenient. But I can’t apply the `d_create`/`set_dims!` strategy here, because those functions are inherited from HDF5.jl and don’t support the datatype `Complex128`.
Is there a way to get this (an incrementally growing dataset) done nicely with JLD?
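The only workaround I can think of so far is to bypass JLD for this one dataset and store the complex matrix as a plain `Float64` array with a leading dimension of 2 (interleaved real/imaginary parts), reinterpreting on read and write. A sketch, assuming the old-style `reinterpret(T, A, dims)` signature and hypothetical file/dataset names:

```julia
using HDF5

A = rand(Complex128, 100, 10)
# View the complex matrix as a 2×100×10 Float64 array (re/im interleaved)
raw = reinterpret(Float64, A, (2, 100, 10))

h5open("cplx.h5", "w") do fid
    fid["A_raw"] = collect(raw)
end

# Read back and reinterpret into the original Complex128 matrix
B = h5open("cplx.h5", "r") do fid
    reinterpret(Complex128, read(fid["A_raw"]), (100, 10))
end
```

That loses JLD’s self-describing types, though, so a native JLD solution would be much nicer.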