HDF5: variable-size dataset

I’m struggling with HDF5.

I’m trying to append data to an .h5 file.
So I want to create a dataset whose maximum size is unlimited (along the first axis, specifically).
For example, each time I receive data with dims (5, 6), I want to append it to the existing array of size (n, 5, 6) so that it becomes (n+1, 5, 6).
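To illustrate the shapes I mean, here is a hypothetical in-memory version using plain Julia arrays (buf and newdata are just made-up names for this illustration):

buf = zeros(0, 5, 6)                                # n = 0 slices stored so far
newdata = rand(5, 6)                                # one incoming sample
buf = cat(buf, reshape(newdata, 1, 5, 6); dims=1)   # buf is now (n+1, 5, 6)

I want to do the same thing, but with the array stored in an .h5 file instead of in memory.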

I’ve read the docs, but I still don’t understand what I have to do to achieve this.

Here is some example code:

function test_hdf5()
    # using HDF5 (done at the beginning of this file)

    path = "data/tmp.h5"
    name = "asdf"
    val = zeros(1, 5, 6)

    h5open(path, "w") do dummy
    end  # generate `data/tmp.h5`
    h5open(path, "r+") do h5file
        dset = d_create(
                        h5file,
                        name,
                        Float64,
                        (size(val), (-1, size(val)[2:end]...)),  # (1)
                        "chuck",  # (2)
                        size(val),  # (3)
                       )
    end
end
  1. Do I have to write both dims and max_dims (see (1))? Can’t I just pass something like (typemax(Int64), 5, 6)?
  2. Do I have to add “chuck” (see (2)) and size(val) (see (3))?
  3. What should I change in the example code to do what I need?

Sorry for my poor understanding of HDF5.

I have NO idea why, but with some modifications it works (the code looks exactly the same to me, though).

Anyway, this might help others create an HDF5 dataset holding multi-dimensional array data (slightly different from the example in the docs, which uses a one-dimensional array); a sketch of what such code can look like follows below.
Perhaps this post is helpful only for beginners like me :frowning:
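For reference, here is a minimal sketch of one way to create such an extendible dataset and then append a new (5, 6) slice to it. It assumes the HDF5.jl version that still provides d_create and set_dims! (newer releases renamed these to create_dataset and HDF5.set_extent_dims), and it reuses the placeholder path and dataset name from the snippet above:

function test_hdf5_extendible()
    # using HDF5 (done at the beginning of this file)

    path = "data/tmp.h5"
    name = "asdf"
    val = zeros(1, 5, 6)  # one (5, 6) sample, with a leading axis of length 1

    # Create the dataset with current dims (1, 5, 6) and max dims (-1, 5, 6);
    # -1 marks the unlimited axis. An extendible dataset must use a chunked
    # layout, so a chunk size is given as well (here, one slice per chunk).
    h5open(path, "w") do h5file
        dset = d_create(h5file, name, Float64,
                        (size(val), (-1, size(val)[2:end]...)),
                        "chunk", size(val))
        dset[:, :, :] = val  # write the first slice
    end

    # Later: append another slice by growing the first axis by one and
    # writing into the newly added region.
    newval = reshape(ones(5, 6), 1, 5, 6)
    h5open(path, "r+") do h5file
        dset = h5file[name]
        n = size(dset, 1)
        set_dims!(dset, (n + 1, size(dset, 2), size(dset, 3)))
        dset[(n + 1):(n + 1), :, :] = newval
    end
end

As far as I understand, the key points are that -1 in max_dims marks the unlimited axis, that HDF5 requires a chunked layout before a dataset can be extended, and that set_dims! then grows the dataset in place so new slices can be written into it.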