function my_cat(X)
    k = length(X)
    k == 0 && return nothing
    s = size(X[1])
    T = eltype(X[1])
    @assert length(s) == 2                # expect matrices
    @assert all(x -> s == size(x), X)     # all the same size
    @assert all(x -> T == eltype(x), X)   # and the same element type
    n = prod(s)
    Y_vec = Vector{T}(undef, k * n)
    offset = 0
    for x in X
        @inbounds Y_vec[offset+1:offset+length(x)] .= vec(x)
        offset += length(x)
    end
    # insert a singleton colour-channel axis, stacking the images along dim 4
    Y = reshape(Y_vec, size(X[1])..., 1, k)
    return Y
end
x = [1 2; 3 4];
y = copy(x);
X = [x, y];
my_cat(X)
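For comparison, the same result can be built from Base alone (a sketch, not part of the original post): reshape each matrix to add a trailing singleton colour axis, then cat along a new fourth dimension.

```julia
x = [1 2; 3 4]
y = copy(x)

# add a singleton colour-channel axis to each matrix, then stack along dim 4
Y = cat(reshape(x, 2, 2, 1), reshape(y, 2, 2, 1); dims=4)

size(Y)             # (2, 2, 1, 2)
Y[:, :, 1, 1] == x  # true
```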
Correct. It’s image data with height, width, and colour channels, which makes 3 dims. But since the data is just grey images, the colour-channel dim disappears.
Assuming xlist = [x, y] are your images, and you really did mean the 4th dimension, not the 3rd.
It’s just doing something like reduce(hcat, …); the point is not having to add a comment like # axes are row, col, colour, number, or a docstring, to the function you write.
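The reduce(hcat, …) equivalence can be sketched like this (the variable names here are my own, not from the thread):

```julia
X = [rand(3, 4) for _ in 1:5]   # five 3×4 "grey images"

# reduce(hcat, ...) on the vectorised matrices produces a 12×5 matrix;
# one reshape then restores the (row, col, colour, number) layout
flat = reduce(hcat, vec.(X))
Y = reshape(flat, 3, 4, 1, length(X))

size(Y) == (3, 4, 1, 5)   # true
Y[:, :, 1, 2] == X[2]     # true
```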
Edit: actually that’s quite slow on all 60,000 images. But this is fast: the lazy option uses RecursiveArrayTools to make a lazy array, instead of reduce(cat, ...).
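The lazy route presumably goes through RecursiveArrayTools’ VectorOfArray, which wraps the vector of matrices without copying, so the stacking costs nothing up front. A minimal sketch (the sizes here are stand-ins, and the exact indexing API may differ between package versions):

```julia
using RecursiveArrayTools

X = [rand(28, 28) for _ in 1:1000]   # stand-in for the 60,000 MNIST images

# VectorOfArray is a lazy 3-D view: no data is copied,
# and the last axis indexes the individual images
V = VectorOfArray(X)
size(V)            # should be (28, 28, 1000)

# materialise a dense Array only if/when it is really needed
A = Array(V)
```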
Looks really good! This is such a great idea! Manipulating high-dimensional arrays (tensors) needs really good, intuitive abstractions!
It would be good to include installation instructions using the Pkg.add method, as I am using juliabox.com and so it’s not as convenient to access the REPL.
Ah, I will try to figure this out; I hadn’t looked in ages. I can clone it but have difficulty installing. The package should be registered in a few days, which should make this easier.
This indeed seems good. Interesting that mapreduce(vec, hcat, x) does not hit the same optimisation, and is very slow.
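Likely because Base special-cases reduce with hcat to allocate the result once, while mapreduce(vec, hcat, X) falls back to repeated hcat calls, allocating a fresh intermediate matrix at each step. A sketch of the two spellings, which produce identical results with very different allocation behaviour:

```julia
X = [rand(10, 10) for _ in 1:100]

fast = reduce(hcat, vec.(X))     # hits Base's specialised reduce-hcat method
slow = mapreduce(vec, hcat, X)   # generic path: many intermediate matrices

fast == slow   # true
```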