Hcat without copy and stack overflow in type inference

When using packages like ArrayPartition from RecursiveArrayTools.jl (GitHub - SciML/RecursiveArrayTools.jl) and Hcat from LazyArrays.jl (GitHub - JuliaArrays/LazyArrays.jl), I encounter a lot of stack-overflow errors in type inference.

What’s the recommended way to do hcat without copy for now?
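(Not part of the thread, but for context: a minimal sketch of copy-free horizontal concatenation using LazyArrays’ Hcat, which wraps its arguments instead of copying them; the arrays here are just illustrative.)

```julia
using LazyArrays

a = randn(3, 2)
b = randn(3, 4)

# Hcat builds a lazy wrapper; no elements are copied up front.
H = Hcat(a, b)
@assert size(H) == (3, 6)
@assert H[1, 3] == b[1, 1]   # column 3 of H is column 1 of b

# Materialize into a plain Array only when a dense copy is needed.
M = Array(H)
@assert M == hcat(a, b)
```

The catch, as discussed below, is that each wrapped argument appears in the Hcat type, so very many (or deeply nested) arguments make the type huge.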

Can you create a minimal reproducer? It could be useful to report a bug on the Julia GitHub.

EDIT: that said, I don’t think your issue can be fixed at that end, considering your tuples are truly huge.

Can you try to materialize the lazy array every so often, before the tuples become so huge? I think it simply doesn’t make sense to create tuples so large.
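(A hedged sketch of that suggestion, not code from the thread: `grow` and `chunk` are hypothetical names. Repeatedly wrapping in Hcat nests the lazy type deeper each step; collapsing to a dense Array every few steps keeps the types small.)

```julia
using LazyArrays

# Lazily hcat columns, but materialize every `chunk` steps so the
# nested Hcat type never grows large enough to stress inference.
function grow(cols; chunk = 8)
    acc = reshape(cols[1], :, 1)
    for (i, c) in enumerate(cols[2:end])
        acc = Hcat(acc, reshape(c, :, 1))
        if i % chunk == 0
            acc = Array(acc)   # collapse: resets the accumulated lazy type
        end
    end
    return Array(acc)
end

cols = [randn(5) for _ in 1:20]
A = grow(cols)
@assert A == reduce(hcat, cols)
```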

using RecursiveArrayTools
using ShiftedArrays

# 1000 parts: the ArrayPartition type carries a 1000-element tuple of
# array types, which is what overwhelms type inference.
ap = ArrayPartition([randn(3641) for i in 1:1000]...)

function procArr(arr1::AbstractArray)
    s = ShiftedArray(arr1, (1, 0))
    return s
end

Something like this. It seems LazyStack works for this. Would it be better if those packages gave an error on long inputs instead?
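(A sketch of the stacking approach, assuming Julia ≥ 1.9 where `stack` is available in Base; LazyStack.jl provides a lazy equivalent. The vector sizes mirror the example above.)

```julia
# `stack` concatenates along a new trailing dimension without splatting,
# so no 1000-element tuple ever reaches type inference.
vs = [randn(3641) for _ in 1:1000]

M = stack(vs)
@assert size(M) == (3641, 1000)
@assert M[:, 7] == vs[7]   # each input vector becomes a column
```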