Hi,
I am doing some operations on large tensors (roughly 2^24 to 2^26 complex elements), and I need to use reshape and permutedims frequently. Given the size of these tensors, I am a little worried about memory allocation. My understanding is that reshape still points to the original data, based on "julia - How to reshape Arrays quickly - Stack Overflow". However, permutedims seems to create a copy, which leads to extra memory allocation. Is there any way to permute dimensions without causing extra allocation?
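For reference, here is a minimal sketch of the behavior I am seeing, with sizes reduced from my actual use case:

```julia
# Small version of my setup: 2^24 complex elements (~256 MiB)
A = rand(ComplexF64, 2^8, 2^8, 2^8)

# reshape shares memory with the original array -- no data is copied
B = reshape(A, 2^12, 2^12)
@show pointer(A) == pointer(B)       # true: same underlying buffer

# permutedims allocates a full copy (another ~256 MiB)
@time C = permutedims(A, (3, 1, 2))
```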
Thanks