I just learned that Julia arrays are column-major, and I have several questions regarding this.
Is this true?
Is the memory buffer for a Julia array always contiguous, or is there a way to check?
If arrays are column-major, what's the preferred way to work with GPUs or deep learning frameworks such as ONNX or TensorRT? Is it transposing whenever needed, or is there some other way?
Of course, if you wrap an Array in a lazy transpose (via A', LinearAlgebra.Transpose(A), PermutedDimsArray(A, (2, 1)), or something similar), it effectively becomes row-major, and the reverse holds.
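A minimal sketch of the memory order and the lazy wrapper (all names here are from Base and the LinearAlgebra stdlib):

```julia
using LinearAlgebra

A = [1 2 3;
     4 5 6]                      # 2×3 Matrix{Int}

# Column-major: the underlying buffer walks down each column first.
vec(A)                           # [1, 4, 2, 5, 3, 6]

At = transpose(A)                # lazy wrapper, no data is copied
At isa LinearAlgebra.Transpose   # true
At[1, 2] == A[2, 1]              # true: indices are swapped on access
```

Note that A' is adjoint, which only coincides with transpose for real element types; transpose(A) is the layout-only operation.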
If the algorithm truly is orientation-agnostic, at most you'll have to flip the result (or change your interpretation of it), since the algorithm still works the same way, just treating your columns as rows internally.
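For example, whole-array reductions don't care about orientation at all, while dimension-wise reductions just come back with a flipped shape (a small sketch):

```julia
A = [1.0 2.0;
     3.0 4.0]

# Whole-array reductions are orientation-agnostic:
sum(A) == sum(transpose(A))          # true

# Dimension-wise results come back "flipped":
sum(A; dims = 1)                     # 1×2: column sums of A
sum(transpose(A); dims = 2)          # 2×1: same numbers, transposed shape
```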
If an algorithm isn't orientation-agnostic, though, explicitly transposing the data beforehand is usually trivial.
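When a consumer genuinely needs the data reordered in memory (rather than a lazy view), permutedims makes an eager copy, and strides lets you inspect the layout; a sketch, assuming a plain dense Array, which is always a single contiguous buffer:

```julia
A = rand(Float32, 3, 4)
strides(A)                 # (1, 3): stride 1 down columns ⇒ column-major

B = permutedims(A)         # eager, reordered 4×3 copy (unlike lazy transpose)
B[2, 1] == A[1, 2]         # true

V = @view A[1:2, :]        # views can be strided and non-contiguous
strides(V)                 # still (1, 3), but only 2 of the 3 rows are used
```

For the "is there a way to check" part: a plain Array is always contiguous by construction; for views, comparing strides against the sizes (as above) tells you whether the elements are densely packed.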