[ANN] Tensorial.jl: Statically sized tensors and related operations for Julia

This is arguably a slightly strange choice, maybe the only function in LinearAlgebra which regards arrays as a bag of numbers, ignoring their structure. And being recursive seems the wrong behaviour on e.g. vectors of matrices, to me (and wasn’t seriously discussed when it was added?). Nevertheless, it’s too late to change this, and it’s least confusing if the entire ecosystem agrees.
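For example, dot in LinearAlgebra sums over all entries of a matrix, and on a vector of matrices it recurses into the elements (norm behaves similarly):

julia> using LinearAlgebra

julia> A = [1 2; 3 4]; B = [5 6; 7 8];

julia> dot(A, B)   # treats the matrices as flat bags of numbers
70

julia> dot([A, A], [B, B])   # recurses into the elements: dot(A, B) + dot(A, B)
140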

That seems fine to me. It’s typed \boxdot<tab>\_2<tab>. We could add this to TensorCore.jl too… Like the single contraction, it’s always reshape-then-*, i.e. it allows rand(2,3,4) ⊡₂ rand(3,4,5) but not rand(2,3,4) ⊡₂ rand(4,3,5):

julia> x, y = Tensorial.@Tensor(rand(2,3,4)), Tensorial.@Tensor(rand(3,4,5));

julia> Tensorial.contract2(x, y)
2×5 Mat{2, 5, Float64, 10}:
 2.38424  1.98448  4.23119  2.87129  2.06991
 2.73175  2.26099  4.28346  3.05636  2.02472

julia> ans ≈ reshape(x, 2,:) * reshape(y, :,5)
true

Maybe worth noting that Enzyme appears to just work, by understanding that these are just tuples, without ever calling similar. DifferentiationInterface (DI) seems able to use that:

julia> Base.similar(_::typeof(Vec(5.0,6.0)), ::Type{Float64}, ::Base.Dims) = error("similar not defined")

julia> Enzyme.gradient(Reverse, v -> dot(v, Mat{2,2}(1,2,3,4), v), Vec(5.0,6.0))[1]
2-element Vec{2, Float64}:
 40.0
 73.0

julia> DifferentiationInterface.gradient(v -> dot(v, Mat{2,2}(1,2,3,4), v), AutoEnzyme(), Vec(5.0,6.0))
2-element Vec{2, Float64}:
 40.0
 73.0

Yes, a very long time ago I tried to develop Tensor as a subtype of StaticArray. But, if I remember correctly, I couldn’t account for tensor symmetries without modifying the internal code of StaticArrays.jl, so I eventually gave up. I understand your situation, but may I ask why your function needs to handle both StaticArrays.jl and Tensors.jl in the first place?

Yes, this is exactly what I’m doing in Tensorial.jl. Maybe we should handle the n-th contraction as boxdot(A, B, nth) in TensorCore.jl. If you agree, I’ll submit a PR.
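For reference, a minimal sketch of what such a generic n-th contraction could look like for plain arrays, following the reshape-then-* rule above (the name boxdot and the trailing nth argument are just the proposal here, not an existing TensorCore.jl API):

# Sketch only: contract the last nth dimensions of A with the first nth dimensions of B,
# by reshaping both to matrices and multiplying.
function boxdot(A::AbstractArray, B::AbstractArray, nth::Integer)
    dims_A = size(A)[end-nth+1:end]   # trailing dims of A to contract
    dims_B = size(B)[1:nth]           # leading dims of B to contract
    dims_A == dims_B || throw(DimensionMismatch("cannot contract $dims_A with $dims_B"))
    keep_A = size(A)[1:end-nth]       # dims of A that survive
    keep_B = size(B)[nth+1:end]       # dims of B that survive
    C = reshape(A, prod(keep_A), :) * reshape(B, :, prod(keep_B))
    return reshape(C, keep_A..., keep_B...)
end

boxdot(rand(2,3,4), rand(3,4,5), 2)  # 2×5 Matrix, matching the ⊡₂ example above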


I’m writing an FEM code, where it makes sense to store vectors as Vecs to make use of Tensors.jl for constitutive models. However, there are parts of the code, such as specialized algorithms (polygon clipping), which could be used in other contexts as well. It would make sense to make these functions agnostic to whether the input is SVectors or Vecs, as long as they are static and support the same basic operations.

I think you can define a Union type once and then use it across all functions.

const SVectorOrVec{L,T} = Union{SVector{L,T}, Vec{L,T}}
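For example, dispatching on that union (cross2d here is just a hypothetical helper; it relies only on indexing, which both types support):

cross2d(a::SVectorOrVec{2}, b::SVectorOrVec{2}) = a[1]*b[2] - a[2]*b[1]

cross2d(SVector(1.0, 2.0), SVector(3.0, 4.0))  # -2.0
cross2d(Vec(1.0, 2.0), Vec(3.0, 4.0))          # -2.0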
