# Tensorial.jl

I’d like to announce Tensorial.jl, a package I’ve developed and actively use in my research. Tensorial.jl provides statically sized `Tensor`s similar to `SArray` from StaticArrays.jl, but with several additional features designed to make tensor operations more convenient. These include:

- Contraction, tensor product (`⊗`), and a flexible `@einsum` macro for the Einstein summation convention
- A `@Symmetry` macro to define tensor symmetries, which eliminates unnecessary calculations
- Automatic differentiation via `gradient` and `hessian` functions, leveraging ForwardDiff.jl
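For instance, the symmetry and differentiation features combine naturally. The sketch below differentiates a scalar function of a symmetric second-order tensor; `W` is a made-up example function, not part of the package, and the calls use only the API shown in the quick start:

```julia
using Tensorial

# Illustrative scalar function of a symmetric tensor (hypothetical example)
W(C) = tr(C) + det(C)

# A random symmetric 3×3 tensor; only 6 components are stored and computed
C = rand(Tensor{Tuple{@Symmetry{3,3}}})

# ForwardDiff-powered derivatives:
S = gradient(W, C)   # second-order tensor ∂W/∂C, same shape/symmetry as C
ℂ = hessian(W, C)    # fourth-order tensor ∂²W/∂C²
```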

## Compatibility with `AbstractArray`

Based on advice from @ExpandingMan, I’ve recently made the package compatible with `AbstractArray`. So I believe people can easily switch from `SArray` to `Tensor` by simply replacing `SArray`, `SMatrix`, and `SVector` with `Tensor`, `Mat`, and `Vec`, respectively. @ExpandingMan also kindly contributed to checking this compatibility. Thank you!

## Comparison with other tensor packages

While there are several packages available for tensor operations, Tensorial.jl is specifically designed for working with small-sized tensors. The most similar package in this regard is Tensors.jl, which keeps its internal code simple by restricting tensor size. In contrast, I sought to overcome this limitation, and now Tensorial.jl can handle tensors of arbitrary sizes, offering greater flexibility for a wider range of applications.
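As a sketch of that flexibility, the example below builds a third-order tensor with mixed dimensions and contracts it with `@einsum`; the sizes `2×3×4` are arbitrary illustrations, chosen only to show shapes outside the usual small fixed set:

```julia
using Tensorial

# Tensors of arbitrary order and (mixed) dimensions are allowed
A = rand(Tensor{Tuple{2,3,4}})
B = rand(Tensor{Tuple{4,3}})

# Contract the trailing indices of A against B; the free index i remains,
# so the result is a length-2 vector
C = @einsum A[i,j,k] * B[k,j]
```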

## Quick start

```julia
julia> using Tensorial
julia> x = Vec{3}(rand(3)); # constructor similar to SArray
julia> A = @Mat rand(3,3); # @Vec, @Mat and @Tensor, analogous to @SVector, @SMatrix and @SArray
julia> A ⋅ x ≈ A * x # single contraction (⋅)
true
julia> A ⊡ A ≈ tr(A'A) # double contraction (⊡)
true
julia> x ⊗ x ≈ x * x' # tensor product (⊗)
true
julia> (@einsum x[i] * A[j,i] * x[j]) ≈ x ⋅ A' ⋅ x # Einstein summation (@einsum)
true
julia> S = rand(Tensor{Tuple{@Symmetry{3,3}}}); # specify symmetry S₍ᵢⱼ₎
julia> SS = rand(Tensor{Tuple{@Symmetry{3,3}, @Symmetry{3,3}}}); # SS₍ᵢⱼ₎₍ₖₗ₎
julia> inv(SS) ⊡ S ≈ @einsum inv(SS)[i,j,k,l] * S[k,l] # it just works
true
julia> δ = one(Mat{3,3}) # identity tensor
3×3 Tensor{Tuple{3, 3}, Float64, 2, 9}:
 1.0  0.0  0.0
 0.0  1.0  0.0
 0.0  0.0  1.0
julia> gradient(identity, S) ≈ one(SS) # ∂Sᵢⱼ/∂Sₖₗ = (δᵢₖδⱼₗ + δᵢₗδⱼₖ) / 2
true
```