Tensor operation in Julia

Dear Julia,

What is the best way to rewrite the following code from Python to Julia?

import numpy as np

def ctwa(U, B, J, F):
    # B shape: (n, m)
    # U shape: (n, m)
    # J shape: (n, n, m, m)
    # F shape: (m, m, m)
    BUF = np.einsum('jb,jc,abc->ja', B, U, F)
    JU = np.einsum('jlbc,lb->jbc', J, U)
    JUUF = np.einsum('jbc,jc,abc->ja', JU, U, F)
    return BUF + JUUF
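
In index notation, the first line computes

BUF[j, a] = Σ_{b,c} B[j, b] * U[j, c] * F[a, b, c]

where j stays free on both sides (a "batch" index over the rows), and the other two contractions follow the same pattern.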

Regards
JV

using TensorOperations

function ctwa(U::Matrix, B::Matrix, J::Array{<:Any,4}, F::Array{<:Any,3})
    @tensor begin
        ...
    end
end

Dear mkitti,
Many thanks for your fast reply!

I have already tried the TensorOperations library.

For instance, for

BUF = np.einsum('jb,jc,abc->ja', B, U, F)

I tried

@tensor SHF[j,a] := B[j,b] * U[j,c] * F[a,b,c]

but I got the following error:

ERROR: TensorOperations.IndexError{String}("non-matching indices between left and right hand side: $(Expr(:(:=), :(var"##278"[j, a]), :((var"##275"[j, b] * var"##276"[j, c]) * var"##277"[a, b, c])))")

It seems I must make the indices on the left match those on the right, as follows:

@tensor SHF[i,j,a] := B[i,b] * U[j,c] * F[a,b,c]

and it works, but what I want is a matrix SHF[j,a], not a rank-3 tensor SHF[i,j,a].

This is a restriction in TensorOperations rather than a bug: @tensor implements strict Einstein summation, so an index like j that appears in two factors on the right is treated as summed and cannot also appear on the left.
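
Note that the matrix you want is just the i == j diagonal of the rank-3 result from your workaround. As a rough sketch (assuming SHF here is the [i,j,a] array produced by your @tensor call above):

shf = [SHF[j, j, a] for j in axes(SHF, 1), a in axes(SHF, 3)]

Of course, building the full rank-3 array only to take its diagonal wastes work.
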
In any case, there are several alternative libraries available which seem to work:

julia> B = randn(5, 3);

julia> U = randn(5, 4);

julia> F = randn(2,3,4);

julia> using TensorCast

julia> TensorCast.@reduce shf[j,a] := sum(b,c) B[j,b]*F[a,b,c]*U[j,c]
5×2 Matrix{Float64}

julia> using OMEinsum

julia> OMEinsum.@ein shf[j,a] := B[j,b]*U[j,c]*F[a,b,c]
5×2 Matrix{Float64}:

The latter library also provides a NumPy-like string macro, e.g. ein"jb,jc,abc->ja"(B, U, F), giving an almost one-to-one translation from Python. IMHO, TensorCast is more versatile, allowing you to mix in non-linear functions and other kinds of contractions/reductions.
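
For completeness, here is a sketch of the full ctwa function using OMEinsum's string macro (shapes as in the comments of the original Python code; untested and unbenchmarked):

using OMEinsum

function ctwa(U, B, J, F)
    # B: (n, m), U: (n, m), J: (n, n, m, m), F: (m, m, m)
    BUF  = ein"jb,jc,abc->ja"(B, U, F)    # BUF[j,a]  = sum_{b,c} B[j,b] U[j,c] F[a,b,c]
    JU   = ein"jlbc,lb->jbc"(J, U)        # JU[j,b,c] = sum_l J[j,l,b,c] U[l,b]
    JUUF = ein"jbc,jc,abc->ja"(JU, U, F)  # JUUF[j,a] = sum_{b,c} JU[j,b,c] U[j,c] F[a,b,c]
    return BUF + JUUF
end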
