Help with converting Numpy einsum to Julia

I want to convert this particular einsum expression to Julia. What is the most idiomatic way to do this in Julia?

Python code

import numpy as np
x = np.arange(2*3*4*5*4).reshape(2, 3, 4, 5, 4)
np.einsum('ijklk->ijkl', x[:,:,1,:,:,1,:])[:] += <some_appropriate_tensor>

The problem I face is that I want the addition to affect x directly, as it does with einsum. I am not able to recreate the same functionality with views because the einsum notation is still complex to me. What would be the best way to go about converting this to Julia?
Any advice on understanding the einsum notation is also welcome. Any help would be appreciated; thanks in advance.

The simplest way would be to also use Einstein notation in Julia, for example with TensorOperations.jl or Tullio.jl.

Indices that appear in more than one right-hand-side array are matched up, so those arrays combine elementwise along the shared dimension.
Indices that appear on the right but not on the left are summed (contracted) over.

For example, matrix multiplication is

using TensorOperations
@tensor C[m,n] := A[m,k] * B[k,n]

The k dimensions of A and B are multiplied elementwise, and then (because k doesn’t appear in the answer) the result is summed over k.
Use = instead of := if you’re updating an existing C.
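
Applying those rules to the pattern in the question: in 'ijklk->ijkl', the index k is repeated on the input side but also kept in the output, so nothing is summed; it just picks out the diagonal between the two k axes. A quick numpy check, with the shapes from the question:

```python
import numpy as np

x = np.arange(2 * 3 * 4 * 5 * 4).reshape(2, 3, 4, 5, 4)

# 'ijklk->ijkl': k is repeated on the input but kept in the output,
# so there is no summation -- this extracts y[i,j,k,l] = x[i,j,k,l,k]
y = np.einsum('ijklk->ijkl', x)

print(y.shape)                            # (2, 3, 4, 5)
print(y[1, 2, 3, 4] == x[1, 2, 3, 4, 3])  # True
```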

If I change your python code to something which runs:

import numpy as np
x = np.arange(2*3*4*5*4).reshape(2,3,1,4,5,1,4)
y = np.einsum('ijklk->ijkl', x[:,:,0,:,:,0,:]) 
y.shape # (2, 3, 4, 5)
y[0] = 999
x

then the most literal translation looks like this:

x = reshape(0:2*3*4*5*4-1, (2,3,1,4,5,1,4))

using OMEinsum # this package has a string macro giving numpy-like notation:
y1 = ein"ijklk->ijkl"(x[:,:,1,:,:,1,:]);
size(y1) # (2, 3, 4, 5)

@ein y2[i,j,k,l] := x[i,j,1,k,l,1,k]; # @einsum, @tullio, @tensor use this notation
y1 == y2

However (unlike python) these are not views of the original x, they are new arrays. (Even though there was no summation here, as all indices on the right also appear on the left.) You can make such a view but it’s not entirely straightforward. Perhaps you can just write directly into x?
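
For contrast, on the numpy side this diagonal einsum does return a writable view whenever the input is writable, which is what made the += in the original question affect x. A small sketch:

```python
import numpy as np

x = np.arange(2 * 3 * 1 * 4 * 5 * 1 * 4).reshape(2, 3, 1, 4, 5, 1, 4)

# with no summation, numpy's einsum returns a (writable) view into x
y = np.einsum('ijklk->ijkl', x[:, :, 0, :, :, 0, :])

print(np.shares_memory(x, y))  # True: y aliases x's data
y[0, 0, 0, 0] = -1
print(x[0, 0, 0, 0, 0, 0, 0])  # -1: writing through y changed x
```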

Thanks for the reply. Also thanks for noticing the wrong indexing in the Python code and correcting it.

I have a piece of code for creating the view but I am not entirely sure how to write into x directly.

function _einsum(a) # 'ijlml->ijlm'
    l, m, l2, j, i = size(a)
    @assert l == l2  # the two diagonal dimensions must match
    p = zeros(eltype(a), m, l, j, i)
    for i in 1:i, j in 1:j, l in 1:l
        p[:, l, j, i] = a[l, :, l, j, i]
    end
    return p
end

Can you please advise what changes are needed to add into x directly? Thanks in advance.

Thanks for your response. The problem with using them is that I heard from a colleague that these libraries don’t scale well. Also, if the code can be written directly it would be better, since it seems wasteful to import an entire package for a single einsum.

The loop you’ve written is almost identical to the one Einsum would generate, except that Einsum also iterates over the index you marked with a colon:

julia> using Einsum, MacroTools                                                    

julia> expr = @macroexpand1 @einsum y2[i,j,k,l] := x[i,j,1,k,l,1,k];               

julia> MacroTools.prettify(expr)                                                   
quote
    local lemur = promote_type(eltype(x))
    y2 = Array{lemur}(undef, size(x, 1), size(x, 2), size(x, 4), size(x, 5))
    @assert size(x, 4) == size(x, 7)
    let i, j, k, l
        @inbounds for l = 1:size(x, 5)
                for k = 1:size(x, 4)
                    for j = 1:size(x, 2)
                        for i = 1:size(x, 1)
                            y2[i, j, k, l] = x[i, j, 1, k, l, 1, k]
                        end
                    end
                end
            end
    end
    y2
end

The result is not a view, as you can check by writing p[1]=999 and confirming that a is unchanged. What is += <some_appropriate_tensor> exactly?

I am extremely sorry, I didn’t mean to say view (I meant the slice, not the view).
By <some_appropriate_tensor> I meant a tensor of appropriate dimensions (lazy me didn’t want to write out a tensor of those dimensions). Thanks again for your response.

Can you please show how to add into x directly for this particular einsum?

julia> x = collect(reshape(0:2*3*4*5*4-1, (2,3,1,4,5,1,4))); # now this is an Array

julia> ismutable(x)
true

julia> x[1:5]'
1×5 Adjoint{Int64,Array{Int64,1}}:
 0  1  2  3  4

julia> @einsum y3[i,j,k,l] := x[i,j,1,k,l,1,k]; # as before, := makes a new array

julia> y3 .= rand.(Int8);

julia> y3[1:5]'
1×5 Adjoint{Int64,Array{Int64,1}}:
 126  -11  -65  83  93

julia> @einsum x[i,j,1,k,l,1,k] += y3[i,j,k,l]; # with = or += this alters x

julia> x[1:5]'
1×5 Adjoint{Int64,Array{Int64,1}}:
 126  -10  -63  86  97

These packages differ a little in what they allow; @ein only supports :=, I think.
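
For completeness, the original question’s += works in numpy precisely because that einsum result is a writable view; the @einsum line above is the direct Julia counterpart. A numpy sketch of the same in-place update:

```python
import numpy as np

x = np.arange(2 * 3 * 1 * 4 * 5 * 1 * 4).reshape(2, 3, 1, 4, 5, 1, 4)
y3 = np.ones((2, 3, 4, 5), dtype=x.dtype)

v = np.einsum('ijklk->ijkl', x[:, :, 0, :, :, 0, :])  # writable view into x
v += y3                                               # in-place, so x changes

print(x[0, 0, 0, 0, 0, 0, 0])  # 1  (was 0 before the update)
```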
