I am about 2 days into Julia, and I would like to implement some simple operations related to tensor contraction as a learning task. It looks like TensorOperations should help, and for 2x2 tensors I can successfully evaluate a double dot product to get the expected scalar output:
uu = [ 1 2 ; 3 4 ]
vv = [ -1 -1 ; -1 -1]
using TensorOperations
@tensor begin
    a := scalar(uu[a,b]*vv[b,a])
end
print("a= ",a,"\n")
However, my specific task requires using rank-1 vectors, and I cannot get this to work. To illustrate the general idea, I tried:
using TensorOperations
pp = [ 1 ; 2 ; 3]
aa = [ -1 -2 -3]
@tensor begin
    cc[a,c,1] := pp[a]*aa[c,1]
    dd := scalar(aa[a]*pp[a,1])
end
print("cc=",cc,"\n")
print("dd=",dd,"\n")
The first calculation (the outer product that produces cc) works, but I cannot get the inner product to work. My questions are:
Why do I need the 1 in the index lists for aa and cc in the calculation of cc?
How can I get the inner product to work with a vector? Is there a better, more Julia-like approach?
Neither of these solutions is great, but…
This creates a scalar, either by taking a view of aa or by using vec to flatten it:
@tensor begin
    cc[a,c,1] := pp[a]*aa[c,1]
    dd := scalar(@view(aa[:,])[a]*pp[a])   # the view flattens the 1x3 matrix aa into a length-3 vector
end
@tensor begin
    cc[a,c,1] := pp[a]*aa[c,1]
    dd := scalar(vec(aa)[a]*pp[a])   # vec(aa) likewise flattens aa into a length-3 vector
end
This stores the result in a (one-element) vector named dd, but that is explicitly what you weren’t asking for:
@tensor begin
    cc[a,c,1] := pp[a]*aa[c,1]
    dd[b] := aa[b,a]*pp[a]   # b runs over the length-1 dimension of aa, so dd is a 1-element vector
end
Btw, aa is rank 2 (a 1x3 matrix). If you want column and row vectors:
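A minimal sketch of what that could look like (the variable names here are just illustrative):
pp_col = [1, 2, 3]             # a true rank-1 Vector, which Julia treats as a column vector
pp_row = permutedims(pp_col)   # a 1x3 Matrix, i.e. still a two-index object
pp_adj = pp_col'               # adjoint: a lazy 1x3 row view (also conjugates complex entries)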
Thanks - this is very helpful and clears up some of my confusion. I am still trying to understand, though, what the 1 does in the list of indices in lines like:
cc[a,c,1] := pp[a]*aa[c,1]
I did discover that I could get my inner product to work correctly with:
using TensorOperations
pp = [ 1 2 3]
aa = [ -1 -2 -3]
@tensor begin
    dd := scalar(aa[1,b]*pp'[b,1])
end
print("dd=",dd,"\n")
Numbers and letters are treated equally as valid index identifiers, so there is no difference between the roles of a, c and 1 within those square brackets. I think your main source of confusion comes from specifying vectors as pp = [ 1 2 3 ], which makes them row vectors - actually row matrices, i.e. two-index objects whose first dimension has length 1 and whose second dimension has length 3. A simple (column) vector in Julia can be entered as pp = [1, 2, 3], which is a single-index object of length 3. So you could do
using TensorOperations
pp = [ 1, 2, 3]
aa = [ -1, -2, -3]
@tensor begin
    inner := scalar(aa[b]*conj(pp[b])
    outer[a,b] := aa[a] * conj(pp[b])
end
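For instance, a small sketch to check the first point: renaming the literal 1 in the earlier example to an ordinary letter such as k should reproduce the same cc:
using TensorOperations
pp = [1; 2; 3]                          # rank-1 vector of length 3
aa = [-1 -2 -3]                         # 1x3 matrix (rank 2)
@tensor cc1[a,c,1] := pp[a]*aa[c,1]     # the literal 1 used as an index label
@tensor cc2[a,c,k] := pp[a]*aa[c,k]     # the same contraction with k in place of 1
cc1 == cc2                              # true: 1 behaves like any other label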
Also, think of this @tensor notation as really being what you would write in index notation. Preferably, don’t write pp'[b] but rather conj(pp[b]).
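A brief illustrative sketch of the distinction: pp' is an Adjoint wrapper, i.e. a 1x3 two-index object inside @tensor, whereas conj(pp[b]) stays with the rank-1 pp and just conjugates its entries:
using TensorOperations
pp = [1, 2, 3]
aa = [-1, -2, -3]
pp'                                       # 1x3 adjoint matrix: a two-index object
@tensor s := scalar(aa[b]*conj(pp[b]))    # conjugation written on the rank-1 pp itself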
This is very helpful - the example helps me wrap my head around the expected initialization of the simple vectors and the inner & outer products. I just had to add another “)” at the end of the line setting inner.