Tensor contractions very inefficient in Julia

I am new to Julia (just a few days) and I want to switch from Python to Julia by rewriting my DMRG code. For tensor contractions in Python I was using np.tensordot, so I searched for equivalent packages in Julia and tried NCon, NDTensors, etc. Unfortunately, all of them are several orders of magnitude slower than np.tensordot. While it is true that loops and function calls are faster in Julia than in Python, in a tensor network code the tensor contractions dominate all other operations. I was wondering whether anyone here implements tensor network algorithms in Julia, and what package they use for efficient tensor contractions.

Have you tried out Tullio? GitHub - mcabbott/Tullio.jl. It has tensor contractions that often rival BLAS speed.
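
For an np.tensordot-style contraction over a shared index, the syntax looks roughly like this (a minimal sketch with made-up sizes, just to show the idea; loading LoopVectorization.jl alongside Tullio can often speed it up further):

using Tullio

A = rand(2, 3, 4)
B = rand(5, 3, 6)

# := creates a new array C; the index a2 appears only on the right-hand side, so it is summed over
@tullio C[a1, a3, b1, b3] := A[a1, a2, a3] * B[b1, a2, b3]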

5 Likes

Not yet. Actually, I started on contractions just this morning, and all day I have been frustrated and thinking of going back to Python. I will check your suggestion. Thank you very much.

“Several orders of magnitude” sounds like you might be doing something wrong. If you post a minimal working benchmark of interest to you, it will be easier to help you.

7 Likes

For example, executing this code:

using TensorOperations
A = rand(2,3,4)
B = rand(5,3,6)
@time @tensor C[a1,a3,b1,b3] := A[a1,a2,a3]*B[b1,a2,b3]

gives us
4.724015 seconds (26.08 M allocations: 1.252 GiB, 9.18% gc time)

whereas the Python equivalent is:

import numpy as np
import time

s = time.perf_counter()
A = np.random.rand(2,3,4)
B = np.random.rand(5,3,6)
C = np.tensordot(A,B, axes = [1,1])
f = time.perf_counter()
print('time taken = ',f-s)

which yields

time taken = 0.02150451199850067

The package TensorOperations seems to be widely used in the physics community and was suggested to me at a recent winter school. Other packages like NCon and NDTensors give similar results.

Have you tried timing that a second time? The first thing to learn when comparing Julia to anything else is that the first time you run a Julia function or expression it is being compiled, and @time includes the compilation (and inference and code generation and …) time.

Also, there are DMRG implementations (and much more) available at GitHub - ITensor/ITensors.jl: A Julia library for efficient tensor computations and tensor network calculations and https://github.com/maartenvd/MPSKit.jl.
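
Just to give a flavour of ITensors.jl, a pairwise contraction looks roughly like the sketch below (made-up index sizes and tags; * contracts over whatever indices the two tensors share):

using ITensors

i, j, k = Index(2, "i"), Index(3, "j"), Index(4, "k")
l, m = Index(5, "l"), Index(6, "m")

A = randomITensor(i, j, k)
B = randomITensor(l, j, m)   # shares the index j with A

C = A * B   # contracts over the shared index j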

6 Likes

You are measuring the compilation time there. Run it twice or, better, use the BenchmarkTools package.

julia> @time @tensor C[a1,a3,b1,b3] := A[a1,a2,a3]*B[b1,a2,b3];
  0.000087 seconds (85 allocations: 7.516 KiB)
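
With BenchmarkTools it would look something like this (interpolating the globals with $ so the timing does not also include dynamic dispatch on untyped globals):

using TensorOperations, BenchmarkTools

A = rand(2, 3, 4)
B = rand(5, 3, 6)

# @btime runs the expression many times and reports the minimum, excluding compilation
@btime @tensor C[a1, a3, b1, b3] := $A[a1, a2, a3] * $B[b1, a2, b3];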


5 Likes

Thank you all, I didn’t realize that @time includes the compilation time too. As leandromartinez98 suggested, I used @btime from BenchmarkTools, and the run time is actually in microseconds.

3 Likes

One of the big improvements in Julia 1.6 is that @time will report what percentage of the measured time was spent on compilation. Running your example, it shows
4.457462 seconds (22.67 M allocations: 1.217 GiB, 10.42% gc time, 99.66% compilation time)

7 Likes

Thank you. Yes, I am aware of those packages, but I wanted to write my own from scratch, as that is more flexible in the long run. I am also writing time evolution for long-range Hamiltonians using TDVP, along with other algorithms (I have everything written in Python already and just want to rewrite it in Julia).

These are probably all things that are already included in MPSKit.jl, together with tensors with arbitrary (abelian or non-abelian) symmetries. But it’s certainly also very instructive to write your own code. Nonetheless, you might want to check out, or be interested in, TensorKit.jl as a package that provides a useful set of primitives for writing general tensor network algorithms.
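
As a rough illustration, for plain (symmetry-free) tensors those primitives look something like the sketch below (assuming the TensorMap(randn, codomain, domain) constructor; the same @tensor syntax from TensorOperations works on these objects):

using TensorKit, TensorOperations

A = TensorMap(randn, ℂ^2 ⊗ ℂ^3, ℂ^4)   # a map from ℂ^4 into ℂ^2 ⊗ ℂ^3
B = TensorMap(randn, ℂ^4, ℂ^5)          # a map from ℂ^5 into ℂ^4

# contract over the shared ℂ^4 index d
@tensor C[a, b, c] := A[a, b, d] * B[d, c]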

1 Like

Take a look here (and at that whole page; there is a lot of immediately useful information there).

[Performance Tips · The Julia Language](https://docs.julialang.org/en/v1/manual/performance-tips/)

Well, I cannot link to the section I wanted, but it is there.

1 Like

Here’s a direct link to the docs section.

1 Like

You also have
https://github.com/chakravala/Grassmann.jl

2 Likes