I have some tensor contractions, e.g., A[abcd] B[cdef] = C[abef], where A[abcd] = A[badc] and B[cdef] = B[dcfe]. I did not find any information about exploiting such symmetries in NumPy's einsum. Is there any Julia package that can handle tensor contractions with symmetries (ideally making the calculations faster)?
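For concreteness, this is the plain contraction I have in mind, ignoring the symmetry: a minimal sketch with placeholder dimensions using the @tensor macro from TensorOperations.jl, equivalent to np.einsum("abcd,cdef->abef", A, B) in NumPy.

```julia
using TensorOperations

# Placeholder dimensions; A and B are plain dense arrays here, so the
# A[abcd] == A[badc] and B[cdef] == B[dcfe] symmetry is not exploited.
A = randn(10, 10, 10, 10)
B = randn(10, 10, 10, 10)

# Same contraction as np.einsum("abcd,cdef->abef", A, B)
@tensor C[a, b, e, f] := A[a, b, c, d] * B[c, d, e, f]
```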
I found Introduction · TensorKit.jl, which mentions some symmetries, and there are examples in Tutorial · TensorKit.jl.
Is the performance comparable with einsum/opt-einsum in NumPy?
Yes, TensorKit.jl can be used for this. There is also ITensors.jl, which supports abelian symmetries; note that it uses a different convention for tensor contractions, i.e. neither einsum nor NCON notation.
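To illustrate the ITensors.jl convention: indices are named objects rather than positions in an index string, and `*` contracts whatever indices two tensors share. A rough sketch with placeholder dimensions and no symmetry:

```julia
using ITensors

# Placeholder dimensions; each Index carries a tag only for readability.
a = Index(10, "a"); b = Index(10, "b"); c = Index(10, "c")
d = Index(10, "d"); e = Index(10, "e"); f = Index(10, "f")

A = ITensor(randn(10, 10, 10, 10), a, b, c, d)
B = ITensor(randn(10, 10, 10, 10), c, d, e, f)

# `*` contracts over all indices the two tensors share (here c and d),
# which reproduces A[abcd] B[cdef] = C[abef] without an einsum string.
C = A * B
```

With abelian symmetries, the indices would instead be built from quantum-number-graded sectors, and only the symmetry-allowed blocks are stored.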
Performance of both should be good, though a detailed benchmark comparing different packages and implementations is missing.
I hope that exploiting the symmetry gives a prefactor speedup over NumPy (which calls into BLAS from C, I think?). Maybe I will do some benchmarks at some point, or, better, find some existing results.
I am not sure I understand your question. Both TensorKit.jl and ITensors.jl use BLAS under the hood to perform the contractions, but they also exploit the symmetry to end up with smaller effective matrices that need to be multiplied. Without symmetries, the performance of TensorKit.jl or TensorOperations.jl (which works with native Julia arrays) is on par with, or possibly even better than, that of einsum in Python.
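As a rough sketch of what the symmetric case looks like in TensorKit.jl (the U(1) grading and the sector dimensions are just placeholders, and the TensorMap constructor may differ slightly between versions):

```julia
using TensorKit

# A U(1)-graded vector space: each charge sector has its own dimension,
# so symmetry-preserving tensors are block sparse.
V = U1Space(0 => 4, 1 => 3, -1 => 3)

# Random symmetric tensors; only the charge-conserving blocks are stored.
A = TensorMap(randn, Float64, V ⊗ V, V ⊗ V)
B = TensorMap(randn, Float64, V ⊗ V, V ⊗ V)

# The same einsum-style contraction; under the hood it reduces to BLAS
# multiplications of the individual (smaller) symmetry blocks.
@tensor C[a, b, e, f] := A[a, b, c, d] * B[c, d, e, f]
```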