I’m trying to create a tensor regression model in Julia. I first tried in PyTorch, but it was becoming a hassle, so I thought I’d give Julia a try, and it’s looking like Julia can’t do it at all (at least not in any efficient way).

A tensor regression is just a series of tensor contractions (“generalized matrix multiplications”), where the tensors (multi-dimensional arrays) act as trainable parameters. I want to train these parameter tensors with gradient descent, just as I would train a normal neural network (which is a series of matrix-matrix multiplications with interspersed non-linear functions). The problem is that TensorOperations.jl, the major library supporting tensor contractions, does not work with either of Julia’s main ML libraries, Knet.jl and Flux.jl. Both of those libraries secretly wrap your Arrays in special trackable types so they can keep track of gradients, and TensorOperations.jl can’t handle those wrapped types.
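For concreteness, here’s a minimal sketch of the kind of contraction I mean (the array names and sizes are just illustrative, not my actual model):

```julia
using TensorOperations  # provides the @tensor macro

# Contract a 3rd-order parameter tensor W against an input X:
# Y[i, l] = Σ_{j,k} X[i, j, k] * W[j, k, l]
X = rand(8, 4, 6)
W = rand(4, 6, 5)   # W is what I want to train by gradient descent
@tensor Y[i, l] := X[i, j, k] * W[j, k, l]
size(Y)  # (8, 5)

# Equivalent formulation via reshapes, which is the sense in which a
# contraction is a "generalized matrix multiplication" (Julia arrays are
# column-major, so the combined (j, k) index lines up on both sides):
Y2 = reshape(X, 8, 4 * 6) * reshape(W, 4 * 6, 5)
# Y ≈ Y2

# This all works on plain Arrays; the trouble starts when X or W is one of
# the tracked array types that Flux.jl / Knet.jl substitute in.
```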

I tried Einsum.jl, and it technically works with Flux.jl, but with outrageously bad performance: a very small tensor contraction that takes 1.3 seconds with normal Arrays took literally 2 minutes with Flux tracked arrays.
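The Einsum.jl version I tried looks roughly like this (names and sizes are illustrative):

```julia
using Einsum  # provides the @einsum macro

X = rand(8, 4, 6)
W = rand(4, 6, 5)

# Y[i, l] = Σ_{j,k} X[i, j, k] * W[j, k, l]
@einsum Y[i, l] := X[i, j, k] * W[j, k, l]
size(Y)  # (8, 5)
```

Since `@einsum` expands into plain nested loops, my guess is that the slowdown comes from every scalar operation in those loops going through Flux’s tracking machinery, but I haven’t confirmed that.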

I’m picking up Julia again after last trying it back at v0.3, so I may just be missing something here. Any help is appreciated.