Hi everyone,

### [Edit] Question 1:

I had a quick question regarding the usage of `TensorOperations.jl`. How would one go about broadcasting certain operations such as

```
using TensorOperations

function g(F)
    # full contraction over both indices: returns a scalar
    @tensor F[i, j] * F[i, j]
end
```
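
For a two-dimensional `F` this gives the expected sum of squares:

```
F = rand(3, 3)
@assert g(F) ≈ sum(abs2, F)  # full contraction = sum of squared entries
```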

So this works when `F` is two-dimensional (say of shape `(3,3)`). What if I want to broadcast it over an `n`-dimensional array, say of shape `(3,3,100,4,...)`? In `NumPy` I would do something like:

```
from numpy import einsum

einsum("ij...,ij...", F, F)  # contracts i and j, broadcasts over the remaining axes
```

where the ellipsis (`...`) controls the broadcasting.

I have checked `Tullio.jl` too, but I couldn't see anything built in.
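
The closest workaround I can sketch is to collapse the trailing axes into one explicit batch index and carry it through the contraction (a hypothetical `g_batched`, assuming the contracted pair is always the first two axes):

```
using Tullio

function g_batched(F)
    Fr = reshape(F, size(F, 1), size(F, 2), :)  # (3, 3, N), N = product of trailing sizes
    @tullio s[k] := Fr[i, j, k] * Fr[i, j, k]   # contract i and j within each slice k
    return reshape(s, size(F)[3:end])           # restore the trailing shape, e.g. (100, 4)
end
```

but that is manual bookkeeping rather than the ellipsis-style broadcasting I am after. Any pointers would be helpful.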

Thanks

### Edit:

I did find that `EllipsisNotation.jl` allows one to index into an array as `F[1,1,..]`, but that is not directly of use in this case.
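
For reference, a minimal example of what it does (indexing only, no contraction):

```
using EllipsisNotation

F = rand(3, 3, 100, 4)
F[1, 1, ..]  # a (100, 4) slice; `..` stands for the remaining axes
```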

### Question 2:

The end goal is to be able to differentiate through `g` using `ForwardDiff` and broadcast the result. This brings me to my next question. Say I have this function:

```
f(u) = 1.0 + u[1]^2 + u[2]^2  # scalar-valued, as `gradient` requires
```

I would take its gradient as

```
using ForwardDiff

g = x -> ForwardDiff.gradient(f, x)
v = rand(2)
@assert g(v) == 2 .* v  # true
```

Now what if my input to `f` is higher-dimensional, say of shape `(2,100,4)`? How would I broadcast its gradient over the trailing axes? I have looked through the Discourse posts, but if you feel this has already been answered, feel free to point me to it.

Many thanks!