where data is an array of tables and beta is a 1 x J row-vector. The output returned was an array of arrays of floats. Maybe you can adapt something similar here? If you don’t want to use pmap, try mapslices.
Maybe use a plain vector instead of a vector of vectors and reshape inside the function?
using LinearAlgebra, ForwardDiff

function myf(x)
    xvec = reshape(x, 3, 3)   # columns are the three vectors
    return xvec[:,1] ⋅ cross(xvec[:,2], xvec[:,3])
end
∇myf = X -> ForwardDiff.gradient(myf, X)
x0 = randn(9)
dx = reshape(∇myf(x0), 3, 3)  # dx[:,1] is the gradient w.r.t. the first vector
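To sanity-check the reshaping approach, the columns of dx can be compared against the well-known analytic gradient of the scalar triple product (∂f/∂x1 = x2×x3 and its cyclic permutations). A minimal self-contained sketch:

```julia
using LinearAlgebra, ForwardDiff

myf(x) = (xm = reshape(x, 3, 3); xm[:,1] ⋅ cross(xm[:,2], xm[:,3]))

x0 = randn(9)
dx = reshape(ForwardDiff.gradient(myf, x0), 3, 3)
xm = reshape(x0, 3, 3)

# analytic gradient of the scalar triple product, column by column:
@assert dx[:,1] ≈ cross(xm[:,2], xm[:,3])
@assert dx[:,2] ≈ cross(xm[:,3], xm[:,1])
@assert dx[:,3] ≈ cross(xm[:,1], xm[:,2])
```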
If you really (emphasis on really) need an array of arrays, just add this code and ∇f should work with your input, returning an array of arrays of the same form as the input.
function _f(x)
    xvec = reshape(x, 3, 3)
    return xvec[:,1] ⋅ cross(xvec[:,2], xvec[:,3])
end
function ∇f(X)
    X2 = reduce((x,y) -> cat(x, y, dims=1), X)       # flatten the array of arrays into one vector
    ∇X = reshape(ForwardDiff.gradient(_f, X2), 3, 3)  # gradient back in 3x3 form
    return [∇X[:,j] for j in 1:3]
end
Why is this necessary? ForwardDiff accepts functions with only two kinds of input: a real number, or an array of real numbers. Any other structure must be converted to one of those before it can be differentiated via ForwardDiff.
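To illustrate the two accepted input kinds: `ForwardDiff.derivative` handles scalar input, `ForwardDiff.gradient` handles array input, and a vector of vectors has to be flattened (e.g. with `reduce(vcat, X)`) before either will accept it:

```julia
using ForwardDiff

# scalar input -> ForwardDiff.derivative
@assert ForwardDiff.derivative(sin, 0.0) ≈ 1.0

# array input -> ForwardDiff.gradient (gradient of sum of squares is 2x)
@assert ForwardDiff.gradient(x -> sum(abs2, x), [1.0, 2.0]) ≈ [2.0, 4.0]

# a vector of vectors is not a numeric array, so flatten it first:
X = [[1.0, 2.0], [3.0, 4.0]]
g = ForwardDiff.gradient(v -> sum(abs2, v), reduce(vcat, X))
@assert g ≈ [2.0, 4.0, 6.0, 8.0]
```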
Like @longemen3000 writes, ForwardDiff.jl needs to see a plain Array. But you can use ArraysOfArrays.jl to reshape things in comfort, and you probably want to use StaticArrays.jl for your inner vectors, since the cross product requires a fixed length anyhow.
Leaving your f(x) unchanged:
using LinearAlgebra
using ForwardDiff
using Random
function f(X)
    return X[1] ⋅ cross(X[2], X[3])
end
Random.seed!(100)
you can do:
using ArraysOfArrays, StaticArrays

vv3(X::Matrix) = nestedview(X, SVector{3})
X = rand(3, 3)
∇f = Xm -> ForwardDiff.gradient(x -> f(vv3(x)), Xm)  # inner argument renamed to avoid shadowing X
vv3(∇f(X))
vv3(X) will present a matrix as a vector of static vectors (a view, not a copy), but ForwardDiff will see the flat matrix.
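A quick sketch of what that wrapping does, based on the `nestedview(X, SVector{3})` call used above (each column of the matrix becomes one inner static vector):

```julia
using ArraysOfArrays, StaticArrays

X = [1.0 4.0 7.0;
     2.0 5.0 8.0;
     3.0 6.0 9.0]                 # 3x3 matrix, columns are the three vectors
V = nestedview(X, SVector{3})     # view as a vector of SVector{3}s

@assert length(V) == 3
@assert V[1] == SVector(1.0, 2.0, 3.0)   # first column
@assert V[1] isa SVector{3,Float64}
```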
Thanks for the suggestions. Eventually, I was able to use Zygote.jl. This was fine for me, as I actually just wanted to verify that a hand-coded gradient was correct.
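For readers landing here later, a minimal sketch of the Zygote route (the exact code the poster used isn't shown, so this is an assumption): Zygote.gradient can differentiate through a vector of vectors directly, returning a gradient with the same nested shape, which makes checking a hand-coded gradient straightforward.

```julia
using LinearAlgebra, Zygote

f(X) = X[1] ⋅ cross(X[2], X[3])

X = [randn(3) for _ in 1:3]
g = Zygote.gradient(f, X)[1]   # same nested shape as X: a vector of 3 vectors

# compare against the analytic gradient of the scalar triple product
@assert g[1] ≈ cross(X[2], X[3])
@assert g[2] ≈ cross(X[3], X[1])
@assert g[3] ≈ cross(X[1], X[2])
```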