Automatic Differentiation with ReservoirComputing.jl Echo State Networks

I'm trying to use automatic differentiation to compute the Jacobian of an echo state network from ReservoirComputing.jl. As a first step, I'd just like to get the Jacobian along the trajectory from the Lorenz forecast tutorial working.

using OrdinaryDiffEq, ReservoirComputing

#lorenz system parameters
u0 = [1.0, 0.0, 0.0]
tspan = (0.0, 200.0)
p = [10.0, 28.0, 8/3]

#define lorenz system
function lorenz(du,u,p,t)
    du[1] = p[1]*(u[2]-u[1])
    du[2] = u[1]*(p[2]-u[3]) - u[2]
    du[3] = u[1]*u[2] - p[3]*u[3]
end

#solve and take data
prob = ODEProblem(lorenz, u0, tspan, p)
data = solve(prob, ABM54(), dt=0.02)

shift = 300
train_len = 5000
predict_len = 1250

#one step ahead for generative prediction
input_data = data[:, shift:shift+train_len-1]
target_data = data[:, shift+1:shift+train_len]

test = data[:,shift+train_len:shift+train_len+predict_len-1]

res_size = 300
esn = ESN(input_data; 
          reservoir = RandSparseReservoir(res_size, radius=1.2, sparsity=6/res_size),
          input_layer = WeightedLayer(),
          nla_type = NLAT2())

output_layer = train(esn, target_data)
output = esn(Generative(predict_len), output_layer)

using Plots
plot(transpose(output), layout=(3,1), label="predicted")
plot!(transpose(test), layout=(3,1), label="actual")

So far I've tried Zygote.jacobian(esn, [1,1,1]), which of course doesn't work, because Zygote.jacobian expects a function as its first argument. I've tried a couple of other things, but haven't managed to figure this out yet. Does anyone have examples I could look at, or know how to compute the Jacobian in this example?
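One workaround I've seen for this kind of problem (not specific to ReservoirComputing.jl): rebuild the one-step state update as a plain function of the state, and differentiate that. For the standard ESN update x' = tanh.(W*x + W_in*u), the Jacobian with respect to the state is even available in closed form, so AD can be cross-checked against it. Below is a self-contained sketch where W and W_in are random stand-ins for the trained matrices; in the real case you would pull them out of the trained esn object (the field names vary between ReservoirComputing.jl versions, so inspect them with fieldnames(typeof(esn)) first):

```julia
using LinearAlgebra, Random

Random.seed!(42)
res_size, in_size = 10, 3

# random stand-ins for the trained ESN's reservoir and input matrices
W    = 0.1 * randn(res_size, res_size)
W_in = 0.1 * randn(res_size, in_size)

# one step of the basic ESN state update (no leak rate, no bias)
esn_step(x, u) = tanh.(W * x + W_in * u)

# analytic Jacobian of esn_step w.r.t. the state x:
# d tanh(z)/dz = sech(z)^2, so J = Diagonal(sech.(z).^2) * W
function esn_step_jac(x, u)
    z = W * x + W_in * u
    return Diagonal(sech.(z) .^ 2) * W
end

x0 = randn(res_size)
u0 = randn(in_size)
J = esn_step_jac(x0, u0)

# sanity check against forward finite differences
h = 1e-6
cols = map(1:res_size) do j
    e = zeros(res_size); e[j] = h
    (esn_step(x0 .+ e, u0) .- esn_step(x0, u0)) ./ h
end
Jfd = reduce(hcat, cols)
@assert maximum(abs.(J .- Jfd)) < 1e-4
```

Once esn_step is built from the actual trained matrices, Zygote.jacobian(x -> esn_step(x, u0), x0)[1] should return the same matrix, since Zygote is happy to differentiate a plain closure; the same goes for ForwardDiff.jacobian. Note this is only a sketch of the state-update Jacobian, and it assumes the plain tanh driver; if your ESN uses a leaky or other reservoir driver, the update rule (and hence the Jacobian) has extra terms.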