Calculation of Hessian of a loss function w.r.t. dense layer weight matrices

Hi, I am trying to train a dummy neural net given by $y = W_1^{\top} \sigma(W_2 x)$, where $\sigma$ is the sigmoid, $W_1$ and $W_2$ are $10 \times 1$ weight matrices, and $x$ is a scalar input. I am using second-order optimization, and for that I need to compute the Hessian. The code below produces an error:

hess = hessian((W1,W2) -> (transpose(W2) * sigmoid.(W1*x))[1], W1, W2)

MethodError: no method matching hessian(::var"#264#265", ::Matrix{Float64}, ::Matrix{Float64})
Closest candidates are:
hessian(::Any, ::Any) at C:\Users\choud\.julia\packages\Zygote\4SSHS\src\lib\grad.jl:62

Stacktrace:
[1] top-level scope
@ In[132]:5
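
From the MethodError it looks like hessian only has a method for a function of a single argument, so I assume both weight matrices need to be packed into one flat parameter vector before calling it. Below is a minimal sketch of what I think that would look like; the pack helper, the dummy values for x, W1, W2, and the sigmoid definition are my own assumptions, not something I found in the docs:

using Zygote

sigmoid(z) = 1 / (1 + exp(-z))

x = 0.5                        # dummy scalar input
W1 = randn(10, 1)
W2 = randn(10, 1)

# hessian expects f(v) with a single real array v, so concatenate
# both weight matrices into one 20-element vector.
pack(W1, W2) = vcat(vec(W1), vec(W2))

function f(v)
    w1 = v[1:10]               # unpack W1
    w2 = v[11:20]              # unpack W2
    transpose(w2) * sigmoid.(w1 .* x)   # same scalar output as the closure above
end

H = hessian(f, pack(W1, W2))   # should be a 20x20 Hessian

Is packing the parameters like this the intended approach, or does Zygote provide something for functions of several arrays?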

I would like to calculate the Hessian $\nabla_W^2\, y$ with respect to the weights so that I can test a second-order optimization method.
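
For context, the way I plan to use the Hessian is a plain Newton-style step on the flattened parameter vector (a sketch only, reusing the assumed f and pack from above, and assuming the Hessian is invertible):

v = pack(W1, W2)
g = gradient(f, v)[1]   # gradient of y w.r.t. all weights
H = hessian(f, v)
v_new = v - H \ g       # Newton update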