Gradient of NN with respect to inputs

I want to take the gradient of a neural network with respect to its inputs, but I get the following error:

MethodError: no method matching extract_gradient!(::Type{ForwardDiff.Tag{Chain{Tuple{Dense{typeof(relu),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}}}},Int64}}, ::Array{Array{ForwardDiff.Dual{ForwardDiff.Tag{Chain{Tuple{Dense{typeof(relu),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}}}},Int64},Float32,2},1},1}, ::Array{ForwardDiff.Dual{ForwardDiff.Tag{Chain{Tuple{Dense{typeof(relu),Array{Float32,2},Array{Float32,1}},Dense{typeof(identity),Array{Float32,2},Array{Float32,1}}}},Int64},Float32,2},1})

Here is my code

using Flux, ForwardDiff

function build_model(x)
    nn = Chain(Dense(2, 64, relu), Dense(64, 1))
    # nn returns a 1-element vector, and this call throws the MethodError above
    ∂f_∂x(input) = ForwardDiff.gradient(nn, input)
    println(∂f_∂x(x))
end

x_vec = [1, 2]
build_model(x_vec)

Currently, I’m using ForwardDiff.jacobian instead of gradient, since the network output is a vector, and that works.
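
For reference, here is roughly what I have now (a minimal sketch; the sum wrapper in the second option is just one possible way to collapse the 1-element output to a scalar):

using Flux, ForwardDiff

nn = Chain(Dense(2, 64, relu), Dense(64, 1))
x = Float32[1, 2]

# Option 1: treat the network as vector-valued and take the Jacobian (a 1×2 matrix here)
J = ForwardDiff.jacobian(nn, x)

# Option 2: collapse the 1-element output to a scalar so gradient applies
g = ForwardDiff.gradient(v -> sum(nn(v)), x)

Since the network has a single output, the row of J contains the same entries as g.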