Can't take the gradient of a neural network wrapper

I use the following wrapper around a neural network:

using Flux, Zygote, ForwardDiff

nn = Chain(Dense(2, 64, relu), Dense(64, 1))

function wrapper_nn(state)
    q, q_t = state
    # derivative of the network output with respect to its first input
    dfdx(x, x_t) = ForwardDiff.derivative(x -> nn([x, x_t]), x)
    #qtt = nn([q, q_t]) # works
    qtt = dfdx(q, q_t)  # other important terms are omitted for simplicity
end
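For what it's worth, evaluating the wrapper directly runs without error; it is only the gradient call below that fails. A minimal forward check (the exact numbers depend on the random initialization):

q, q_t = 1.0, 2.0
wrapper_nn([q, q_t])  # returns a 1-element vector: d(nn)/dq at (q, q_t)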

Then I take the gradient of a loss function built on that wrapper:

data = [1,2]
target = [1]
loss(data, target) = Flux.mean((wrapper_nn(data) .- target) .^ 2)
grads = Zygote.gradient(() -> loss(data, target), params(nn))

And the following error appears:

MethodError: no method matching *(::NamedTuple{(:value, :partials),Tuple{Nothing,Array{Float64,1}}}, ::ForwardDiff.Dual{ForwardDiff.Tag{var"#215#220"{Float32},Float32},Float32,1})

What does this error mean? I don't even use the * operator explicitly.
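My best guess is that the * comes from the matrix multiplication hidden inside each Dense layer. As far as I understand, a Dense layer computes roughly the following (a schematic sketch, not Flux's actual source):

using Flux

# Schematic of what Dense(2, 1, relu) computes on the forward pass;
# the W * x is presumably where an implicit * appears.
W = randn(Float32, 1, 2)
b = zeros(Float32, 1)
x = Float32[1, 2]
y = relu.(W * x .+ b)

But even so, I don't see why Zygote's gradient representation (the NamedTuple in the error) ends up being multiplied by a ForwardDiff.Dual.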