Gradient of gradient

PyTorch cannot backpropagate through mutations, and neither can Zygote. The expression fill!(similar(y), 1) depends on x through y and mutates its argument (note the exclamation mark). You know there is no real dependency on the value of y, because the outcome is constant, but Zygote will still try to differentiate through it. So you should rewrite it without mutation, for example:

using Zygote

# Seed the pullback with a freshly allocated array of ones instead of
# the mutating fill!(similar(y), 1).
function pred(x, net)
    y, pullback = Zygote.pullback(net, x)
    grads = pullback(ones(size(y)))[1]
    return grads
end
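
Since the point is to differentiate pred again (gradient of gradient), here is a minimal sketch of how that second differentiation could look. The toy network net(x) = x .^ 2 and the variable names are illustrative assumptions, not from the original code, and nested Zygote-over-Zygote differentiation can still be fragile for more complex models:

net(x) = x .^ 2   # toy stand-in for the real network (an assumption)
x = rand(3)

g  = pred(x, net)                                   # first-order gradient: 2x
gg = Zygote.gradient(x -> sum(pred(x, net)), x)[1]  # gradient of the gradient: 2 for each entry

Because ones(size(y)) allocates a fresh constant array rather than mutating one, Zygote can trace through pred a second time without hitting the mutation error.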