This is my neural network:

```
using Flux

net = Chain(Dense(6, 16, relu), Dense(16, 16, relu), Dense(16, 8))
```

Suppose I have a function like:

```
f(x) = sum(net(x))
```

If I want to stop taking the gradient of `f(x)` with respect to `x`, I only need to run this line of code:

```
Zygote.@nograd f
```
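Putting it together, a small sketch of what I mean (assuming `Flux` and `Zygote` are loaded, and using a random length-6 input just for illustration):

```
using Flux, Zygote

net = Chain(Dense(6, 16, relu), Dense(16, 16, relu), Dense(16, 8))
f(x) = sum(net(x))

Zygote.@nograd f

# After @nograd, Zygote treats f as a constant during differentiation,
# so differentiating through f yields no gradient for its argument:
gradient(f, rand(Float32, 6))  # returns (nothing,)
```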

How could I stop taking the gradient of `f(x)` with respect to the weights and biases of `net`?
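For reference, this is roughly how I take the parameter gradients at the moment (using the implicit-parameter style with `Flux.params`; `x` here is just a placeholder input):

```
using Flux, Zygote

x = rand(Float32, 6)             # placeholder input, for illustration
ps = Flux.params(net)            # collects the weights and biases of each Dense layer

# Implicit-parameter gradient: gs is keyed by the parameter arrays in ps
gs = gradient(() -> f(x), ps)
```

It is these gradients, the entries of `gs` for each weight and bias array, that I want Zygote to stop computing.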