Hello,
I am new to Julia and especially to the way Zygote works. For a project in physics-informed machine learning I have a neural network NN(x) that approximates a function f(x). I want to take the derivative of the neural network with respect to its input and compare it to the derivative of the target function, i.e. estimate the loss from the difference between NN'(x) and f'(x) for all x. (I know that for a good solution I would compare the values as well, but this serves more as an illustration.)
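Written out, the loss I have in mind is roughly

loss(p) = mean over x in [0, 2pi] of ( d/dx NN(x; p) - f'(x) )

with f(x) = sin(x), so f'(x) = cos(x). I made a small example to show the approach: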
using DiffEqFlux, Optim, Flux, Statistics, Zygote
# We want to train a neural network to approximate sin(x)
# by minimizing the loss of the derivatives of the neural network
# to the derivatives of sin(x) AKA cos(x).
xsteps = range(0, 2 * pi, length=50) # Interval
du_true = broadcast(cos, xsteps) # Derivative of sin(x)
# Initialize NN
NN = FastChain(FastDense(1, 12, tanh), FastDense(12, 12, tanh), FastDense(12, 1))
pinit = initial_params(NN)
# Sanity check: taking the gradient outside of the loss function works
du_NN = [gradient(x -> NN(x, pinit)[1], x)[1][1] for x in collect(xsteps)]
# Create loss function
function loss(p)
    du_NN = [gradient(x -> NN(x, p)[1], x)[1][1] for x in collect(xsteps)]
    mean(broadcast(-, du_true, du_NN))
end
# Train the NN
res = DiffEqFlux.sciml_train(loss, pinit, ADAM(0.01), maxiters = 200)
When I run this I get the error “Mutating arrays are not supported”. I currently lack the intuition for how to solve this. Also, feel free to point out performance issues and other mistakes, as I want to learn.
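One idea I had, but have not been able to test properly, is to compute the inner derivative with ForwardDiff instead of Zygote, so that Zygote only has to differentiate the outer loss with respect to p. A rough sketch of what I mean (assuming ForwardDiff.derivative can be pushed through the FastChain and that sciml_train can then differentiate the result with respect to p, which I am not sure about):

using ForwardDiff

function loss_forwarddiff(p)
    # Derivative of the network w.r.t. its input x via forward-mode AD;
    # the derivative w.r.t. p is left to sciml_train / Zygote as before
    du_NN = [ForwardDiff.derivative(x -> NN(x, p)[1], x) for x in xsteps]
    mean(broadcast(-, du_true, du_NN))
end

I do not know whether this actually avoids the nested differentiation or just moves the problem, so any pointers on the idiomatic way to do this are appreciated.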