Hi, I am new to Julia and I am trying to get my head around automatic differentiation in Julia. I have decent experience with the PyTorch autograd library, and I wanted to ask whether there is an analog of loss.backward() in Julia.
My specific use case is as follows:
I have a variable y which is given as y = f(W1, W2, x), where W1 and W2 are parameter matrices of some arbitrary dimensions. I want to calculate the gradient of y with respect to W1 and W2, the same way autograd.grad() or loss.backward() does in PyTorch. Is there any way to accomplish that? A small sketch of what I mean is below.
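For concreteness, here is roughly what I would write in PyTorch; f, W1, W2 and the shapes are just placeholders for my actual model, not the real code:

```python
import torch

# Placeholder parameter matrices of arbitrary shape (stand-ins for my real model)
W1 = torch.randn(4, 3, requires_grad=True)
W2 = torch.randn(3, 2, requires_grad=True)
x  = torch.randn(4)

def f(W1, W2, x):
    # Arbitrary example function producing a scalar y
    return torch.sum(x @ W1 @ W2)

# Option 1: backward() accumulates gradients into .grad
y = f(W1, W2, x)
y.backward()
print(W1.grad, W2.grad)

# Option 2: autograd.grad returns the gradients directly
# (recompute y, since backward() freed the previous graph)
y = f(W1, W2, x)
gW1, gW2 = torch.autograd.grad(y, (W1, W2))
```

I am essentially looking for the Julia way of getting those two gradient matrices back from f.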