Automatic differentiation in Julia for gradient computation

Hi, I am new to Julia and I am trying to get my head around automatic differentiation in Julia. I have decent experience with PyTorch's autograd, and I wanted to ask: is there an analog of loss.backward() in Julia?

My specific use case is as follows:
I have a variable y given by y = f(W1, W2, x), where W1 and W2 are parameter matrices of arbitrary dimensions. I want to compute the gradient of y w.r.t. W1 and W2 the same way autograd.grad() or loss.backward() does in PyTorch. Is there a way to accomplish that?

Hello and welcome!

There are lots of reverse-mode AD libraries in Julia; you could start with Zygote.jl, which is the AD engine used by the deep learning library Flux.jl:

using Zygote

# returns one gradient per argument, here a tuple (∇W1, ∇W2)
Zygote.gradient((W1, W2) -> f(W1, W2, x), W1, W2)

This creates a closure (an anonymous function) (W1, W2) -> f(W1, W2, x) that closes over the variable x, so the gradient is computed only with respect to W1 and W2.
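
If you want to see it end to end, here is a minimal sketch with a made-up f (a small matrix/vector function, purely for illustration; any differentiable Julia function of W1, W2, x works the same way):

using Zygote

# Hypothetical f, just for the example
f(W1, W2, x) = sum(W2 * tanh.(W1 * x))

W1 = randn(4, 3)
W2 = randn(2, 4)
x  = randn(3)

# gradient returns one entry per argument of the closure,
# so ∇W1 and ∇W2 have the same shapes as W1 and W2
∇W1, ∇W2 = Zygote.gradient((W1, W2) -> f(W1, W2, x), W1, W2)

Note that, unlike loss.backward(), the gradients are returned directly rather than accumulated into a .grad field on the parameters.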
