If you have ever wanted to learn how automatic differentiation (AD) works under the hood, check out my latest video, which explains various AD concepts in an accessible way.
By the end of this video, you should understand what forward-mode AD, reverse-mode AD, vjp, jvp, pushforward, pullback, tangents and cotangents mean. You should also be able to more effectively identify bottlenecks in your differentiable Julia code and work around some limitations of AD packages in Julia. In the video, I focus on Julia AD, but many of the concepts covered are relevant to AD in general.
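To give a flavour of the jvp/vjp and pushforward/pullback terminology, here is a minimal sketch of my own (not taken from the video), using ForwardDiff.jl for the forward-mode pushforward and Zygote.jl for the reverse-mode pullback. The function `f`, the point `x`, and the directions `v` and `w` are just illustrative choices.

```julia
using ForwardDiff, Zygote

f(x) = [x[1]^2 + x[2], sin(x[1]) * x[2]]   # f: R^2 -> R^2

x = [1.0, 2.0]
v = [1.0, 0.0]   # tangent: a direction in input space
w = [0.0, 1.0]   # cotangent: a direction in output space

# Forward mode (pushforward): jvp = J(x) * v, computed as a directional derivative.
jvp = ForwardDiff.derivative(t -> f(x .+ t .* v), 0.0)

# Reverse mode (pullback): vjp = J(x)' * w, via Zygote's pullback.
y, back = Zygote.pullback(f, x)
vjp, = back(w)
```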
Last but not least, I show some examples of how to write a differentiable matrix assembly function (for solving PDEs) and how to use the implicit function theorem to define efficient adjoint rules for implicit functions.
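As a rough sketch of the implicit-function-theorem idea (again, not the video's code), consider a hypothetical `solve_root` that finds y(x) satisfying F(x, y) = y^3 + y - x = 0 with Newton's method. Instead of differentiating through the Newton iterations, a ChainRulesCore.jl `rrule` can use dy/dx = -(∂F/∂y)⁻¹ ∂F/∂x directly:

```julia
using ChainRulesCore

function solve_root(x)
    y = x                      # initial guess for y in F(x, y) = y^3 + y - x = 0
    for _ in 1:100
        r = y^3 + y - x        # residual F(x, y)
        abs(r) < 1e-12 && break
        y -= r / (3y^2 + 1)    # Newton step, using ∂F/∂y = 3y^2 + 1
    end
    return y
end

function ChainRulesCore.rrule(::typeof(solve_root), x)
    y = solve_root(x)
    function solve_root_pullback(ȳ)
        # Implicit function theorem: dy/dx = -(∂F/∂y)^{-1} ∂F/∂x = 1 / (3y^2 + 1)
        x̄ = ȳ / (3y^2 + 1)
        return NoTangent(), x̄
    end
    return y, solve_root_pullback
end
```

The same pattern generalises to vector-valued implicit functions, where the pullback amounts to one linear solve with the transposed Jacobian of the residual.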