Differential operators: curl, div, grad, Laplacian, partial derivatives with automatic differentiation

Hello,

Is there a package that implements the operators curl, div, grad, Laplacian, and partial derivatives with automatic differentiation for scalar functions and vector fields?


No, but the AD primitives, JVPs and VJPs, are directional derivatives and gradients, so you can build these operators from them. This would be nice to add to DiffEqFlux since we use this quite a bit in physics-informed neural nets.
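For concreteness, here is a rough sketch of how such operators could be assembled from those primitives. This is only an illustration assuming ForwardDiff.jl and Zygote.jl, with hypothetical helper names (`jvp`, `grad`, `divergence`, `curl`), not an existing DiffEqFlux API:

```julia
using ForwardDiff, Zygote

# JVP: directional derivative of scalar f at x in the direction v (forward mode).
jvp(f, x, v) = ForwardDiff.derivative(t -> f(x .+ t .* v), 0.0)

# Gradient: reverse mode (a VJP seeded with the scalar 1).
grad(f, x) = Zygote.gradient(f, x)[1]

# i-th standard basis vector in R^n.
basis(n, i) = [j == i ? 1.0 : 0.0 for j in 1:n]

# Divergence of a vector field F: sum of directional derivatives of each component.
divergence(F, x) = sum(i -> jvp(u -> F(u)[i], x, basis(length(x), i)), 1:length(x))

# Curl of a 3D vector field F, read off from its Jacobian J[i, j] = ∂F_i/∂x_j.
function curl(F, x)
    J = ForwardDiff.jacobian(F, x)
    [J[3, 2] - J[2, 3], J[1, 3] - J[3, 1], J[2, 1] - J[1, 2]]
end

# Example: f(x, y, z) = x*y*z has gradient [y*z, x*z, x*y].
grad(u -> u[1] * u[2] * u[3], [1.0, 2.0, 3.0])   # ≈ [6.0, 3.0, 2.0]
```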


I could not find documentation on the functions jvps and vjps. Can you provide an example of how to compute one partial derivative of a function, for instance f(x, y, z) = x*y*z?

The easiest way to do this might be to use SparseDiffTools.jl, and then:

auto_jacvec(f, u, [1,0,0])

would be ∂f(u)/∂x.
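Putting that together for the f(x, y, z) = x*y*z example above, a minimal sketch, assuming SparseDiffTools.jl is installed (the one-element output array is just to keep the function array-valued for the JVP):

```julia
using SparseDiffTools

# f(x, y, z) = x*y*z as a function of u = [x, y, z],
# returning a one-element array so the output stays array-valued.
f(u) = [u[1] * u[2] * u[3]]

u = [1.0, 2.0, 3.0]

# Jacobian-vector product with e1 picks out ∂f/∂x = y*z:
auto_jacvec(f, u, [1.0, 0.0, 0.0])   # ≈ [6.0]

# Likewise for ∂f/∂y and ∂f/∂z:
auto_jacvec(f, u, [0.0, 1.0, 0.0])   # ≈ [3.0]
auto_jacvec(f, u, [0.0, 0.0, 1.0])   # ≈ [2.0]
```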


My notes on JVPs and VJPs for autodiff might be helpful here. The relevant part starts in Lecture 9:

https://mitmath.github.io/18337/lecture9/autodiff_dimensions

Reverse mode and related topics are covered in the next parts of https://github.com/mitmath/18337.


Grassmann.jl almost supports this; there are a few more bugs I need to fix for automatic differentiation, but I’m not in a rush to do that right now. I just work on it for fun in my free time when I feel like it, but it will be able to do that once I finalize the AD feature palette.

It won’t be limited to scalar functions and vector fields; you’ll also be able to apply it to more general tensor fields.


Thank you for your answer, this is very helpful!

Is there now an AD Laplacian operator implemented somewhere?

No, it really needs Diffractor.
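In the meantime, a simple (if not the most efficient) Laplacian can be sketched with nested forward-mode AD as the trace of the Hessian, assuming ForwardDiff.jl is available:

```julia
using ForwardDiff, LinearAlgebra

# Laplacian of a scalar function f at x: the trace of the Hessian.
laplacian(f, x) = tr(ForwardDiff.hessian(f, x))

f(u) = u[1]^2 + u[2]^2 + u[3]^2
laplacian(f, [1.0, 2.0, 3.0])   # ≈ 6.0
```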

Thanks