JuliaDiff and MPI

I’m interested in using both JuliaDiff (automatic differentiation) and MPI together.

MPI needs domain decomposition, but I’m not sure JuliaDiff can handle a domain-decomposed computation,
and I can’t find good documentation about this.

Does anyone know how to use automatic differentiation and MPI simultaneously?

Thank you.


I don’t know if this is possible with reverse-mode differentiation, but it can be done with forward-mode differentiation, if we are talking about gradients.
The basis is the dual number:

using ForwardDiff

x1 = ForwardDiff.Dual(3.0, 1.0) # Dual{Nothing}(3.0, 1.0)

Evaluating x1 with a function returns a dual:

f(x) = 2x
dfdx = f(x1) # Dual{Nothing}(6.0, 2.0)
dfdx.partials[1] # 2.0

Every dual number has a tag to prevent perturbation confusion (so that dx is kept distinct from dy). The tag goes in the type parameter (in my case it is Nothing).
A gradient is obtained by evaluating the function on a vector of duals. If you evaluate the function dual by dual, or in chunks, you can distribute that work, I think.
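To make the chunking idea concrete, here is a minimal sketch of computing a gradient in pieces by seeding duals manually, one coordinate at a time. Each chunk of indices could be handled by a different MPI rank; the example function `f` and the helper `gradient_chunk` are my own illustrations, not part of ForwardDiff's API.

```julia
using ForwardDiff

# Example objective (an assumption for illustration): f(x) = sum(x .^ 2),
# whose gradient is 2x.
f(x) = sum(abs2, x)

# Compute the partial derivatives of f at x for the coordinates in idxs
# by seeding a perturbation of 1.0 in one coordinate per evaluation.
# This is essentially what ForwardDiff automates internally.
function gradient_chunk(f, x, idxs)
    g = zeros(length(idxs))
    for (k, i) in enumerate(idxs)
        xd = [ForwardDiff.Dual(x[j], j == i ? 1.0 : 0.0) for j in eachindex(x)]
        g[k] = f(xd).partials[1]
    end
    return g
end

x = [1.0, 2.0, 3.0, 4.0]
g1 = gradient_chunk(f, x, 1:2) # this rank's share: [2.0, 4.0]
g2 = gradient_chunk(f, x, 3:4) # another rank's share: [6.0, 8.0]
# concatenating the chunks gives the full gradient 2x
```

In an MPI setting each rank would compute its chunk locally and the pieces would be gathered (e.g. with an allgather), but the dual-number arithmetic itself is unchanged.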


What ForwardDiff.jl does is automate this dual evaluation behind convenience functions, performing the tagging and evaluation and just giving you the final result. But the basis is the use of dual numbers.
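For reference, the convenience function looks like this (the example function is my own; `ForwardDiff.gradient` is the real API):

```julia
using ForwardDiff

f(x) = sum(abs2, x)          # example objective with gradient 2x
x = [1.0, 2.0, 3.0]
g = ForwardDiff.gradient(f, x) # seeds the duals, evaluates, extracts partials
# g == 2 .* x
```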


This is what I wanted to know.
Thank you very much! I really appreciate it.