DifferentialEquations - Derivatives in ODE function / nesting AD

There are a lot of details on new automatic differentiation libraries that will be released fairly soon. Just as a quick spoiler, the ARPA-E DIFFERENTIATE program funded three Julia Computing projects, which led to massive efforts over the last year on new AD mechanisms. This has led to new compiler tooling in Julia, with the vast majority set to merge in Julia v1.7, which allows flexible compiler passes to be written from user code. This fixes essentially all of the issues we had with Cassette.jl and IRTools.jl (and thus the issues of Zygote.jl), which is why those libraries are somewhat in maintenance mode. The new AD, Diffractor.jl, will get a full announcement soon (so think of this as just the trailer :wink:) with a full explanation of how these issues were solved and what it’s currently being used and tested on.

With this new AD, there are projects starting up in the Julia Lab which will use the new composable pass structure to add features to the AD, like mutation and MPI support, in order to solve the issues of integrating AD with scientific computing code (since these issues are distinctly different from those of machine learning code). We’re also teaming up with people who have solved such issues in C++ and Fortran AD tools before, so that we have the right expertise on the team to do it correctly.
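To make the mutation point concrete (this example is mine, not from the post): current reverse-mode tools like Zygote.jl reject in-place array writes, which is exactly the kind of pattern scientific codes with preallocated buffers rely on. A minimal sketch of the limitation, with `f!` being just an illustrative name:

```julia
using Zygote

# Scientific-computing style code: write results into a preallocated buffer.
function f!(x)
    y = similar(x)
    for i in eachindex(x)
        y[i] = x[i]^2   # in-place setindex! on an array
    end
    return sum(y)
end

# Zygote currently errors here with a "Mutating arrays is not supported" message,
# which is the kind of gap the mutation-support work is meant to close.
Zygote.gradient(f!, [1.0, 2.0])
```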

Again, this has been a big project with lots of moving parts and it’s not complete yet, but you’ll start to hear announcements about it fairly soon.

Mixing forward and reverse mode almost always makes sense for higher-order derivatives, limiting yourself to only one or two reverse passes. The argument for that can be found in Griewank’s tome IIRC, and it has to do with how the complexity grows: reverse mode’s overhead (taping and the backwards sweep) compounds when you nest it, while stacking forward mode on top of a single reverse pass only adds dual-number overhead, so for example forward-over-reverse gives Hessian-vector products at roughly the cost of a gradient.
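As a concrete illustration (not from the original post), here is a minimal sketch of that nesting using ForwardDiff.jl over Zygote.jl; the objective `f` and the helper names `hvp_fr` and `hessian_fr` are just illustrative choices:

```julia
using ForwardDiff, Zygote

# Example scalar objective (illustrative only).
f(x) = sum(abs2, x) + prod(x)

# Forward-over-reverse Hessian-vector product: one reverse pass for the gradient,
# one forward (dual-number) pass for the directional derivative along v.
hvp_fr(f, x, v) = ForwardDiff.derivative(t -> Zygote.gradient(f, x .+ t .* v)[1], 0.0)

# Full Hessian by pushing forward mode over the reverse-mode gradient.
hessian_fr(f, x) = ForwardDiff.jacobian(y -> Zygote.gradient(f, y)[1], x)

x = [1.0, 2.0, 3.0]
v = [1.0, 0.0, 0.0]

hvp_fr(f, x, v) ≈ hessian_fr(f, x) * v   # should hold up to floating-point error
```

Note that only one reverse pass appears in either helper; all further derivative orders are taken in forward mode, which is the pattern the complexity argument favors.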
