FiniteDiff.jl is a new Julia library for fast gradients, Jacobians, and Hessians which supports sparsity and GPUs!
Okay, I lied, it’s not an entirely new library. It’s the next incarnation of what was known as DiffEqDiffTools, but the library essentially had nothing DiffEq left in it, so it was moved over to JuliaDiff and turned into FiniteDiff.jl. While Calculus.jl can also give you Jacobians and Hessians, FiniteDiff.jl allows you to use non-allocating mutating forms by pre-allocating caches, and it is compatible with StaticArrays. You can use SparseDiffTools.jl (now moved to JuliaDiff) to get color vectors and specialize the calculations on sparsity patterns.
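Here is a minimal sketch of the cache-based workflow (the function and sizes are made up for illustration): you build the cache once, and subsequent calls reuse it without allocating.

```julia
using FiniteDiff

# a toy scalar function, just for illustration
f(x) = sum(abs2, x)

x = rand(10)
g = similar(x)

# build the cache once...
gcache = FiniteDiff.GradientCache(g, x)

# ...then repeated gradient evaluations reuse it without allocating
FiniteDiff.finite_difference_gradient!(g, f, x, gcache)
```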
But the most special part of FiniteDiff.jl is that it can utilize special matrix types to build fast differencing algorithms on them. For example, if we have a BandedBlockBandedMatrix from BlockBandedMatrices.jl, the Jacobian calculation can be optimized on this exact type in terms of differentiation directions and iteration schemes. For a system of PDEs with a 5-point stencil structure, we can do:
# 5-point stencil: each output entry depends on the point itself and its four
# nearest neighbors, clamped at the boundary
function pde(out, x)
    x = reshape(x, 100, 100)
    out = reshape(out, 100, 100)
    for i in 1:100
        for j in 1:100
            out[i, j] = x[i, j] + x[max(i - 1, 1), j] + x[min(i + 1, size(x, 1)), j] +
                        x[i, max(j - 1, 1)] + x[i, min(j + 1, size(x, 2))]
        end
    end
    return vec(out)
end
x = rand(10000)

using FiniteDiff, FillArrays, BlockBandedMatrices, ArrayInterface
Jbbb = BandedBlockBandedMatrix(Ones(10000, 10000), fill(100, 100), fill(100, 100), (1, 1), (1, 1))
colorsbbb = ArrayInterface.matrix_colors(Jbbb)
bbbcache = FiniteDiff.JacobianCache(x, colorvec = colorsbbb, sparsity = Jbbb)
FiniteDiff.finite_difference_jacobian!(Jbbb, pde, x, bbbcache)
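As a quick aside (not part of the original listing), the number of `pde` evaluations this needs is tied to the number of colors, which you can inspect directly; for this banded-block-banded pattern it should be a single-digit number:

```julia
# each color corresponds to one differencing direction; forward differencing
# adds one extra baseline evaluation of pde on top of this
maximum(colorsbbb)
```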
A naive Jacobian calculation like that in FiniteDifferences.jl or Calculus.jl would:
- Allocate
- Iterate the matrix as a dense matrix (ouch!)
- Use 10000 calls to `pde` (sketched below)
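For reference, a dense, unstructured FiniteDiff call would look roughly like the sketch below (the 10000×10000 dense Jacobian alone is about 800 MB, so this is only to illustrate the API difference, not something you would actually want to run):

```julia
# naive dense version: no colorvec/sparsity, so every column gets its own pde call
Jdense = zeros(10000, 10000)               # ~800 MB of Float64s
densecache = FiniteDiff.JacobianCache(x)
FiniteDiff.finite_difference_jacobian!(Jdense, pde, x, densecache)
```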
The structured version, in contrast, only allocates the cache once and is non-allocating in all subsequent Jacobian calculations, specializes its iteration scheme on the matrix, and only uses 10 `pde` calls to fill the entire Jacobian. Did I hear a “zoom zoom”? I don’t think I know of packages in other languages that support not only sparse matrices but structured matrices as well.
Special thanks to @YingboMa, @oxinabox, @dlfivefifty, Langwen, Pankaj, and @dextorious, who have all been fundamental in this project, as well as @pkofod, who was an early adopter in NLsolve.jl/Optim.jl. Thanks also to @jlperla, who has helped make sure this project reached these goals.
Our next step is to add sparse Hessian support. If you find this project useful, please star it. Let’s move all of the main numerical Julia packages away from the older allocating Calculus.jl and start zooming.
Downstream Support
Quick note for library authors who want to start adding downstream sparsity support. All you really need from the users is `colorvec` and `sparsity` arguments. `sparsity` is the sparsity pattern of the user’s function (or it can be automatically generated using SparsityDetection.jl) and `colorvec` is the color vector, usually generated by SparseDiffTools.jl. DifferentialEquations.jl’s interface on sparsity is thus, “give me your `sparsity` pattern and your `colorvec` and I’ll specialize the internal Jacobian calculations on this!”. NLsolve.jl, Optim.jl, etc. can all support this just by adding these two arguments and forwarding them to these functions, as sketched below. I hope all of Julia makes use of this and we support sparsity everywhere soon. 2020: the year of Julialang sparsity.
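Concretely, all a downstream library has to do is accept the two keyword arguments and pass them along to the cache constructor. A minimal sketch (the function name and signature here are hypothetical, not an actual NLsolve.jl or Optim.jl API):

```julia
using FiniteDiff

# hypothetical downstream helper: take colorvec/sparsity from the user and forward them
function library_jacobian!(J, f!, x; colorvec = 1:length(x), sparsity = nothing)
    cache = FiniteDiff.JacobianCache(x, colorvec = colorvec, sparsity = sparsity)
    FiniteDiff.finite_difference_jacobian!(J, f!, x, cache)
    return J
end
```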
Also note that SparseDiffTools.jl has a version of forward-mode AD which uses those same two arguments, so it’s an extension of ForwardDiff that gives you sparse ForwardDiff with `sparsity` and `colorvec`.
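As a rough sketch (reusing `Jbbb`, `colorsbbb`, `pde`, and `x` from the example above; see the SparseDiffTools.jl README for the exact current keywords):

```julia
using SparseDiffTools

# same colorvec/sparsity pair, but with forward-mode AD instead of finite differencing
forwarddiff_color_jacobian!(Jbbb, pde, x, colorvec = colorsbbb, sparsity = Jbbb)
```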