FiniteDiff.jl: Fast (Sparse) Gradients, Jacobians, Hessians!

FiniteDiff.jl is a new Julia library for fast gradients, Jacobians, and Hessians which supports sparsity and GPUs!

Okay, I lied, it’s not an entirely new library. It’s the next incarnation of what was known as DiffEqDiffTools, but that library had essentially nothing DiffEq left in it, so it was moved over to JuliaDiff and renamed FiniteDiff.jl. While Calculus.jl can also give you Jacobians and Hessians, FiniteDiff.jl lets you use non-allocating mutating forms by pre-allocating caches, and it is compatible with StaticArrays. You can use SparseDiffTools.jl (now moved to JuliaDiff) to get color vectors and specialize the calculations on sparsity patterns.
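To give a flavor of the cache-based API, here is a minimal sketch following the usage pattern in the FiniteDiff.jl README (the function f is just an illustrative example, not part of the library):

using FiniteDiff

f(x) = sum(abs2, x)  # illustrative test function
x = rand(10)

# Allocating, out-of-place form
g = FiniteDiff.finite_difference_gradient(f, x)

# Pre-allocate the cache once...
gout = similar(x)
gcache = FiniteDiff.GradientCache(gout, x)

# ...and every subsequent call is non-allocating
FiniteDiff.finite_difference_gradient!(gout, f, x, gcache)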

But the most special part of FiniteDiff.jl is that it can utilize special matrix types to build fast differencing algorithms on them. For example, if we have a BandedBlockBandedMatrix from BlockBandedMatrices.jl, the Jacobian calculation can be optimized for this exact type in terms of differentiation directions and iteration schemes. For instance, from a system of PDEs we can do:

using FiniteDiff, FillArrays, BlockBandedMatrices, ArrayInterface

# A 2D stencil: each output entry depends only on its von Neumann neighbors,
# so the Jacobian is banded-block-banded
function pde(out, x)
	x = reshape(x, 100, 100)
	out = reshape(out, 100, 100)
	for i in 1:100
		for j in 1:100
			out[i, j] = x[i, j] + x[max(i - 1, 1), j] + x[min(i + 1, size(x, 1)), j] +
			            x[i, max(j - 1, 1)] + x[i, min(j + 1, size(x, 2))]
		end
	end
	return vec(out)
end
x = rand(10000)

# Declare the Jacobian's structure: 100 blocks of size 100, with bandwidths
# (1, 1) at both the block and sub-block level
Jbbb = BandedBlockBandedMatrix(Ones(10000, 10000), fill(100, 100), fill(100, 100), (1, 1), (1, 1))

# Color the structured matrix and build a reusable cache
colorsbbb = ArrayInterface.matrix_colors(Jbbb)
bbbcache = FiniteDiff.JacobianCache(x, colorvec = colorsbbb, sparsity = Jbbb)

# Fill the structured Jacobian in place
FiniteDiff.finite_difference_jacobian!(Jbbb, pde, x, bbbcache)

A naive Jacobian calculation, like the ones in FiniteDifferences.jl or Calculus.jl, would:

  • Allocate
  • Iterate the matrix as a dense matrix (ouch!)
  • Use 10000 calls to pde

This one allocates only the cache (once), is non-allocating in all subsequent Jacobian calculations, specializes its iteration scheme on the matrix structure, and uses only 10 pde calls to fill the entire Jacobian. Did I hear a “zoom zoom”? :oncoming_automobile: :rocket: I don’t know of packages in other languages that support not only sparse matrices but structured matrices as well.
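The call count follows directly from the coloring: columns that share a color are perturbed simultaneously, so the number of function evaluations equals the number of distinct colors. Continuing the example above, you can check this yourself:

# Columns sharing a color are differenced together, so the number of pde
# calls equals the number of distinct colors (the 10 quoted above),
# rather than the 10000 a dense method needs
maximum(colorsbbb)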

Special thanks to @YingboMa, @oxinabox, @dlfivefifty, Langwen, Pankaj, and @dextorious, who have all been fundamental to this project, as well as to @pkofod, who was an early adopter in NLsolve.jl/Optim.jl. Thanks also to @jlperla, who has helped make sure this project reached these goals.

Our next step is to add sparse Hessian support. If you find this project useful, please star it. Let’s move all of the main numerical Julia packages away from the older allocating Calculus.jl and start zooming :slight_smile:.

Downstream Support

A quick note for library authors who want to start adding downstream sparsity support: all you really need from the users is colorvec and sparsity arguments. sparsity is the sparsity pattern of the user’s function (or it can be automatically generated using SparsityDetection.jl), and colorvec is the color vector, usually generated by SparseDiffTools.jl. DifferentialEquations.jl’s sparsity interface is thus: “give me your sparsity pattern and your colorvec and I’ll specialize the internal Jacobian calculations on this!”. NLsolve.jl, Optim.jl, etc. can all support this just by adding these two arguments and forwarding them to these functions. I hope all of Julia makes use of this and we support sparsity everywhere soon. 2020: the year of Julialang sparsity.
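As a rough illustration, here is a hypothetical sketch of what that forwarding could look like inside a solver. my_newton_step and its internals are made up for illustration, not any package’s actual API; it assumes the sparsity argument is a prototype matrix with the desired eltype (like Jbbb above):

using FiniteDiff

# Hypothetical: the solver accepts colorvec/sparsity from the user and
# forwards them to FiniteDiff's Jacobian routines
function my_newton_step(f!, x; colorvec = 1:length(x), sparsity = nothing)
	J = sparsity === nothing ? zeros(length(x), length(x)) : copy(sparsity)
	cache = FiniteDiff.JacobianCache(x, colorvec = colorvec, sparsity = sparsity)
	FiniteDiff.finite_difference_jacobian!(J, f!, x, cache)  # specialized on the structure
	fx = similar(x)
	f!(fx, x)
	return x - J \ fx  # one Newton step for f(x) = 0
end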

Also note that SparseDiffTools.jl has a version of forward-mode AD which uses those same two arguments, so it’s an extension of ForwardDiff.jl that gives you sparse forward-mode AD driven by the same colorvec and sparsity.
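Following the SparseDiffTools.jl README, that looks roughly like this (reusing Jbbb, colorsbbb, pde, and x from the example above):

using SparseDiffTools

# Same coloring trick, but columns are seeded with ForwardDiff dual numbers
# instead of finite difference perturbations; the sparsity pattern is read
# from the structure of Jbbb itself
forwarddiff_color_jacobian!(Jbbb, pde, x, colorvec = colorsbbb)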

This is super cool! I’m not the target group for this, but the fact that this is even possible justified an instant star. Zoom zoom

This is great, looking forward to trying out sparse structured Hessians for hierarchical Bayesian models!

We’ll be switching over this week. Thanks.

Awesome stuff as always @ChrisRackauckas. Congrats to you and your team.

Hey Chris,

Thanks for this package. I wanted to try it out, but I get the following error message. Let me know whether I should open an issue on GitHub instead of cluttering up the thread here.

Would you happen to have any ideas? I am using Julia 1.1.0 and Juno 0.7.2. Is my Julia version the issue here? Thanks in advance.

ERROR: Unsatisfiable requirements detected for package FiniteDiff [6a86dc24]:
 FiniteDiff [6a86dc24] log:
 ├─possible versions are: 2.0.0 or uninstalled
 ├─restricted to versions * by an explicit requirement, leaving only versions 2.0.0
 └─restricted by julia compatibility requirements to versions: uninstalled — no versions left

It requires Julia 1.2 because of BlockBandedMatrices.jl stuff.

Just to be sure, in your example it seems like bbbcache is never used. Could this perhaps be a typo in the last line?

Yeah typo fixed.

This is great! Just last month I was wishing for a package that did exactly this (coloring with AD). Our use is optimization and SparseDiffTools.jl sped up some of our Jacobians significantly. Thanks!
