Laplace approximation for mixed models in Julia (TMB-like functionality)

Welcome, Cole! I gave that talk, and can answer a few of your questions.

All the pieces mentioned in that previous TMB thread are still in place, and we have a few more as well. I’m not sure about the status of higher-order AD, but as to the other pieces:

  • Hessian sparsity detection and sparse differentiation: sort of there, but not quite. My impression (maybe @ChrisRackauckas can elaborate) is that it wouldn’t take too much effort to implement; it just needs attention from someone with the time and domain knowledge.
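
To make that concrete, here is a minimal sketch of the detection step using Symbolics.jl’s hessian_sparsity, which traces the structural nonzeros symbolically (just one possible approach; the tooling that eventually lands may work differently):

using Symbolics

# Chain-structured objective: only neighboring variables interact,
# so the Hessian is tridiagonal.
@variables x1 x2 x3 x4
f = (x2 - x1)^2 + (x3 - x2)^2 + (x4 - x3)^2
# Sparse Bool matrix of structurally nonzero Hessian entries; a sparse
# differentiation routine can then color and compress over this pattern.
pattern = Symbolics.hessian_sparsity(f, [x1, x2, x3, x4])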

  • Laplace approximation: given an optimizer and an AD system, this is in principle just three lines of code (assuming f is your negative log-density function and N is the number of variables):

using Optim, ForwardDiff, LinearAlgebra
# Find the mode x̂ by minimizing the negative log-density f
opt = optimize(f, zeros(N), LBFGS(), autodiff=:forward)
H = ForwardDiff.hessian(f, opt.minimizer)  # Hessian at the mode
# log ∫ exp(-f(x)) dx ≈ -f(x̂) + 0.5 * (N log 2π - logdet H)
# (N * log(2π) rather than log((2π)^N), which overflows for large N)
log_integral = -opt.minimum + 0.5 * (N * log(2π) - logdet(H))
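
As a quick sanity check of that recipe: with f(x) = ½‖x‖², exp(-f) is an unnormalized standard Gaussian, for which the Laplace approximation is exact, so the code should recover the true log-integral:

using Optim, ForwardDiff, LinearAlgebra

# For f(x) = ½‖x‖², the exact answer is log ∫ exp(-f(x)) dx = (N/2) log(2π).
f(x) = 0.5 * sum(abs2, x)
N = 5
opt = optimize(f, zeros(N), LBFGS(), autodiff=:forward)
H = ForwardDiff.hessian(f, opt.minimizer)   # identity matrix here
log_integral = -opt.minimum + 0.5 * (N * log(2π) - logdet(H))
log_integral ≈ N/2 * log(2π)                # true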

@jkbest and I already have working implementations here and here. The trickier parts are making the optimization fast, and detecting automatically when the Hessian is sparse so it can be stored and factorized as such.

  • Sparse Cholesky decomposition in Julia is done via SuiteSparse/CHOLMOD, which I believe is the same library TMB uses. This PR should let those decompositions be used more easily for AD and Laplace approximations after the next Julia release (see also this issue).
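
For reference, the CHOLMOD-backed path is already easy to use for the logdet term the Laplace approximation needs, as long as you hand it a sparse matrix. A minimal sketch:

using SparseArrays, LinearAlgebra

# A sparse symmetric positive-definite "Hessian" (tridiagonal here).
# cholesky on a SparseMatrixCSC dispatches to SuiteSparse/CHOLMOD,
# which picks a fill-reducing permutation automatically.
H = spdiagm(-1 => fill(-1.0, 9), 0 => fill(2.1, 10), 1 => fill(-1.0, 9))
F = cholesky(H)
logdet(F)   # log-determinant without ever densifying H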

Basically, I agree with @jkbest that a TMB equivalent in Julia is very much within reach. My work-in-progress MarginalLogDensities package can already take an arbitrary log-density function and integrate out an arbitrary subset of its arguments using the Laplace approximation with automatic sparsity detection (though that part is not yet optimized). See this simple demo for an example. It will just take some sustained effort from a few knowledgeable people to fill in the missing bits and plug all the pieces together.
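
To make the “integrate out a subset of arguments” operation concrete, here is a hand-rolled, dense version of the underlying idea. To be clear, this is not the MarginalLogDensities API — the function and argument names here are mine — just a sketch of the computation it performs:

using Optim, ForwardDiff, LinearAlgebra

# Laplace-marginalize a negative log-density nld over the variables at
# indices iw, holding the remaining variables fixed at the values in v.
# Returns log ∫ exp(-nld(u)) dw, the (unnormalized) marginal log-density.
function marginalize(nld, v, iw, N)
    iv = setdiff(1:N, iw)                 # indices of the fixed variables
    function conditional(w)               # nld as a function of w alone
        u = similar(w, N)                 # AD-friendly container
        u[iv] .= v
        u[iw] .= w
        return nld(u)
    end
    opt = optimize(conditional, zeros(length(iw)), LBFGS(), autodiff=:forward)
    H = ForwardDiff.hessian(conditional, opt.minimizer)
    return -opt.minimum + 0.5 * (length(iw) * log(2π) - logdet(H))
end

For a quick check, marginalize(u -> 0.5 * sum(abs2, u), [1.0], [2, 3], 3) ≈ -0.5 + log(2π), the exact marginal for a standard Gaussian, since the Laplace approximation is exact in that case.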
