[ANN] BarkerMCMC.jl - robust gradient-based MCMC

BarkerMCMC.jl

The package isn’t new, but it hasn’t been introduced here yet and might be useful to some.

BarkerMCMC.jl implements the surprisingly simple, gradient-based MCMC sampler proposed by Livingstone & Zanella (2021). The algorithm should be more “robust with respect to tuning parameters” than (vanilla) HMC.
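The proposal mechanism is compact enough to sketch in a few lines. Below is a minimal, self-contained Julia implementation of one Barker step following the paper — not the package’s actual API; `logp`, `gradlogp`, and `sigma` (the step size) are placeholder names:

```julia
# A minimal sketch (not BarkerMCMC.jl's actual API): one Barker proposal
# step for a target with log density `logp` and gradient `gradlogp`.
function barker_step(logp, gradlogp, x, sigma)
    g = gradlogp(x)
    z = sigma .* randn(length(x))
    # keep each increment z_i with the logistic probability 1/(1 + exp(-z_i*g_i)),
    # otherwise flip its sign -- this is how the gradient "skews" the proposal
    b = [rand() < 1 / (1 + exp(-z[i] * g[i])) ? 1.0 : -1.0 for i in eachindex(z)]
    y = x .+ b .* z
    # Metropolis-Hastings correction; the symmetric Gaussian part cancels,
    # only the logistic skewing terms remain (use a stable softplus in practice)
    gy = gradlogp(y)
    d = y .- x
    logratio = logp(y) - logp(x) +
               sum(log1p.(exp.(-d .* g)) .- log1p.(exp.(d .* gy)))
    return log(rand()) < logratio ? y : x
end

# toy usage: run a short chain on a 2-d standard normal target
function run_chain(logp, gradlogp, x0, sigma, n)
    x = copy(x0)
    chain = [copy(x)]
    for _ in 1:n
        x = barker_step(logp, gradlogp, x, sigma)
        push!(chain, x)
    end
    return chain
end

logp(x) = -0.5 * sum(abs2, x)
gradlogp(x) = -x
chain = run_chain(logp, gradlogp, randn(2), 1.0, 10_000)
```

Note that the gradient never moves the proposal directly; it only tilts the sign of each Gaussian increment, which is where the claimed robustness to step-size tuning comes from.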

We had hoped that it would also demonstrate increased robustness in the face of noisy gradients. However, at least in our application, this did not prove to be the case. If you have other ideas for how to deal with this, please let me know :slight_smile:

Reference

Livingstone, S., Zanella, G., 2021. The Barker proposal: Combining robustness and efficiency in gradient-based MCMC. Journal of the Royal Statistical Society: Series B (Statistical Methodology). https://doi.org/10.1111/rssb.12482


The repo appears to be two years old. Is there a reason you waited this long to announce it? How did you find the algorithm in practice on other problems?

Also, note that some packages are now using LogDensityProblems.jl to specify log densities; there is a related package for AD.

The announcement is late for two reasons: first, I wanted to wait for the results of a student project looking into the performance (which unfortunately never happened), and second, I simply forgot about it…

Our motivation was that NUTS can get stuck when the gradient is noisy (which happened, for example, with models that use adaptive time-stepping ODE solvers). The hope was that because Barker MCMC uses the gradient only to “warp” the proposal distribution, it might be more robust. In the end we did not make a comprehensive comparison.

Having a LogDensityProblems interface would be neat. If there is enough interest, I’d be happy to add it. For now I’ve opened an issue.
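For reference, implementing the LogDensityProblems.jl interface is lightweight. A hypothetical target (not code from BarkerMCMC.jl) would look roughly like this:

```julia
using LogDensityProblems

# hypothetical example target: a d-dimensional standard normal
struct StdNormal
    dim::Int
end

LogDensityProblems.dimension(p::StdNormal) = p.dim
# order 1 = the problem can also evaluate its gradient
LogDensityProblems.capabilities(::Type{StdNormal}) =
    LogDensityProblems.LogDensityOrder{1}()
LogDensityProblems.logdensity(p::StdNormal, x) = -0.5 * sum(abs2, x)
LogDensityProblems.logdensity_and_gradient(p::StdNormal, x) =
    (-0.5 * sum(abs2, x), -x)
```

A sampler could then consume any such object through `LogDensityProblems.logdensity_and_gradient` instead of separate user-supplied functions, and LogDensityProblemsAD.jl can supply the gradient via automatic differentiation when the problem only provides the log density.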
