[ANN] SliceSampling.jl - Efficient zeroth-order MCMC algorithms

Hi all,

As part of the Turing ecosystem, we recently created SliceSampling.jl, a collection of slice sampling MCMC algorithms.

Why slice sampling algorithms are great:

  • They don’t need gradients!
  • On low-dimensional problems, they perform very well with minimal tuning.
  • Even when the tuning is off, they automatically adjust the amount of computation per step and still produce adequate samples (see the sketch after this list).
  • Because they don’t rely on gradient information, they handle complex posterior geometry, such as multimodality, very well.
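To give a feel for the mechanics, here is a minimal sketch of the classic univariate slice sampler [1] with stepping-out and shrinkage. The function name, arguments, and defaults are illustrative only, not the package’s internals:

using Random

# One update of univariate slice sampling with stepping-out and
# shrinkage (Neal, 2003). `logpdf_fn` is the target log-density,
# `w` the initial window width, `max_steps` the stepping-out budget.
function slice_step(rng, x, logpdf_fn; w=1.0, max_steps=32)
    # Draw the slice level: log(y) = log p(x) - Exponential(1).
    logy = logpdf_fn(x) - randexp(rng)

    # Stepping-out: randomly position a window of width w around x, then
    # expand each end until it leaves the slice {x′ : log p(x′) > log y}
    # or the budget runs out.
    lo = x - w*rand(rng)
    hi = lo + w
    j  = floor(Int, max_steps*rand(rng))
    k  = (max_steps - 1) - j
    while j > 0 && logpdf_fn(lo) > logy
        lo -= w; j -= 1
    end
    while k > 0 && logpdf_fn(hi) > logy
        hi += w; k -= 1
    end

    # Shrinkage: propose uniformly from [lo, hi]; on rejection, shrink the
    # interval toward x. A mis-tuned w therefore costs extra density
    # evaluations rather than broken samples.
    while true
        x′ = lo + (hi - lo)*rand(rng)
        logpdf_fn(x′) > logy && return x′
        x′ < x ? (lo = x′) : (hi = x′)
    end
end

# Example: 1000 samples from a standard normal (log-density up to a constant).
rng = Random.default_rng()
xs  = [0.0]
for _ in 1:1000
    push!(xs, slice_step(rng, xs[end], x -> -x^2/2))
end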

We provide the following algorithms:

  • The classic univariate slice sampling algorithms [1]
  • Latent slice sampling [2]
  • Gibbsian polar slice sampling [3]
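To connect these to code: the constructor names below are my best recollection of the package’s exports, so check the documentation for the current API. The univariate samplers are wrapped in a Gibbs scheme for multivariate targets, while the latter two are natively multivariate:

using SliceSampling

# Univariate slice samplers [1], applied one coordinate at a time:
RandPermGibbs(SliceSteppingOut(2.0))  # stepping-out procedure
RandPermGibbs(SliceDoublingOut(2.0))  # doubling-out procedure

# Multivariate slice samplers:
LatentSlice(2.0)      # latent slice sampling [2]
GibbsPolarSlice(2.0)  # Gibbsian polar slice sampling [3]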

The package supports Turing and can be used as follows:

using Distributions
using Turing
using SliceSampling

# Toy model: unknown variance s and mean m, no observed data.
@model function demo()
    s ~ InverseGamma(3, 3)
    m ~ Normal(0, sqrt(s))
end

sampler   = RandPermGibbs(SliceSteppingOut(2))  # univariate stepping-out sampler within Gibbs
n_samples = 10000
model     = demo()

# Wrap the sampler with `externalsampler` for Turing's `sample` interface;
# `initial_params` supplies the starting values.
sample(model, externalsampler(sampler), n_samples; initial_params=[exp(1.0), 0.0])
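Here, RandPermGibbs updates the target one coordinate at a time, sweeping through the coordinates in a random order on each iteration, while SliceSteppingOut(2) performs each univariate update with the stepping-out procedure [1] using an initial window of 2. Since every update only needs pointwise density evaluations, the model never has to be differentiable.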

There are also plans to make it an official part of Turing so that it can be used in combination with Turing.Gibbs. Stay tuned!


  1. Neal, R. M. (2003). Slice sampling. The Annals of Statistics, 31(3), 705-767.

  2. Li, Y., & Walker, S. G. (2023). A latent slice sampling algorithm. Computational Statistics & Data Analysis, 179, 107652.

  3. Schär, P., Habeck, M., & Rudolf, D. (2023). Gibbsian polar slice sampling. In International Conference on Machine Learning.


Great work! I’m wondering, does this tackle some of the problems that need reversible jump MCMC (i.e., transdimensional sampling)?

Hi! Unfortunately, no. For RJMCMC problems, you still need an RJMCMC sampler to handle the transdimensional moves. However, I personally have been using slice sampling within RJMCMC samplers (for the within-model updates) quite successfully.
