Gibbs sampling with Gen.jl, or best PPL for a complicated model

I’m trying to implement a model, described here. It requires Langevin MC, spike-and-slab priors, and Gibbs steps. Of the Julia PPLs, Gen.jl seems like the best fit overall, because of its flexibility for building your own sampling algorithm and the fact that it ships with MALA/LMC.

Gen does not come with a Gibbs sampler “out of the box”. But since Gibbs sampling is a special case of Metropolis–Hastings, I’m guessing it’s possible to implement with a custom kernel? I don’t necessarily need a fully coded solution; it would just be nice to know whether this is a realistic option in principle before I invest more time.

Also, if anyone thinks there’s a more appropriate PPL to use, I’m open to suggestions. But please check out the link first.

Thanks in advance.


It may be best (and a great learning experience) to code the whole model outside of a PPL in the end. But (besides the obvious convenience) I like being able to cite cool Julia packages.

You can implement a Gibbs sampler in Gen if you know the model structure well and what the conditional posterior of each random variable should be! But because it’s not possible to do Gibbs sampling for arbitrary probabilistic programs (there isn’t always a closed-form solution for the exact posterior of a variable conditioned on all the others), we don’t provide an automated method for Gibbs sampling.

Looking at the model in the paper you linked to, it does seem like Gen.jl is uniquely well suited to reproducing that inference algorithm, in particular because it provides strong support for Reversible Jump MCMC using trace translators. But you will probably have to write a custom proposal distribution in order to perform Gibbs steps, and use that within Gen’s metropolis_hastings kernel. Here’s one of our examples of writing a Gibbs proposal for a discrete random variable in a Bayesian linear regression model where points might be outliers.
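For concreteness, here’s a minimal sketch of what such a Gibbs step could look like for a single Boolean choice. The model structure and the address `(:data, i) => :is_outlier` are assumptions for illustration (not the exact code from the linked example): the proposal scores both possible values with `update`, samples from the resulting exact conditional, and is then always accepted by `metropolis_hastings` (up to floating-point error).

```julia
using Gen

# Hypothetical Gibbs proposal for a Boolean choice at the assumed address
# (:data, i) => :is_outlier; adapt the address to your own model.
@gen function outlier_gibbs_proposal(trace, i::Int)
    # Score the trace with the flag forced to false and to true; the returned
    # weights are log-density ratios relative to the current trace.
    (_, w_false, _, _) = update(trace, choicemap(((:data, i) => :is_outlier, false)))
    (_, w_true,  _, _) = update(trace, choicemap(((:data, i) => :is_outlier, true)))

    # Normalizing the two weights gives the exact conditional probability of
    # `true` given every other choice in the trace.
    p_true = exp(w_true - logsumexp([w_false, w_true]))

    # Sampling from the exact conditional makes the MH acceptance ratio 1.
    {(:data, i) => :is_outlier} ~ bernoulli(p_true)
end

# Usage inside an MCMC sweep (one Gibbs step per data point):
# (trace, _) = metropolis_hastings(trace, outlier_gibbs_proposal, (i,))
```

The same pattern works for any discrete choice with a small, enumerable support; for continuous variables with conjugate structure you would instead sample directly from the known conditional inside the proposal.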
