Sampling from posterior predictive distribution

I’m completely new to Bayesian inference; I’m currently reading chapter 3 of the book “Statistical Rethinking”. It introduces the concept of the “posterior predictive distribution”, where you take a weighted “sum” of the likelihood distributions, with the weights given by the posterior distribution of the parameters.
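As I understand it, the quantity in question is

$$
p(\tilde{y} \mid y) = \int p(\tilde{y} \mid \theta)\, p(\theta \mid y)\, d\theta \;\approx\; \sum_i p(\tilde{y} \mid \theta_i)\, p(\theta_i \mid y),
$$

where the sum runs over a grid of parameter values $\theta_i$, as in the grid approximation used in the book.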

I was able to implement this for the coin-toss example using a MixtureModel from Distributions.jl:

mix = MixtureModel(map(p -> Binomial(9, p), 0:1/1000:1), posterior)

where posterior is the posterior probability mass function of the parameter p evaluated on the grid, i.e. a vector of weights summing to 1.
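For completeness, here is a self-contained sketch of that approach (I’m assuming a flat prior and the book’s globe-tossing data of 6 successes in 9 tosses):

using Distributions

# grid approximation of the posterior for the coin-toss example
p_grid = 0:1/1000:1                      # candidate values of p
prior = fill(1.0, length(p_grid))        # flat prior
lik = pdf.(Binomial.(9, p_grid), 6)      # likelihood of 6 successes in 9 tosses
posterior = prior .* lik
posterior ./= sum(posterior)             # normalise to a probability vector

# posterior predictive = mixture of Binomials weighted by the posterior
mix = MixtureModel(map(p -> Binomial(9, p), p_grid), posterior)
samples = rand(mix, 10_000)              # draws from the posterior predictive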

I wonder whether this could be implemented more cleanly using one of the probabilistic programming packages, like Turing.jl, Gen.jl, or something else.

A recently merged PR in Turing lets you do this concisely after inference has been performed, whether by sampling (MCMC) or variational inference (VI).

So the answer is: yes, this exists on Turing master. We will add some documentation to explain the syntax.
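Roughly, the pattern is something like the following (a minimal sketch only, assuming the functionality is exposed through the predict function; the exact names may differ from what ends up documented):

using Turing, Distributions

@model function coinflip(y)
    p ~ Uniform(0, 1)
    y ~ Binomial(9, p)
end

# run inference on the observed data (6 successes out of 9 tosses)
chain = sample(coinflip(6), NUTS(), 1_000)

# posterior predictive: instantiate the model with the data left missing
# and let predict fill it in using the posterior draws in chain
predictions = predict(coinflip(missing), chain)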


Yes! There’s an example in the Soss.jl README using the predictive function. Please let me know if you have any trouble with it.
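The pattern is roughly as follows (just a sketch from memory; posterior_draws stands in for whatever posterior samples of p you already have, and the README has the exact calls):

using Soss, Distributions

m = @model begin
    p ~ Uniform()
    y ~ Binomial(9, p)
end

# predictive moves the named parameters into the model's arguments,
# so posterior draws of p can be pushed back through the likelihood
pred = predictive(m, :p)

# one posterior-predictive draw of y per posterior draw of p
post_pred = [rand(pred(p = p_i)).y for p_i in posterior_draws]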

For the future: Rob’s StatisticalRethinking.jl package for Julia could also be of interest:
https://github.com/StatisticalRethinkingJulia/StatisticalRethinking.jl
