I’m completely new to Bayesian inference; I’m currently reading chapter 3 of the “Statistical Rethinking” book. It introduces the concept of the “posterior predictive distribution”, which you get by taking a weighted sum of sampling distributions, with weights given by the posterior distribution of the parameter.

I was able to implement it for the coin-toss example using `MixtureModel` from Distributions.jl:

```julia
mix = MixtureModel(map(p -> Binomial(9, p), 0:1/1000:1), posterior)
```

where `posterior` is the posterior probability mass function for the parameter `p` evaluated on that grid.
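For context, here is a self-contained sketch of the whole workflow. It assumes the book’s globe-tossing data (6 successes in 9 tosses) and a flat prior over the grid — adjust those to your actual setup:

```julia
using Distributions

# Grid approximation of the posterior for p, assuming a flat prior
# and the book's data: 6 successes out of 9 tosses.
grid = 0:1/1000:1
likelihood = [pdf(Binomial(9, p), 6) for p in grid]
posterior = likelihood / sum(likelihood)   # normalize to a pmf over the grid

# Posterior predictive: mixture of Binomial(9, p) components,
# weighted by the posterior mass at each grid point.
mix = MixtureModel([Binomial(9, p) for p in grid], posterior)

pdf(mix, 6)   # posterior predictive probability of seeing 6 successes
```

Since `mix` is an ordinary `Distribution`, you can also call `rand(mix, 10_000)` on it to draw posterior predictive samples, which is how the book presents this idea.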

I wonder whether this could be implemented more clearly using one of the probabilistic programming packages, like Turing.jl or Gen.jl or something else.