Uncertainty quantification of an inverse problem using hierarchical Bayes: high-level modelling

Turing follows the common notation for a generative process to define a model. The basic idea is to write down how an observation is generated. Maybe it helps to think in those terms when writing a model in Turing.

For a mixture model this means:

using Turing, LinearAlgebra

@model gmm(x, K) = begin
   # generate the parameters of each component / cluster, i.e. the locations.
   m ~ MvNormal(zeros(K), I)

   # generate the probabilities (weights) of generating an observation from the kth cluster.
   w ~ Dirichlet(K, 1.0)

   # for each observation
   N = length(x)
   z = Vector{Int}(undef, N)
   for n in 1:N
      # decide from which cluster the observation is generated
      z[n] ~ Categorical(w)

      # generate the observation from the selected cluster.
      x[n] ~ Normal(m[z[n]], 1.0)
   end
end

In equations you would define the same model as follows:
m_k \sim Normal(0, 1) \, , \; k = 1, \dots, K
w \sim Dirichlet(1, \dots, 1)
z_n | w \sim Categorical(w) \, , \; \forall n
x_n | z_n, m \sim Normal(m_{z_n}, 1.0) \, , \; \forall n

So basically, this means that each observation is generated from a Normal distribution, conditioned on its cluster assignment z_n. If you perform inference, Turing will therefore assume that the likelihood model for the observations is a Normal distribution.
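
For completeness, here is a minimal sketch of how you might run inference on this model. Since z is discrete, it cannot be handled by HMC alone; a common choice is a compositional Gibbs sampler with particle Gibbs for z and HMC for the continuous parameters. The toy data and all sampler settings below are illustrative, not tuned:

using Turing

# hypothetical toy data: two well-separated clusters
x = [randn(25) .- 3.0; randn(25) .+ 3.0]

# particle Gibbs for the discrete assignments z,
# HMC for the continuous parameters m and w
chain = sample(gmm(x, 2), Gibbs(PG(20, :z), HMC(0.05, 10, :m, :w)), 1000)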

In the case of your example, Eq. 6 is the assumed likelihood model for the observations. So to perform inference, all you need to do is provide a custom distribution with a logpdf function that computes the likelihood for you. If you have missing data or want to generate data from the model, you would also need to provide a rand function.
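
As a minimal sketch, a custom univariate likelihood could look as follows. Note that MyLikelihood, its parameters, and the placeholder density are hypothetical stand-ins for your Eq. 6:

using Distributions, Random

# hypothetical stand-in for the distribution defined by your Eq. 6
struct MyLikelihood{T<:Real} <: ContinuousUnivariateDistribution
   μ::T   # location parameter (assumed)
   σ::T   # scale parameter (assumed)
end

# needed for inference: the log-density of an observation x
function Distributions.logpdf(d::MyLikelihood, x::Real)
   # replace this placeholder with the log of your Eq. 6 density
   return -0.5 * ((x - d.μ) / d.σ)^2 - log(d.σ) - 0.5 * log(2π)
end

# only needed for missing data or for generating data from the model
function Base.rand(rng::Random.AbstractRNG, d::MyLikelihood)
   return d.μ + d.σ * randn(rng)
end

Inside a model you can then use it like any other distribution, e.g. x[n] ~ MyLikelihood(m[z[n]], 1.0).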

I hope that helps. If not, let me know which part is unclear.